Notes on Problems Encountered While Installing Hive



Exceptions come up frequently when installing Hive.

A useful way to debug them is to start Hive with log output sent to the console: hive -hiveconf hive.root.logger=DEBUG,console

Commands then print their log output as they run, so the source of an exception can be located quickly.
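For reference, the same flag can be combined with -e to debug a single statement; DEBUG is very verbose, so the INFO level is sometimes enough (both are standard log4j levels):

```shell
# Start an interactive shell with full DEBUG output on the console:
#   hive -hiveconf hive.root.logger=DEBUG,console
# Or debug one statement non-interactively; INFO is less noisy than DEBUG:
#   hive -hiveconf hive.root.logger=INFO,console -e "show tables;"
```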

------------------------------------------------------------

ERROR DataNucleus.Datastore: Error thrown executing CREATE TABLE `SD_PARAMS`
(
    `SD_ID` BIGINT NOT NULL,
    `PARAM_KEY` VARCHAR(256) BINARY NOT NULL,
    `PARAM_VALUE` VARCHAR(4000) BINARY NULL,
    CONSTRAINT `SD_PARAMS_PK` PRIMARY KEY (`SD_ID`,`PARAM_KEY`)
) ENGINE=INNODB : Specified key was too long; max key length is 767 bytes

com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes

This is a character-set problem. Run alter database hive character set latin1; on the metastore database to change its character set, and the error goes away.
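The 767 bytes in the error is InnoDB's maximum index key length for these MySQL versions. Under utf8, MySQL reserves up to 3 bytes per character, so the primary-key index over VARCHAR(256) would need 768 bytes, one over the limit; latin1 is 1 byte per character, so the same index fits easily. A quick check of the arithmetic, plus the fix (credentials are placeholders):

```shell
# utf8 reserves up to 3 bytes per character, so indexing VARCHAR(256)
# needs 256 * 3 bytes -- just over InnoDB's 767-byte key limit:
echo $((256 * 3))
# latin1 is 1 byte per character, so the same index needs only 256 bytes.
# The fix (credentials are placeholders for your metastore setup):
#   mysql -u root -p -e "ALTER DATABASE hive CHARACTER SET latin1;"
```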

Database connection exceptions are also common. They are usually caused by MySQL refusing remote logins; running GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' WITH GRANT OPTION; resolves the problem.

At other times the connection fails because the firewall was never stopped and is blocking the database port; stopping it fixes the problem.
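The two causes above (privileges and firewall) can be checked with a short sequence; hostnames and the distro-specific firewall command below are placeholders for your environment:

```shell
# 1. Allow remote logins (run on the MySQL server; FLUSH makes it take effect):
#      mysql -u root -p -e "GRANT ALL PRIVILEGES ON *.* TO 'root'@'%' WITH GRANT OPTION; FLUSH PRIVILEGES;"
# 2. Stop the firewall, or open MySQL's port instead of disabling it outright:
#      service iptables stop          # CentOS 6-style; adjust for your distro
# 3. Confirm the port is reachable from the Hive host:
#      telnet mysql-host 3306
```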

------------------------------------------------------------

When creating an external table, point it at the HDFS directory that holds its data, as follows:


create external table foo
(
    uid bigint,
    brand_value string
)
row format delimited fields terminated by '\001'
stored as textfile
location "/group/tbsc-dev/haowen/temp/shpsrch_bshop_brand_value_ssmerge_1011/";

However, the directory

/group/tbsc-dev/haowen/temp/shpsrch_bshop_brand_value_ssmerge_1011/

must contain only the data files themselves, with no subdirectories; otherwise queries against the table fail with:

Failed with exception java.io.IOException:java.io.IOException: Not a file: hdfs://localhost:9000/kdp/log-server/config
13/12/30 11:21:59 ERROR CliDriver: Failed with exception java.io.IOException:java.io.IOException: Not a file: hdfs://localhost:9000/kdp/log-server/config
java.io.IOException: java.io.IOException: Not a file: hdfs://localhost:9000/kdp/log-server/config
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:551)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:489)
    at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:136)
    at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1471)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:271)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.io.IOException: Not a file: hdfs://localhost:9000/kdp/log-server/config
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:277)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:381)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:515)
    ... 14 more
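Depending on the Hive version, an alternative to flattening the directory layout is to let the input format recurse into subdirectories. The two settings below exist in later Hive releases, so treat this as a version-dependent workaround:

```shell
# Version-dependent workaround: let Hive read files inside subdirectories
# instead of failing with "Not a file":
#   hive -e "set mapred.input.dir.recursive=true;
#            set hive.mapred.supports.subdirectories=true;
#            select * from foo limit 10;"
```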

------------------------------------------------------------

When uploading a local file into HDFS with the put command, an existing file with the same name is not overwritten by default.
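On Hadoop 2.x the put command accepts -f to overwrite the target; on older releases the target has to be removed first. File names and paths below are placeholders:

```shell
# Overwrite an existing file on upload (Hadoop 2.x and later):
#   hadoop fs -put -f access.log /user/hive/warehouse/sign_phone/
# Older releases: remove the target first, then upload:
#   hadoop fs -rm /user/hive/warehouse/sign_phone/access.log
#   hadoop fs -put access.log /user/hive/warehouse/sign_phone/
```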

For the files in the table's directory: if a line has fewer fields than the table has columns, the missing trailing columns are read as NULL; if it has more, the extra fields are discarded. For example:

select * from sign_phone;


FullAOSPonMaguro        100324  v1.0.20131223   com.demo        460027014808984 351565054850927 123     NULL
FullAOSPonMaguro        100324  v1.0.20131223   com.demo        460027014808984 351565054850927 1234    NULL
FullAOSPonMaguro        100324  v1.0.20131223   com.demo        460027014808984 351565054850927 12345   NULL
FullAOSPonMaguro        100324  v1.0.20131223   com.demo        460027014808984 351565054850927 123456  NULL
FullAOSPonMaguro        100324  v1.0.20131223   com.demo        460027014808984 351565054850927 1234567 NULL

FullAOSPonMaguro        100324  v1.0.20131223   com.demo        460027014808984 351565054850927 123     1
FullAOSPonMaguro        100324  v1.0.20131223   com.demo        460027014808984 351565054850927 1234    1
FullAOSPonMaguro        100324  v1.0.20131223   com.demo        460027014808984 351565054850927 12345   1
FullAOSPonMaguro        100324  v1.0.20131223   com.demo        460027014808984 351565054850927 123456  1
FullAOSPonMaguro        100324  v1.0.20131223   com.demo        460027014808984 351565054850927 1234567 1

FullJellyBeanonMako     100324  v1.0.20131223   com.demo        460027014808983 355136052576686 15101127498     4
FullJellyBeanonMako     100324  v1.0.20131223   com.demo        460027014808983 355136052576686 18701489421     1
FullJellyBeanonMako     100324  v1.0.20131223   com.demo        460027014808983 355136052576686 18701489421 18701489421 18701489421     1
FullJellyBeanonMako     100324  v1.0.20131223   com.demo        460027014808983 355136052576686 18701489421 18701489421 18701489421 15101127498 1
FullJellyBeanonMako     100324  v1.0.20131223   com.demo        460027014808983 355136052576686 01092394876     2

Rows whose last column reads NULL come from log lines with fewer fields than the table has columns; the rows carrying surplus values come from lines with more fields than the table; the remaining rows correspond to the schema exactly.
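The pad-or-truncate behavior above can be sketched with a short awk one-liner. This is illustrative only, not Hive itself; it uses a comma as a stand-in for the '\001' delimiter and 3 as the column count:

```shell
# Mimic how Hive maps a delimited line onto a table with ncols columns:
# fields beyond ncols are dropped, missing trailing fields read as NULL.
printf 'a,b\na,b,c\na,b,c,d\n' | awk -F',' -v ncols=3 '{
  row = ""
  for (i = 1; i <= ncols; i++)
    row = row (i > 1 ? "\t" : "") (i <= NF ? $i : "NULL")
  print row
}'
# -> a  b  NULL
#    a  b  c
#    a  b  c
```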



Copyright notice: this is an original article by aaa1117a8w5s6d, licensed under CC 4.0 BY-SA. When reposting, please include a link to the original source and this notice.