Part 1: I did not set up a Maven project, so I had to hunt down every required jar by hand and ran into many missing-jar problems. In the end, the jars under my lib directory were:
Part 2: Trying to obtain a HiveConnection. First, here is the code that successfully obtains one:
import java.sql.Connection;
import java.sql.DriverManager;

public class Test01Hive {
    private static String driverName = "org.apache.hive.jdbc.HiveDriver";
    private static String url = "jdbc:hive2://master:10000/default";
    private static String user = "hive";
    private static String password = "xujun";

    public static void main(String[] args) throws Exception {
        // Register the HiveServer2 JDBC driver.
        Class.forName(driverName);
        Connection conn = DriverManager.getConnection(url, user, password);
        // Prints something like: org.apache.hive.jdbc.HiveConnection@ebe538
        System.out.println(conn);
    }
}
Reproducing problem 1:
I initially defined
url = "jdbc:hive://master:10000/default"; and got the following error:
java.sql.SQLException:
No suitable driver found for jdbc:hive://master:10000/default
at java.sql.DriverManager.getConnection(DriverManager.java:596)
at java.sql.DriverManager.getConnection(DriverManager.java:215)
at com.berg.hive.test1.api.Test01Hive.getConn(Test01Hive.java:50)
at com.berg.hive.test1.api.Test01Hive.main(Test01Hive.java:37)
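This failure is easy to reproduce with a plain JDK and no Hive jars on the classpath at all: when no registered driver accepts the URL's scheme, DriverManager throws exactly this SQLException. A minimal sketch:

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class NoDriverDemo {
    // Attempts a connection with the legacy "jdbc:hive" scheme.
    // With no driver registered for that scheme, DriverManager
    // throws before any network access happens.
    static String tryOldScheme() {
        try {
            DriverManager.getConnection("jdbc:hive://master:10000/default");
            return "connected";
        } catch (SQLException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryOldScheme());
    }
}
```

This confirms the error comes from driver lookup, not from the Hive server: the URL scheme simply matches no registered driver.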
Solution to problem 1:
Change the url to:
url = "jdbc:hive2://master:10000/default";
Source of the solution to problem 1:
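To avoid regressing to the old scheme, the URL can be assembled by a small helper instead of a hand-typed string. This is just a sketch; the host, port, and database values are the ones used above:

```java
public class HiveUrl {
    // Builds a HiveServer2 JDBC URL. The scheme must be "jdbc:hive2",
    // not the legacy "jdbc:hive", or DriverManager reports
    // "No suitable driver found".
    static String hive2Url(String host, int port, String db) {
        return "jdbc:hive2://" + host + ":" + port + "/" + db;
    }

    public static void main(String[] args) {
        System.out.println(hive2Url("master", 10000, "default"));
    }
}
```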
Reproducing problem 2:
org.apache.hive.service.cli.HiveSQLException: Failed to open new session:
java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException
(org.apache.hadoop.security.authorize.AuthorizationException):
User: hadoop is not allowed to impersonate hive
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:258)
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:249)
at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:579)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:167)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:215)
at com.berg.hive.test1.api.Test01Hive.getConn(Test01Hive.java:50)
at com.berg.hive.test1.api.Test01Hive.main(Test01Hive.java:38)
Caused by: org.apache.hive.service.cli.HiveSQLException: Failed to open new session:
java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException
(org.apache.hadoop.security.authorize.AuthorizationException): User: hadoop is not allowed to
impersonate hive
Solution to problem 2:
Go into the Hadoop installation directory, switch to etc/hadoop (i.e. hadoop-2.6.4/etc/hadoop), and edit core-site.xml, adding the following to the existing contents. Note that the "hadoop" segment in hadoop.proxyuser.hadoop.* names the proxying user, i.e. the user HiveServer2 runs as; it is the "hadoop" in the error "User: hadoop is not allowed to impersonate hive":
<property>
    <name>hadoop.proxyuser.hadoop.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.hadoop.groups</name>
    <value>*</value>
</property>
After my changes, the file looks like this:
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://master:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>file:/mysoftware/hadoop-2.6.4/tmp</value>
    </property>
    <property>
        <name>hadoop.native.lib</name>
        <value>false</value>
    </property>
    <property>
        <name>hadoop.proxyuser.hadoop.hosts</name>
        <value>*</value>
    </property>
    <property>
        <name>hadoop.proxyuser.hadoop.groups</name>
        <value>*</value>
    </property>
</configuration>
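The daemons must pick up the new proxyuser settings before new Hive sessions will succeed. A full restart of HDFS works; alternatively, the standard Hadoop admin commands can refresh the settings in place. A sketch (run as the Hadoop admin user; exact paths depend on your install):

```shell
# Reload proxyuser (impersonation) settings from core-site.xml
# without restarting the daemons:
hdfs dfsadmin -refreshSuperUserGroupsConfiguration
yarn rmadmin -refreshSuperUserGroupsConfiguration

# Then restart HiveServer2 so new sessions open under the refreshed rules.
```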
A few related links:
1. Restarting the MySQL service:
http://wwwlouxuemingcom.blog.163.com/blog/static/209747822013411103950266/
2. Could not open connection to jdbc
3. Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
Reposted from: https://my.oschina.net/gently/blog/683604