Derby configuration and JDBC connection



hive-default.xml

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
  <!-- Embedded Derby; create=true auto-creates the database, named metastore_db. -->
  <!-- For client/server Derby use instead:
       <value>jdbc:derby://192.168.0.3:4567/hadoopor;create=true</value>
       where hadoopor is the database name, 192.168.0.3 the Derby server's IP
       address, and 4567 its port. -->
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>org.apache.derby.jdbc.EmbeddedDriver</value>
  <!-- Embedded Derby driver. For client/server Derby use instead:
       <value>org.apache.derby.jdbc.ClientDriver</value> -->
  <description>Driver class name for a JDBC metastore</description>
</property>
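The two settings have to be changed as a matched pair: the embedded driver goes with the `jdbc:derby:;databaseName=...` URL, and the client driver with the `jdbc:derby://host:port/db` URL. A small sketch of that pairing (the class and method names are mine; the host, port, and database names just mirror the examples above):

```java
// Sketch: each Derby mode pairs a driver class with a JDBC URL shape.
// The values mirror the hive-default.xml examples above.
public class DerbyConnInfo {
    // Embedded mode: database lives in the local filesystem, one process only.
    static String[] embedded(String db) {
        return new String[] {
            "org.apache.derby.jdbc.EmbeddedDriver",
            "jdbc:derby:;databaseName=" + db + ";create=true"
        };
    }

    // Client/server mode: connect to a standalone Derby network server.
    static String[] client(String host, int port, String db) {
        return new String[] {
            "org.apache.derby.jdbc.ClientDriver",
            "jdbc:derby://" + host + ":" + port + "/" + db + ";create=true"
        };
    }

    public static void main(String[] args) {
        System.out.println(embedded("metastore_db")[1]);
        System.out.println(client("192.168.0.3", 4567, "hadoopor")[1]);
    }
}
```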

Embedded Derby requires derby.jar in Hive's lib directory; client/server Derby requires derbyclient.jar instead.
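A quick way to verify that the right jar is actually on the classpath is to probe for the corresponding driver class with Class.forName, the same call the Hive client below makes at startup. A minimal sketch (the class name DriverProbe is hypothetical):

```java
// Sketch: check whether a JDBC driver class can be loaded, which tells you
// whether its jar (derby.jar or derbyclient.jar) is on the classpath.
public class DriverProbe {
    static boolean present(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("embedded: " + present("org.apache.derby.jdbc.EmbeddedDriver"));
        System.out.println("client:   " + present("org.apache.derby.jdbc.ClientDriver"));
    }
}
```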

If the embedded Derby database gets corrupted, deleting the metastore_db directory fixes it, but all previous metadata is lost with it. Embedded Derby is fine for testing, but for a production deployment don't start it in embedded mode; run and connect to a standalone Derby server instead. Otherwise metastore_db is locked by whichever process opens it first: once `hive --service hiveserver` is running you cannot start the `hive` CLI, and once `hive` is running you cannot start `hive --service hiveserver`.

Note:

Embedded mode is acceptable for testing, but a production environment must use server mode; otherwise, if something goes wrong there is no way to recover.

You can write the JDBC client in any language you are familiar with; here is a Java example:

import java.sql.SQLException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.DriverManager;

public class HiveJdbcClient {
  private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

  /**
   * @param args
   * @throws SQLException
   */
  public static void main(String[] args) throws SQLException {
    try {
      Class.forName(driverName);
    } catch (ClassNotFoundException e) {
      e.printStackTrace();
      System.exit(1);
    }
    Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
    Statement stmt = con.createStatement();
    String tableName = "testHiveDriverTable";
    stmt.executeQuery("drop table " + tableName);
    ResultSet res = stmt.executeQuery("create table " + tableName + " (key int, value string)");

    // show tables
    String sql = "show tables '" + tableName + "'";
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    if (res.next()) {
      System.out.println(res.getString(1));
    }

    // describe table
    sql = "describe " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(res.getString(1) + "\t" + res.getString(2));
    }

    // load data into table
    // NOTE: filepath has to be local to the hive server
    // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields per line
    String filepath = "/tmp/a.txt";
    sql = "load data local inpath '" + filepath + "' into table " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);

    // select * query
    sql = "select * from " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
    }

    // regular hive query
    sql = "select count(1) from " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(res.getString(1));
    }
  }
}

All that remains is to compile and run it:

# Then on the command-line
$ javac HiveJdbcClient.java

# To run the program in standalone mode, we need the following jars in the classpath
# from hive/build/dist/lib
#   hive_exec.jar
#   hive_jdbc.jar
#   hive_metastore.jar
#   hive_service.jar
#   libfb303.jar
#   log4j-1.2.15.jar
#
# from hadoop/build
#   hadoop-*-core.jar
#
# To run the program in embedded mode, we need the following additional jars in the classpath
# from hive/build/dist/lib
#   antlr-runtime-3.0.1.jar
#   derby.jar
#   jdo2-api-2.1.jar
#   jpox-core-1.2.2.jar
#   jpox-rdbms-1.2.2.jar
#
# as well as hive/build/dist/conf

$ java -cp $CLASSPATH HiveJdbcClient

# Alternatively, you can run the following bash script, which will seed the data file
# and build your classpath before invoking the client.

#!/bin/bash
HADOOP_HOME=/your/path/to/hadoop
HIVE_HOME=/your/path/to/hive

echo -e '1\x01foo' > /tmp/a.txt
echo -e '2\x01bar' >> /tmp/a.txt

HADOOP_CORE=$(ls $HADOOP_HOME/hadoop-*-core.jar)
CLASSPATH=.:$HADOOP_CORE:$HIVE_HOME/conf

for i in ${HIVE_HOME}/lib/*.jar ; do
  CLASSPATH=$CLASSPATH:$i
done

java -cp $CLASSPATH HiveJdbcClient
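The two `echo -e` lines seed /tmp/a.txt using ctrl-A (`\x01`) as the field separator, which is Hive's default delimiter and what the LOAD DATA statement in the Java client expects. The same file can be produced from Java; a sketch (the class name CtrlASeed is mine, and it writes to a temp file rather than /tmp/a.txt):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Sketch: write a two-column data file delimited by ctrl-A (\u0001),
// Hive's default field separator, then read it back and split the fields.
public class CtrlASeed {
    static String row(int key, String value) {
        return key + "\u0001" + value; // key<ctrl-A>value, one row per line
    }

    public static void main(String[] args) throws IOException {
        Path p = Files.createTempFile("a", ".txt");
        Files.write(p, List.of(row(1, "foo"), row(2, "bar")));
        for (String line : Files.readAllLines(p)) {
            String[] f = line.split("\u0001");
            System.out.println(f[0] + " -> " + f[1]);
        }
    }
}
```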

http://www.iteye.com/topic/1113849



Copyright notice: this is an original article by u013081111, released under the CC 4.0 BY-SA license; when reposting, please include a link to the original source and this notice.