Hive Data Backup, Export, Common Queries, and ** BY Usage



Data Export

1: Save query results to a local file

Syntax

INSERT OVERWRITE LOCAL DIRECTORY directory1
  [ROW FORMAT row_format] [STORED AS file_format] (Note: Only available starting with Hive 0.11.0)
  SELECT ... FROM ...

e.g.

INSERT OVERWRITE LOCAL DIRECTORY '/opt/datas/rachel'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
SELECT * FROM student;

Output:

hive (rachel_db_hive)>INSERT OVERWRITE LOCAL DIRECTORY '/opt/datas/rachel'
                     > ROW FORMAT  DELIMITED FIELDS TERMINATED BY '\t'
                     > SELECT * FROM  student;
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1539803280664_0001, Tracking URL = http://bigdata-pro01.rachel.com:8088/proxy/application_1539803280664_0001/
Kill Command = /opt/modules/hadoop-2.5.0/bin/hadoop job  -kill job_1539803280664_0001
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2018-10-17 15:12:38,372 Stage-1 map = 0%,  reduce = 0%
2018-10-17 15:13:20,287 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 2.06 sec
MapReduce Total cumulative CPU time: 2 seconds 60 msec
Ended Job = job_1539803280664_0001
Copying data to local directory /opt/datas/rachel
Copying data to local directory /opt/datas/rachel
MapReduce Jobs Launched: 
Job 0: Map: 1   Cumulative CPU: 2.06 sec   HDFS Read: 264 HDFS Write: 43 SUCCESS
Total MapReduce CPU Time Spent: 2 seconds 60 msec
OK
student.userid  student.username
Time taken: 138.288 seconds

[rachel@bigdata-senior02 rachel]$ ll
total 4
-rw-r--r--. 1 rachel rachel 43 Oct 17 15:13 000000_0
[rachel@bigdata-senior02 rachel]$ more  000000_0 
0001    rachel
0002    wiki
0003    lucy
0004    honey

e.g.

INSERT OVERWRITE LOCAL DIRECTORY "/opt/datas/rachel"
SELECT * FROM  student;

Output (with no ROW FORMAT clause, Hive falls back to its default field delimiter '\001' (Ctrl-A), which is invisible in more, so the columns appear to run together):

[rachel@bigdata-senior02 rachel]$ more 000000_0 
0001rachel
0002wiki
0003lucy
0004honey
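
To confirm that the file really contains the default '\001' delimiter rather than no delimiter at all, the control characters can be made visible, e.g. with cat -A (a hedged sketch; the file name comes from the listing above, and the exact flag depends on your cat implementation):

[rachel@bigdata-senior02 rachel]$ cat -A 000000_0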

2: Save data to HDFS

INSERT OVERWRITE DIRECTORY directory1
  [ROW FORMAT row_format] [STORED AS file_format] (Note: Only available starting with Hive 0.11.0)
  SELECT ... FROM ...

e.g.

INSERT OVERWRITE DIRECTORY '/user/rachel/datas'
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
SELECT * FROM student;

Output:

[rachel@bigdata-pro01 hadoop-2.5.0]$ bin/hadoop fs -text /user/rachel/000000_0;
18/10/17 15:39:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
0001rachel
0002wiki
0003lucy
0004honey
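
The [STORED AS file_format] clause in the syntax above can be used the same way when exporting to a directory. A minimal sketch (the target path is illustrative, and this assumes a Hive release, 0.11.0 or later per the note above, that accepts the chosen format for directory inserts):

INSERT OVERWRITE DIRECTORY '/user/rachel/datas_seq'
STORED AS SEQUENCEFILE
SELECT * FROM student;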

3: Redirecting output from hive -e

bin/hive -e "select * from db_hive.order;" > /opt/datas/rachel/order.txt
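
An equivalent approach is to keep the query in a script file and run it with hive -f, redirecting stdout the same way (a sketch; the script path is illustrative and the table name follows the example above):

# write the query to a script file, then run it and capture the output
echo "select * from db_hive.order;" > /opt/datas/rachel/order.sql
bin/hive -f /opt/datas/rachel/order.sql > /opt/datas/rachel/order.txt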

EXPORT/IMPORT in Hive

Backup and export

e.g.
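
A minimal sketch of Hive's EXPORT/IMPORT statements, using the student table from the examples above; the HDFS export path and the imported table name are illustrative:

-- export the table's data and metadata to an HDFS directory
EXPORT TABLE student TO '/user/rachel/export/student';

-- re-create the table (here under a new name) from the exported copy
IMPORT TABLE student_bak FROM '/user/rachel/export/student';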



Copyright notice: This is an original article by sinat_37513998, released under the CC 4.0 BY-SA license. Please include a link to the original source and this notice when reposting.