Hive on Spark configuration file



Add the following properties to hive-site.xml:

<property>
  <name>hive.execution.engine</name>
  <value>spark</value>
</property>
<property>
  <name>hive.enable.spark.execution.engine</name>
  <value>true</value>
</property>
<property>
  <name>spark.home</name>
  <value>/data/appcom/spark-2.0.2-bin-hadoop2.7</value>
</property>
<property>
  <name>spark.master</name>
  <value>yarn-client</value>
</property>
<property>
  <name>spark.eventLog.enabled</name>
  <value>true</value>
</property>
<property>
  <name>spark.eventLog.dir</name>
  <value>hdfs://master:9000/spark-logs</value>
</property>
<property>
  <name>spark.serializer</name>
  <value>org.apache.spark.serializer.KryoSerializer</value>
</property>
<property>
  <name>spark.executor.memory</name>
  <value>5g</value>
</property>
<property>
  <name>spark.driver.memory</name>
  <value>1g</value>
</property>
<property>
  <name>spark.executor.cores</name>
  <value>6</value>
</property>
<property>
  <name>spark.executor.instances</name>
  <value>20</value>
</property>
<property>
  <name>spark.default.parallelism</name>
  <value>50</value>
</property>
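
For reference, a minimal sketch of how these properties sit inside a complete hive-site.xml: the XML declaration and the <configuration> root element are the standard Hadoop/Hive config wrapper, and only the first two properties are repeated here as placeholders for the full list above.

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <!-- run Hive queries on Spark instead of MapReduce -->
  <property>
    <name>hive.execution.engine</name>
    <value>spark</value>
  </property>
  <!-- Spark installation used by Hive -->
  <property>
    <name>spark.home</name>
    <value>/data/appcom/spark-2.0.2-bin-hadoop2.7</value>
  </property>
  <!-- ... the remaining spark.* properties from the list above go here in the same form ... -->
</configuration>

Note that spark.eventLog.dir should point to an HDFS directory that already exists; otherwise Spark typically refuses to start event logging.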

Alternatively, set these properties directly in the interactive Hive CLI:

set spark.master=yarn-client;
set hive.execution.engine=spark;
set spark.eventLog.enabled=true;
set spark.eventLog.dir=hdfs://master:9000/spark-logs;
set spark.executor.memory=6g;
set spark.executor.cores=6;
set spark.executor.instances=40;
set spark.default.parallelism=50;
set spark.serializer=org.apache.spark.serializer.KryoSerializer;
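
A quick sanity check after applying the settings; this is only a sketch, and demo_table is a hypothetical table name (any small table works):

set hive.execution.engine=spark;
-- printing a property without a value shows its current setting
set hive.execution.engine;
-- demo_table is a hypothetical table; any small table works
select count(*) from demo_table;

If Hive on Spark is wired up correctly, the count query reports Spark stage progress instead of launching a MapReduce job.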



Copyright notice: this is an original article by AntKengElephant, released under the CC 4.0 BY-SA license. Please include a link to the original source and this notice when reposting.