Spark: how to fix a missing-file error - SparkContext error: file /tmp/spark-events does not exist



I am running a Python Spark application by submitting it through an API call. When the application is submitted, the response is FAILED.

After SSH-ing into the worker, my Python application is at /root/spark/work/driver-id/wordcount.py and the error output is in /root/spark/work/driver-id/stderr.

The stderr shows the following error:

Traceback (most recent call last):
  File "/root/wordcount.py", line 34, in <module>
    main()
  File "/root/wordcount.py", line 18, in main
    sc = SparkContext(conf=conf)
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 115, in __init__
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 172, in _do_init
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 235, in _initialize_context
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 1064, in __call__
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.io.FileNotFoundException: File file:/tmp/spark-events does not exist
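The FileNotFoundException points at Spark's event-log directory: event logging is enabled, but the directory it writes to (file:/tmp/spark-events by default) does not exist on the machine that creates the SparkContext. A minimal sketch of one way to handle it, assuming the default location is acceptable and the directory can simply be created before the context starts (the app name and helper variable below are illustrative):

import os
from pyspark import SparkConf, SparkContext

# Ensure the event-log directory exists before the SparkContext is created.
# Equivalently, run `mkdir /tmp/spark-events` on the node that launches the driver.
event_log_dir = "/tmp/spark-events"   # Spark's default spark.eventLog.dir
if not os.path.isdir(event_log_dir):
    os.makedirs(event_log_dir)

conf = (SparkConf()
        .setAppName("wordcount")                      # illustrative app name
        .set("spark.eventLog.enabled", "true")
        .set("spark.eventLog.dir", "file://" + event_log_dir))

sc = SparkContext(conf=conf)

Alternatively, set spark.eventLog.enabled to false if the history server is not needed, or put the same two settings in conf/spark-defaults.conf so every submitted application picks them up.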


