Stopping a Running Spark Application



I’m running a Spark cluster in standalone mode.

I’ve submitted a Spark application in cluster mode using options:

--deploy-mode cluster --supervise

So that the job is fault tolerant.
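For reference, the submission would have looked roughly like the following; the master URL, main class, and jar path here are illustrative:

./bin/spark-submit --master spark://node-1:7077 --deploy-mode cluster --supervise --class com.example.MyApp /path/to/my-app.jar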

Now I need to keep the cluster running but stop the application from running.

Things I have tried:

Stopping the cluster and restarting it. But the application resumes execution when I do that.

Used kill -9 on the daemon named DriverWrapper, but the job resumes again after that.

I’ve also removed temporary files and directories and restarted the cluster but the job resumes again.

So the running application is really fault tolerant.

Question:

Based on the above scenario, can someone suggest how I can stop the job from running, or what else I can try to stop the application while keeping the cluster running?

Something just occurred to me: if I call sparkContext.stop(), that should do it, but that requires a bit of work in the code, which is OK. Can you suggest any other way that doesn't require a code change?
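For what it's worth, the in-code approach would be a small change. A rough sketch, assuming a plain Scala driver and a hypothetical marker file used as an external stop signal (the object name and the file path are illustrative):

import org.apache.spark.{SparkConf, SparkContext}
import java.nio.file.{Files, Paths}

object StoppableJob {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("stoppable-job"))
    // ... job logic, e.g. processing work in batches ...
    // Hypothetical external stop signal: a marker file checked between batches.
    val askedToStop = Files.exists(Paths.get("/tmp/stop-spark-job"))
    if (!askedToStop) {
      // ... continue with the remaining work ...
    }
    sc.stop()  // stop the SparkContext so the driver exits cleanly
  }
}

Because sc.stop() lets the driver finish normally instead of being killed, the supervised restart should not kick in, though this still means modifying and redeploying the job.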

Solution

If you wish to kill an application that is failing repeatedly, you may do so through:

./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>

You can find the driver ID through the standalone Master web UI at http://<master url>:8080.
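For example, assuming the standalone master runs at spark://node-1:7077 and the web UI lists a driver ID such as driver-20240101123456-0000 (both values are illustrative):

./bin/spark-class org.apache.spark.deploy.Client kill spark://node-1:7077 driver-20240101123456-0000

Note that the value to pass is the driver ID shown for the supervised driver, not the application ID.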


