PySpark does not support cluster deploy mode on standalone clusters

/app/spark/bin/spark-submit --master spark://10.0.0.46:7077  --deploy-mode cluster --driver-memory 2g --executor-memory 4g --executor-cores 2 /home/spark/aldstat_page_view.py 1

Error: Cluster deploy mode is currently not supported for python applications on standalone clusters.

Switch to client deploy mode instead:

/app/spark/bin/spark-submit --master spark://10.0.0.46:7077  --deploy-mode client --driver-memory 2g --executor-memory 4g --executor-cores 2 /home/spark/aldstat_page_view.py 1
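Note that this limitation is specific to the standalone cluster manager. If the cluster runs YARN, Python applications can use cluster deploy mode, so the driver runs inside the cluster rather than on the submitting machine. A sketch of the equivalent submission on YARN (the script path and resource sizes are carried over from the example above; adjust them for your environment):

```shell
# Hypothetical YARN submission: cluster deploy mode works for Python apps here,
# unlike on a standalone master. Assumes HADOOP_CONF_DIR is configured so
# spark-submit can locate the YARN ResourceManager.
/app/spark/bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 2g \
  --executor-memory 4g \
  --executor-cores 2 \
  /home/spark/aldstat_page_view.py 1
```

In client mode the driver stays on the host where spark-submit runs, so that host must remain up for the lifetime of the job; cluster mode on YARN avoids that dependency.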