I am trying to run pyspark through Spyder.

Here is the full error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
    sc = SparkContext('local', conf=conf)
  File "C:\Users\ashish.dang\Documents\Softwares\spark-2.1.0-bin-hadoop2.7\spark-2.1.0-bin-hadoop2.7\python\lib\pyspark.zip\pyspark\context.py", line 115, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "C:\Users\ashish.dang\Documents\Softwares\spark-2.1.0-bin-hadoop2.7\spark-2.1.0-bin-hadoop2.7\python\lib\pyspark.zip\pyspark\context.py", line 256, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "C:\Users\ashish.dang\Documents\Softwares\spark-2.1.0-bin-hadoop2.7\spark-2.1.0-bin-hadoop2.7\python\lib\pyspark.zip\pyspark\java_gateway.py", line 95, in launch_gateway
    raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number

I have already looked at the following related questions: here, here, and here.

Here is the code I am using:

import os
import sys

# SPARK_HOME points at my Spark installation (the same path as in the traceback).
SPARK_HOME = r"C:\Users\ashish.dang\Documents\Softwares\spark-2.1.0-bin-hadoop2.7\spark-2.1.0-bin-hadoop2.7"

# Add the following paths to the system path. Please check your installation
# to make sure that these zip files actually exist. The names might change
# as versions change.
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib"))
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib", "pyspark.zip"))
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib", "py4j-0.10.4-src.zip"))

# Initiate the Spark context. Once this is done all other applications can run.
from pyspark import SparkContext
from pyspark import SparkConf

# Optionally configure Spark settings
conf = SparkConf()

# This is the line that raises the exception above.
sc = SparkContext('local', conf=conf)
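
If the context were created successfully, I would expect a trivial job like this to run (just a sanity check, not part of my actual code):

count = sc.parallelize(range(100)).count()
print(count)  # should print 100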

I have not been able to find a solution to this problem. I have tried changing the environment variables in Spyder ('SPARK_HOME', 'PYSPARK_SUBMIT_ARGS', and 'JAVA_HOME'), but the error persists.
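
For reference, this is roughly how I set those variables from inside the script, before the pyspark imports. The SPARK_HOME value matches the path in the traceback; the JAVA_HOME path below is only a placeholder for my actual JDK directory:

import os

# SPARK_HOME matches the install path shown in the traceback.
os.environ["SPARK_HOME"] = r"C:\Users\ashish.dang\Documents\Softwares\spark-2.1.0-bin-hadoop2.7\spark-2.1.0-bin-hadoop2.7"
# Placeholder path; substitute the real JDK install directory.
os.environ["JAVA_HOME"] = r"C:\Program Files\Java\jdk1.8.0"
# "pyspark-shell" must stay at the end of PYSPARK_SUBMIT_ARGS.
os.environ["PYSPARK_SUBMIT_ARGS"] = "--master local[*] pyspark-shell"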

Could you help me solve this issue, or point me in the right direction?

Thanks.