Launching the pyspark shell on Windows fails with: "Failed to find Spark jars directory. You need to build Spark before running this program"

D:\Develop tools\spark-2.2.0-bin-hadoop2.7\bin>pyspark2.cmd

'tools\spark-2.2.0-bin-hadoop2.7\bin\..\jars""\' is not recognized as an internal or external command,
operable program or batch file.
Failed to find Spark jars directory.
You need to build Spark before running this program.

Cause: the installation path contains a space (in D:\Develop tools\spark-2.2.0-bin-hadoop2.7\bin, there is a space in "Develop tools"). The bin\*.cmd scripts build the jars path without quoting it robustly, so cmd.exe splits the command at the space and the jars directory is never found.
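A minimal sketch of a pre-flight check for this pitfall, assuming the install path from the error above (substitute your own):

```python
# Hypothetical install path taken from the error above; substitute your own.
spark_home = r"D:\Develop tools\spark-2.2.0-bin-hadoop2.7"

# A space anywhere in the path is what trips up the bin\*.cmd scripts.
if " " in spark_home:
    print("SPARK_HOME contains a space; move Spark to a path without spaces")
    print("and update SPARK_HOME and PATH accordingly.")
```

In practice the fix is simply to reinstall (or move) Spark into a directory whose full path has no spaces, then point SPARK_HOME and PATH at the new location.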


Reposted from www.cnblogs.com/144823836yj/p/11275408.html