Launching the pyspark shell on Windows fails with: "Failed to find Spark jars directory. You need to build Spark before running this program"

D:\Develop tools\spark-2.2.0-bin-hadoop2.7\bin>pyspark2.cmd

'tools\spark-2.2.0-bin-hadoop2.7\bin\..\jars""\' is not recognized as an internal or external command,
operable program or batch file.
Failed to find Spark jars directory.
You need to build Spark before running this program.


Cause: the installation path contains a space (the space in "Develop tools" within D:\Develop tools\spark-2.2.0-bin-hadoop2.7\bin). The launcher batch script expands the path unquoted, so cmd.exe splits it at the space and treats "tools\..." as the command to run. Moving Spark to a path without spaces (e.g. directly under a drive root) avoids the problem.
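As a minimal sketch of the check behind this diagnosis (the helper name and paths below are illustrative, not part of Spark):

```python
def spark_home_has_space(path):
    """Return True if the Spark install path contains a space.

    An unquoted space in SPARK_HOME is what makes cmd.exe split the
    path and report 'tools\\...' as an unknown command (assumption:
    this mirrors the unquoted expansion in Spark's .cmd launchers).
    """
    return " " in path

# The problematic path from the error above:
print(spark_home_has_space(r"D:\Develop tools\spark-2.2.0-bin-hadoop2.7"))  # True
# A hypothetical space-free install location:
print(spark_home_has_space(r"D:\spark-2.2.0-bin-hadoop2.7"))  # False
```

Running this against your own SPARK_HOME before launching pyspark is a quick way to rule this cause in or out.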

Source: www.cnblogs.com/144823836yj/p/11275408.html