[spark] pyspark error record

Reference: https://segmentfault.com/q/1010000017001524

Error Log:

Traceback (most recent call last):
  File "/Users/dingguangwei03/Documents/kuaishou-python/spark-test/test5.py", line 16, in <module>
    sc = SparkContext(conf=conf)
  File "/Users/dingguangwei03/venv/lib/python2.7/site-packages/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "/Users/dingguangwei03/venv/lib/python2.7/site-packages/pyspark/context.py", line 195, in _do_init
    self._encryption_enabled = self._jvm.PythonUtils.getEncryptionEnabled(self._jsc)
  File "/Users/dingguangwei03/venv/lib/python2.7/site-packages/py4j/java_gateway.py", line 1487, in __getattr__
    "{0}.{1} does not exist in the JVM".format(self._fqn, name))
py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM

The script had been running fine but suddenly started throwing this error. After reading the link referenced at the top of this article, I added the following two lines at the top of the script, and the problem went away:

import findspark
findspark.init()
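
For context, here is a minimal sketch of how the fix fits into a script like the test5.py from the traceback. findspark.init() must run before any pyspark import; the app name, master, and the small job at the end are placeholders for illustration only, and it assumes SPARK_HOME points at a matching Spark installation:

import findspark
findspark.init()  # locate SPARK_HOME and add its python/py4j dirs to sys.path before importing pyspark

from pyspark import SparkConf, SparkContext

# placeholder configuration; adjust app name and master for your environment
conf = SparkConf().setAppName("test5").setMaster("local[*]")
sc = SparkContext(conf=conf)

# quick sanity check that the context now starts without the Py4JError
print(sc.parallelize([1, 2, 3, 4]).sum())
sc.stop()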

Origin: blog.csdn.net/qq_30141957/article/details/86694704