【spark】Notes on a PySpark error

Reference: https://segmentfault.com/q/1010000017001524

The error log:

Traceback (most recent call last):
  File "/Users/dingguangwei03/Documents/kuaishou-python/spark-test/test5.py", line 16, in <module>
    sc = SparkContext(conf=conf)
  File "/Users/dingguangwei03/venv/lib/python2.7/site-packages/pyspark/context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "/Users/dingguangwei03/venv/lib/python2.7/site-packages/pyspark/context.py", line 195, in _do_init
    self._encryption_enabled = self._jvm.PythonUtils.getEncryptionEnabled(self._jsc)
  File "/Users/dingguangwei03/venv/lib/python2.7/site-packages/py4j/java_gateway.py", line 1487, in __getattr__
    "{0}.{1} does not exist in the JVM".format(self._fqn, name))
py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM

The script had been running fine when this error suddenly appeared. Searching led me to the link at the top of this post; the error reportedly tends to occur when the pip-installed pyspark package is out of sync with the local Spark installation. Adding the following two lines at the top of the script (before importing pyspark) fixed it:

import findspark
findspark.init()  # must run before any pyspark import
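Roughly speaking, findspark.init() locates a Spark installation (via SPARK_HOME or common install paths) and puts that installation's own copies of pyspark and py4j on sys.path ahead of any pip-installed version, which is why it must run before pyspark is imported. The sketch below only illustrates that idea; it is not findspark's actual implementation, and the /opt/spark default path is a placeholder:

```python
import glob
import os
import sys

def init_spark_path(spark_home=None):
    """Rough illustration (not findspark's real code) of what
    findspark.init() does: resolve SPARK_HOME and prepend Spark's
    bundled pyspark and py4j to sys.path."""
    # Placeholder default path; findspark probes several locations.
    spark_home = spark_home or os.environ.get("SPARK_HOME", "/opt/spark")
    os.environ["SPARK_HOME"] = spark_home
    python_dir = os.path.join(spark_home, "python")
    py4j_zips = glob.glob(os.path.join(python_dir, "lib", "py4j-*-src.zip"))
    # Prepend so these shadow any mismatched pip-installed pyspark.
    sys.path[:0] = [python_dir] + py4j_zips
    return python_dir

# Usage (must precede `import pyspark`):
# init_spark_path()
# from pyspark import SparkContext
```

Because Spark's bundled pyspark always matches the JVM side of the same installation, this sidesteps the "does not exist in the JVM" mismatch above.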

Reposted from blog.csdn.net/qq_30141957/article/details/86694704