Fixing "ERROR spark.SparkContext: Error initializing SparkContext"

The following error was reported when starting spark-shell today:

(Screenshot of the error log omitted.)
The error log shows that SparkContext initialization failed because the connection to a Hadoop port was refused. My first step was to check the Spark processes that had been started, and they all looked fine:

[root@hadoop102 logs]# showjps.sh 
===================== root@hadoop102 =======================
7315 Worker
7395 Jps
7237 Master
===================== root@hadoop103 =======================
7277 Jps
7199 Worker
===================== root@hadoop104 =======================
7202 Worker
7272 Jps
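Before blaming Spark itself, it is worth confirming whether the NameNode RPC port is actually reachable. A minimal sketch follows; `hadoop102:8020` is an assumption here (8020 is a common default NameNode RPC port), so substitute the host and port from `fs.defaultFS` in your `core-site.xml`:

```shell
# Probe the NameNode RPC port using bash's built-in /dev/tcp pseudo-device.
# host/port are assumptions -- take them from fs.defaultFS in core-site.xml.
host=hadoop102
port=8020
if (echo > "/dev/tcp/${host}/${port}") 2>/dev/null; then
    echo "NameNode port is reachable"
else
    echo "NameNode port is NOT reachable -- start Hadoop (start-dfs.sh) first"
fi
```

If the probe fails, the connection-refused error from spark-shell is expected, since HDFS is not listening.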

Originally, I didn't want to start Hadoop, but since the log said Hadoop could not be reached, I started it anyway. Sure enough, when I reopened spark-shell afterwards, no error was reported.

I then went back to the Spark configuration for Standalone mode and found the cause: I had modified the spark-defaults.conf file in the spark/conf directory to set up the Spark history server, and I had pointed the spark.eventLog.dir property at an HDFS path. As a result, every time spark-submit or spark-shell starts, Spark tries to write its event log to Hadoop's HDFS, which fails when Hadoop is not running. The error above can therefore be fixed in either of two ways:
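The offending history-server section of spark-defaults.conf would have looked something like the sketch below; the host, port, and directory are assumptions for illustration, not the exact values from my cluster:

```properties
# Event logging for the Spark history server (example values).
# With spark.eventLog.dir on HDFS, every spark-submit / spark-shell
# needs a running NameNode to write its event log.
spark.eventLog.enabled          true
spark.eventLog.dir              hdfs://hadoop102:8020/spark-logs
spark.history.fs.logDirectory   hdfs://hadoop102:8020/spark-logs
```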

1. Configure the Spark history server to write its event logs to the local Linux filesystem.

2. Keep HDFS as the event log destination, but always start Hadoop before starting Spark.
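For the first option, spark-defaults.conf could be changed along these lines (the local path is an example; the directory must exist on every node that runs a driver):

```properties
# Option 1: write Spark event logs to the local Linux filesystem,
# so spark-shell no longer depends on HDFS being up.
spark.eventLog.enabled          true
spark.eventLog.dir              file:///opt/module/spark/eventLogs
spark.history.fs.logDirectory   file:///opt/module/spark/eventLogs
```

For the second option, keep the hdfs:// value, but run start-dfs.sh and make sure the event log directory exists on HDFS (e.g. `hdfs dfs -mkdir -p /spark-logs`) before launching spark-shell or spark-submit.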

Origin blog.csdn.net/weixin_44080445/article/details/110137476