Spark cluster startup steps and web UI access

Cluster startup steps: first start HDFS, then start the Spark cluster, and finally submit the jar to the Spark cluster for execution.

1. Start Hadoop
cd /home/***/hadoop-2.7.4/sbin
./start-all.sh
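After start-all.sh returns, a quick way to confirm that the HDFS daemons actually came up is to compare `jps` output against the expected process names. A minimal sketch (the daemon names follow Hadoop 2.x defaults; the helper function is ours, not part of Hadoop):

```shell
# Hypothetical helper: verify that the expected Hadoop 2.x daemons appear
# in a list of running JVM process names (one name per line, as produced
# by `jps | awk '{print $2}'`).
check_hdfs_daemons() {
  names="$1"
  for d in NameNode DataNode SecondaryNameNode; do
    if ! printf '%s\n' "$names" | grep -qx "$d"; then
      echo "missing: $d"
      return 1
    fi
  done
  echo "HDFS daemons up"
}

# On a live cluster you would call:
#   check_hdfs_daemons "$(jps | awk '{print $2}')"
```

If a daemon is missing, check the logs under hadoop-2.7.4/logs before moving on to Spark.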

2. Start Spark
cd /home/***/spark-2.2.0/sbin
./start-all.sh
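Once Spark's start-all.sh finishes, the standalone Master's web UI (port 8080 by default, on the master host) should answer over HTTP. A hedged sketch for polling it; `wait_for_ui` is our own helper, not a Spark tool:

```shell
# Hypothetical helper: poll a URL until it answers or we give up.
wait_for_ui() {
  url="$1"; attempts="$2"
  i=0
  while [ "$i" -lt "$attempts" ]; do
    if curl -sf -o /dev/null "$url"; then
      echo "up"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "timed out"
  return 1
}

# On the master host you might run:
#   wait_for_ui "http://192.168.1.***:8080" 30
```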

3. Submit to Spark
cd /home/***/spark-2.2.0/bin
spark-submit --master local --class com.helloworld.kmeans /home/***/xsd11.jar
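The submit above uses --master local, so the job runs on the driver machine only. To run on the standalone cluster instead, the same command is pointed at the Master's spark:// URL (port 7077 by default). A sketch that only builds and prints the command as a dry run; the class and jar are the article's, and the masked host stays a placeholder:

```shell
# Build the submit command as a string and dry-run it with echo.
# MASTER_HOST is a placeholder; substitute your Master's address.
MASTER_HOST="192.168.1.***"
CMD="spark-submit --master spark://${MASTER_HOST}:7077 --class com.helloworld.kmeans /home/***/xsd11.jar"
echo "$CMD"   # remove the echo to actually submit
```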

How to access the web UIs

1. View the Hadoop UI
http://192.168.1.***:50070

All Applications page (YARN UI, port 8088)
http://192.168.1.***:8088/cluster

2. View the Spark UI (start ./bin/spark-shell first)

Cluster mode (history server): 18088
Standalone mode (application UI): 4044 (the default is 4040; Spark increments the port when it is already taken)
Master port: 7077 (the standalone Master's RPC port used in spark:// URLs, not a web UI)
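To avoid memorizing these, the port numbers listed in this article can be wrapped in a tiny lookup helper (the helper and its service names are ours; adjust the values if your installation uses different ports):

```shell
# Hypothetical lookup: service name -> web-UI port, using the values
# given in this article. Returns non-zero for unknown names.
ui_port() {
  case "$1" in
    hdfs)          echo 50070 ;;
    yarn)          echo 8088  ;;
    spark-app)     echo 4044  ;;
    spark-history) echo 18088 ;;
    *) return 1 ;;
  esac
}

# Usage against the cluster host, e.g.:
#   curl -s "http://192.168.1.***:$(ui_port yarn)/cluster"
```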

Summary: to inspect a job's execution on the UI, remove sc.stop() from the Spark program; otherwise the application UI (port 4040 and up) shuts down together with the SparkContext and can no longer be opened.

 
