Finally got Spark running in IntelliJ IDEA.
There are a few things to note:
1. Two dependency packages need to be added to the project.
2. Pass a VM option at runtime to run Spark locally: -Dspark.master=local
3. How do you check the exact command IDEA uses when it runs? Look at the Run window directly: IDEA prints the full command line at the top of the console.
4. Packaging with IDEA produces a lot of extra files in the JAR; just delete the redundant ones.
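To tie the notes above together, here is a minimal sketch of a Spark job that can be run straight from IDEA. It sets the master to "local" programmatically, which is an alternative to the VM option -Dspark.master=local mentioned in point 2; the object name and the sample data are made up for illustration.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal sketch of a locally runnable Spark job (hypothetical example).
object LocalSparkDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("LocalSparkDemo")
      .setMaster("local") // run inside the IDE, no cluster needed
    val sc = new SparkContext(conf)

    // Trivial word count over an in-memory collection.
    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .collectAsMap()

    counts.foreach(println)
    sc.stop()
  }
}
```

If you prefer the VM-option route instead, leave out setMaster and add -Dspark.master=local under Run > Edit Configurations > VM options.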