Winter Vacation Learning Progress (2)

Installing and Using Spark

Install spark-2.1.0-bin-without-hadoop.tgz.
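The post does not show the extraction commands. A minimal sketch, assuming the tarball was downloaded to the home directory and Spark is to live at /usr/local/spark (the path used later in this post):

```shell
# Extract the Spark archive into /usr/local (path is an assumption
# matching the /usr/local/spark directory used later in this post)
sudo tar -zxf ~/spark-2.1.0-bin-without-hadoop.tgz -C /usr/local/
cd /usr/local
# Rename the versioned directory to the shorter name "spark"
sudo mv ./spark-2.1.0-bin-without-hadoop ./spark
```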

Modify Spark's configuration file spark-env.sh by adding the following line:

export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)
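This setting lets the "without-hadoop" Spark build pick up Hadoop's jars from an existing Hadoop installation. spark-env.sh does not exist by default; a sketch of creating it from the shipped template and appending the line (assuming the /usr/local paths used in this post):

```shell
cd /usr/local/spark
# Create spark-env.sh from the template Spark ships with
cp ./conf/spark-env.sh.template ./conf/spark-env.sh
# Append the classpath export so Spark can find Hadoop's jars
echo 'export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)' >> ./conf/spark-env.sh
```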

Using Spark

In a terminal, first switch to the Spark installation directory:

cd /usr/local/spark

Then run

bin/spark-shell

to start the interactive Spark shell.

If you need to read files from HDFS, Hadoop must also be started first.
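A sketch of starting HDFS before launching spark-shell, assuming Hadoop is installed at /usr/local/hadoop (the path used in the configuration above):

```shell
cd /usr/local/hadoop
# Start the NameNode and DataNode daemons
./sbin/start-dfs.sh
# Verify the daemons are running; jps should list
# NameNode, DataNode, and SecondaryNameNode
jps
```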

Reading a local file:

val textFile=sc.textFile("file:///home/hadoop/test.txt") 
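Once the RDD is created, a couple of basic actions can be run on it in spark-shell to confirm the file was read. A minimal sketch (the path /home/hadoop/test.txt comes from the line above):

```scala
// Inside spark-shell, where sc (the SparkContext) is predefined
val textFile = sc.textFile("file:///home/hadoop/test.txt")

textFile.count()  // number of lines in the file
textFile.first()  // the first line of the file
```

Note that textFile is loaded lazily: no data is read until an action such as count() or first() is invoked.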

Reading a file from HDFS in spark-shell:

val textFile=sc.textFile("hdfs://localhost:9000/user/hadoop/test.txt") 
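As a sketch of what can be done once the HDFS file is loaded, here is a classic word count in spark-shell (the host and port localhost:9000 come from the line above and depend on your fs.defaultFS setting):

```scala
// Inside spark-shell; sc is the predefined SparkContext
val textFile = sc.textFile("hdfs://localhost:9000/user/hadoop/test.txt")

val counts = textFile
  .flatMap(line => line.split(" "))  // split each line into words
  .map(word => (word, 1))            // pair each word with a count of 1
  .reduceByKey(_ + _)                // sum the counts per distinct word

// Bring the results back to the driver and print them
counts.collect().foreach(println)
```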


Origin: www.cnblogs.com/liujinxin123/p/12203161.html