Spark pseudo-distributed installation under Ubuntu
Installation packages can be downloaded here: https://download.csdn.net/download/qq_45797116/15869653
One, install Scala
- Unzip the installation package to the specified directory:
tar -zxvf scala-2.10.4.tgz -C /home/
- Rename the unzipped file:
mv scala-2.10.4/ scala
- Configure environment variables:
vi /etc/profile
export SCALA_HOME=/home/scala
export PATH=$PATH:$SCALA_HOME/bin
- Make the environment variables take effect immediately:
source /etc/profile
- Test whether the installation is successful:
scala
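If a scala> prompt appears, the install works (type :quit to leave the REPL). To double-check the version and the PATH entry without entering the REPL, a quick sketch using standard commands:
scala -version    # should report Scala 2.10.4
which scala       # should resolve to /home/scala/bin/scala
echo $SCALA_HOME  # should print /home/scala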
Two, install Spark
- Unzip the installation package to the specified directory:
tar -zxvf spark-2.2.0-bin-hadoop2.7.tgz -C /home/
- Rename the unzipped file (note: the target must be spark, to match SPARK_HOME below, and must not collide with the existing /home/hadoop directory):
mv spark-2.2.0-bin-hadoop2.7/ spark
- Configure environment variables:
vi /etc/profile
export SPARK_HOME=/home/spark
export PATH=$PATH:$SPARK_HOME/bin
- Make the environment variables take effect immediately:
source /etc/profile
- Test whether the installation is successful:
spark-shell
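If the shell starts, you can go one step further and run a tiny job end to end. A minimal sketch, piping a one-line Scala job into spark-shell (5050 is simply the sum of 1..100; at this point the job runs in local mode, since the standalone cluster is not started until section Four):
echo 'sc.parallelize(1 to 100).sum()' | spark-shell
# among the REPL output, expect a line like: res0: Double = 5050.0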
Three, configure the Spark file
- Rename the spark-env.sh.template file (located in /home/spark/conf) to spark-env.sh:
mv spark-env.sh.template spark-env.sh
- Open the spark-env.sh file and add the following content:
vi spark-env.sh
# Environment variable configuration
export JAVA_HOME=/home/java
export HADOOP_HOME=/home/hadoop
export HADOOP_CONF_DIR=/home/hadoop/etc/hadoop
export SCALA_HOME=/home/scala
export SPARK_MASTER_IP=192.168.64.100
export SPARK_MASTER_PORT=7077
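(Note: Spark 2.x prefers SPARK_MASTER_HOST over the older SPARK_MASTER_IP; the old name still works here but the start scripts may print a deprecation warning.) A typo in any of these paths is a common source of startup failures, so it is worth confirming that each directory exists before starting the daemons. A minimal check, assuming the /home/... paths used throughout this guide:
for d in /home/java /home/hadoop /home/hadoop/etc/hadoop /home/scala; do
  [ -d "$d" ] && echo "OK: $d" || echo "MISSING: $d"
done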
Four, test
Before the first test, start Hadoop (the Spark config above points at Hadoop's conf directory):
/home/hadoop/sbin/start-all.sh
Then start Spark, which launches the Master and Worker processes (the full paths matter here, since both Hadoop and Spark ship a start-all.sh):
/home/spark/sbin/start-all.sh
Finally, open http://192.168.64.100:8080/ in a browser to test:
if the master's status reads ALIVE (Active), success! If something looks off, see the process check below.
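If the page does not load or a process seems to be missing, you can list the running JVM daemons from the terminal with the JDK's jps tool (the names below are the standard Hadoop and Spark daemon names):
jps
# expect Master and Worker for Spark standalone, plus
# NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager for Hadoop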
This writeup references another blogger's post. Thanks to the original author!