Installing Standalone Spark on CentOS

Copyright notice: this is an original post by the author; reproduction without permission is prohibited. https://blog.csdn.net/u010736419/article/details/79346136

1. Configure JAVA_HOME

You can install the latest Java with yum (yum install java). Once Java is installed, we need to configure JAVA_HOME; first, determine where the JDK actually lives by following the symlink chain:

[root@bogon marshall]# which java

/usr/bin/java

[root@bogon marshall]# ls -lrt /usr/bin/java

lrwxrwxrwx. 1 root root 22 Feb 11 19:33 /usr/bin/java -> /etc/alternatives/java

[root@bogon marshall]# ls -lrt /etc/alternatives/java

lrwxrwxrwx. 1 root root 73 Feb 11 19:33 /etc/alternatives/java -> /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.161-0.b14.el7_4.x86_64/jre/bin/java

Then edit /etc/profile and add:

export SCALA_HOME=/opt/scala/scala-2.12.4

export SPARK_HOME=/opt/spark-2.2.1-bin-hadoop2.7

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.161-0.b14.el7_4.x86_64

export JRE_HOME=$JAVA_HOME/jre

export CLASSPATH=$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH

PATH=$PATH:${SCALA_HOME}/bin:${SPARK_HOME}/bin:$JAVA_HOME/bin:$JRE_HOME/bin

export PATH

Then run the following command so the configuration takes effect:

source /etc/profile

Verify the configuration:

[root@bogon opt]# java -version

openjdk version "1.8.0_161"

OpenJDK Runtime Environment (build 1.8.0_161-b14)

OpenJDK 64-Bit Server VM (build 25.161-b14, mixed mode)

2. Install Scala

Download the package: http://www.scala-lang.org/download/

Extract the archive. Here I place all installation files under /opt/; run the extraction command:

tar -xvf scala-2.12.4.tgz

For the /etc/profile configuration, see the JAVA_HOME section above.

Verify the installation:

[root@bogon opt]# scala -version

Scala code runner version 2.12.4 -- Copyright 2002-2017, LAMP/EPFL and Lightbend, Inc.
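Beyond `scala -version`, a quick one-file sanity check confirms the toolchain can actually run code. This sketch (the file name Hello.scala is just an example) can be saved and run with `scala Hello.scala`:

```scala
// Minimal sanity check for a fresh Scala install.
// scala.util.Properties.versionNumberString reports the library
// version string, e.g. "2.12.4" on the setup described here.
object Hello {
  def main(args: Array[String]): Unit = {
    println(s"Hello from Scala ${scala.util.Properties.versionNumberString}")
  }
}
```

If this prints a greeting with a version number, both the binary and the standard library are on the PATH correctly.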

3. Install Spark

Download the package: https://www.apache.org/dyn/closer.lua/spark/spark-2.2.1/spark-2.2.1-bin-hadoop2.7.tgz

Then extract it to the /opt/ directory. The /etc/profile entries were covered in the JAVA_HOME section. Next, switch to the conf directory under the Spark installation:

cp -rp slaves.template slaves

cp -rp spark-env.sh.template spark-env.sh

Edit slaves and set the worker host:

localhost

Edit spark-env.sh (SPARK_MASTER_IP should be your own machine's hostname; the author's is hserver1):

export SCALA_HOME=/opt/scala/scala-2.12.4

export SPARK_HOME=/opt/spark-2.2.1-bin-hadoop2.7

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.161-0.b14.el7_4.x86_64

export SPARK_MASTER_IP=hserver1

export SPARK_EXECUTOR_MEMORY=1G

Test the installation by running the bundled SparkPi example:

cd /opt/spark-2.2.1-bin-hadoop2.7

./bin/run-example SparkPi 10
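The SparkPi example estimates π by Monte Carlo sampling: it scatters random points in the unit square and counts the fraction landing inside the quarter circle of radius 1, which approaches π/4. The core computation, sketched in plain Scala (object and method names here are illustrative, not Spark's actual source):

```scala
// Monte Carlo estimate of pi -- the same idea SparkPi distributes
// across workers, here done on a single thread.
object PiEstimate {
  def estimate(n: Int, seed: Long = 42L): Double = {
    val rng = new scala.util.Random(seed)
    // Count samples that fall inside the unit quarter circle.
    val inside = (1 to n).count { _ =>
      val x = rng.nextDouble()
      val y = rng.nextDouble()
      x * x + y * y <= 1.0
    }
    4.0 * inside.toDouble / n
  }

  def main(args: Array[String]): Unit =
    println(f"Pi is roughly ${estimate(1000000)}%.4f")
}
```

With a million samples the estimate lands close to 3.14; Spark's version parallelizes the sampling, which is why `run-example SparkPi 10` takes a partition count as its argument.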

When the example finishes running, start the interactive shell:

[root@bogon spark-2.2.1-bin-hadoop2.7]# ./bin/spark-shell

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

Setting default log level to "WARN".

To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

18/02/21 19:05:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Spark context Web UI available at http://192.168.44.152:4040

Spark context available as 'sc' (master = local[*], app id = local-1519268738326).

Spark session available as 'spark'.

Welcome to

      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.1
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_161)

Type in expressions to have them evaluated.

Type :help for more information.

scala>

Note that the shell reports Scala 2.11.8 rather than the 2.12.4 installed earlier: the prebuilt Spark 2.2.1 binaries bundle their own Scala runtime.
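At the scala> prompt, `sc` (the SparkContext) and `spark` (the SparkSession) are already defined, as the startup banner says. A typical first smoke test distributes a range over the local cores with `sc.parallelize(1 to 100).reduce(_ + _)`; the sketch below shows that session as comments alongside the plain-Scala equivalent, so you can check the expected result without a cluster:

```scala
// Typed at the spark-shell prompt, this distributes 1..100 over local cores:
//   scala> sc.parallelize(1 to 100).reduce(_ + _)
//   res0: Int = 5050
// The same reduction in plain Scala, for comparison with the shell output:
object SmokeTest {
  def main(args: Array[String]): Unit = {
    val total = (1 to 100).reduce(_ + _)
    println(s"sum of 1..100 = $total") // 5050
  }
}
```

If the shell returns 5050 and the job shows up under the Web UI at port 4040, the standalone installation is working.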
