Spark Installation and Configuration 03 -- Configuring Spark Standalone HA

------------------------------------------------

1. Install and start ZooKeeper on every node as usual; see: https://blog.csdn.net/With__Sunshine/article/details/88538888
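
Before moving on, it is worth confirming that the ensemble is healthy. A minimal check, assuming zkServer.sh is on the PATH of each node and nc is available (the hostnames s101-s104 and ZooKeeper's default client port 2181 are carried over from the linked post):

# run on each ZooKeeper node; exactly one should report "leader", the rest "follower"
zkServer.sh status

# or probe a node remotely with the "ruok" four-letter command
echo ruok | nc s101 2181    # a healthy server replies "imok"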

------------------------------------------------

2. Edit spark-env.sh as follows (what each property does, and how to verify it, is sketched after the block)

# Comment out the following:
#SPARK_MASTER_HOST=s101
#SPARK_MASTER_PORT=7077

# Add the following:
export SPARK_DAEMON_JAVA_OPTS="
 -Dspark.deploy.recoveryMode=ZOOKEEPER
 -Dspark.deploy.zookeeper.url=s101,s102,s103,s104
 -Dspark.deploy.zookeeper.dir=/spark"
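
These three properties hand leader election and state recovery over to ZooKeeper: spark.deploy.recoveryMode=ZOOKEEPER switches the masters into HA mode, spark.deploy.zookeeper.url lists the ensemble (the client port defaults to 2181 when omitted), and spark.deploy.zookeeper.dir names the znode under which Spark persists worker and application state. Once the masters are up (steps 4-5), that znode can be inspected; a sketch, assuming the zkCli.sh that ships with the ZooKeeper install from step 1:

# list the recovery data Spark keeps in ZooKeeper
zkCli.sh -server s101:2181 ls /spark
# typical children: leader_election and master_status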

------------------------------------------------

3. Distribute the modified file to the other nodes

cd /soft/spark/conf

scp -r spark-env.sh centos@s102:/soft/spark/conf
scp -r spark-env.sh centos@s103:/soft/spark/conf
scp -r spark-env.sh centos@s104:/soft/spark/conf
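
The three copies can equally be scripted; an equivalent loop, assuming the same centos account and passwordless ssh:

for h in s102 s103 s104; do scp spark-env.sh centos@$h:/soft/spark/conf; done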

------------------------------------------------

4. Start the whole cluster from s101

cd /soft/spark/sbin

./start-all.sh
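
Once start-all.sh returns, jps should show a Master process on s101 and a Worker on every node listed in the slaves file; a quick check, assuming passwordless ssh as centos and that jps resolves in a non-interactive shell:

# verify the daemons on every node
for h in s101 s102 s103 s104; do
  echo "== $h =="
  ssh centos@$h jps | grep -E 'Master|Worker'
done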

------------------------------------------------

5. Start a second (standby) master on s102

cd /soft/spark/sbin

./start-master.sh
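
At this point the master on s101 should report itself ALIVE and the one on s102 STANDBY. The state is shown on each master's web UI, port 8080 by default; the exact markup varies by Spark version, so the grep below just looks for the state keyword:

curl -s http://s101:8080 | grep -Eo 'ALIVE|STANDBY'
curl -s http://s102:8080 | grep -Eo 'ALIVE|STANDBY'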

------------------------------------------------

6. Access the HA cluster with spark-shell; listing both masters in the --master URL lets the shell follow a failover

/soft/spark/bin/spark-shell \
--master spark://s101:7077,s102:7077 \
--executor-memory 2g \
--total-executor-cores 2
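
To confirm that failover actually works, stop the active master on s101 while the shell is connected; after the ZooKeeper session times out (tens of seconds by default), s102 should be promoted and the shell reconnect to it. A minimal sketch:

# on s101: stop the active master
/soft/spark/sbin/stop-master.sh

# on s102 (after a short wait): confirm promotion
curl -s http://s102:8080 | grep -Eo 'ALIVE|STANDBY'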
