Installing a Spark Cluster with Docker (clean, without Hadoop etc.)

  https://github.com/mvillarrealb/docker-spark-cluster

1: Prerequisites

  • Docker installed

  • Docker Compose installed (a quick version check follows this list)
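
As a sanity check, confirm both tools are on your PATH before continuing (the exact version output depends on your installation):

# both commands should print a version string
docker --version
docker-compose --version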

2: Build the images

Download docker-spark-cluster, place it under /opt, enter the docker-spark-cluster directory, and run:

chmod +x build-images.sh
./build-images.sh

 This will build the following images (a sketch of what the script runs follows this list):

  • spark-base:2.3.1: A base image based on java:alpine-jdk-8 which ships Scala, Python 3 and Spark 2.3.1

  • spark-master:2.3.1: An image based on the previously created Spark base image, used to create Spark master containers.

  • spark-worker:2.3.1: An image based on the previously created Spark base image, used to create Spark worker containers.

  • spark-submit:2.3.1: An image based on the previously created Spark base image, used to create Spark submit containers (run, deliver the driver, and exit gracefully).
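
For reference, build-images.sh boils down to one docker build call per image. A rough sketch of what it runs is below; the Dockerfile paths are assumptions based on a typical repository layout, so check the actual script:

#!/bin/bash
# Build each image from its Dockerfile directory (paths assumed; see the repo)
docker build -t spark-base:2.3.1 ./docker/base
docker build -t spark-master:2.3.1 ./docker/spark-master
docker build -t spark-worker:2.3.1 ./docker/spark-worker
docker build -t spark-submit:2.3.1 ./docker/spark-submit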

 3: Run docker-compose

   Run:

docker-compose up

    This will create and start the cluster; you can confirm the containers are running with the check below.
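
To list the containers, open another terminal and run the following. docker-compose derives container names from the service names, so yours may differ slightly:

# Expect one master and three workers in the output
docker ps --format "table {{.Names}}\t{{.Status}}"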

4: Verify the cluster

Spark Master

http://10.5.0.2:8080/

Spark Worker 1

http://10.5.0.3:8081/

Spark Worker 2

http://10.5.0.4:8081/

Spark Worker 3

http://10.5.0.5:8081/
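
If you prefer the command line to a browser, a minimal check is to probe the master UI with curl. The standalone master page normally reports its alive workers, so a healthy cluster here should show three:

# Fetch the master UI and look for the worker count
curl -s http://10.5.0.2:8080/ | grep -i "alive workers"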

Reposted from blog.csdn.net/zhanaolu4821/article/details/86312305