Spark Single-Node Docker Setup

References:

(1) Setting up Spark with Docker:
https://blog.csdn.net/qq_33517844/article/details/92255759
(2) Installing docker-compose:
Step 1 (note the `$(uname -s)` and `$(uname -m)` command substitutions; some pages render them as literal text, which breaks the URL):
curl -L https://get.daocloud.io/docker/compose/releases/download/1.22.0/docker-compose-$(uname -s)-$(uname -m) > /usr/local/bin/docker-compose
Step 2:
chmod +x /usr/local/bin/docker-compose
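A quick sanity check for Step 1: the release asset name is built from two command substitutions, so you can print it before downloading to confirm the URL will be well-formed (a minimal sketch; the exact OS/arch values depend on your machine):

```shell
# Rebuild the release asset name exactly as the curl URL does.
# $(uname -s) expands to the OS name (e.g. Linux), $(uname -m) to the
# architecture (e.g. x86_64) -- they must not be pasted as literal text.
ASSET="docker-compose-$(uname -s)-$(uname -m)"
echo "$ASSET"
```

After Step 2, running `docker-compose --version` should print the installed version if the download succeeded.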
(3) Common docker-compose commands
https://www.cnblogs.com/moxiaoan/p/9299404.html
(4) Docker primer (part 3): docker-compose
https://www.jianshu.com/p/7893af4976d9
(5) docker-compose fails to start containers with "conflicts with network"
https://segmentfault.com/q/1010000012390546
(6) Writing the docker-compose.yml file
version: "2"
services:
  master:
    image: singularities/spark
    command: start-spark master
    hostname: master
    ports:
      - "6066:6066"
      - "7070:7070"
      - "7077:7077"  # Note: the yml in reference (1) omits this mapping, but it must be opened in practice; 7077 is Spark's default master connection port
      - "8080:8080"
      - "50070:50070"
  worker:
    image: singularities/spark
    command: start-spark worker master
    environment:
      SPARK_WORKER_CORES: 1
      SPARK_WORKER_MEMORY: 2g
    links:
      - master

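When copying the YAML above from a web page, curly quotes (“2”) often sneak in and break the YAML parser. A minimal sketch that writes the file with plain ASCII quotes and fails fast if any curly quotes survived the paste:

```shell
# Write docker-compose.yml with plain ASCII quotes; curly quotes from
# web copy-paste break "docker-compose up" with a YAML parse error.
cat > docker-compose.yml <<'EOF'
version: "2"
services:
  master:
    image: singularities/spark
    command: start-spark master
    hostname: master
    ports:
      - "6066:6066"
      - "7070:7070"
      - "7077:7077"
      - "8080:8080"
      - "50070:50070"
  worker:
    image: singularities/spark
    command: start-spark worker master
    environment:
      SPARK_WORKER_CORES: 1
      SPARK_WORKER_MEMORY: 2g
    links:
      - master
EOF

# Fail fast if any curly-quote characters survived the paste.
if grep -q '[“”]' docker-compose.yml; then
  echo "curly quotes found - fix them before running docker-compose up" >&2
  exit 1
fi
echo "docker-compose.yml written"
```

With the file in place, `docker-compose up -d` in the same directory starts the master and one worker; the master web UI is served on port 8080, and clients connect via spark://master:7077 (the mapping flagged in the comment above).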

Reposted from blog.csdn.net/qq_41685616/article/details/105580248