- 17/01/16 16:12:23 WARN Utils: Service 'sparkWorker' could not bind on port 0. Attempting port 1.
- (the same warning repeats 16 times as Spark retries the bind)
- Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'sparkWorker' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkWorker' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
- at sun.nio.ch.Net.bind0(Native Method)
- at sun.nio.ch.Net.bind(Net.java:433)
- at sun.nio.ch.Net.bind(Net.java:425)
- at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
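The key part of the exception is "Cannot assign requested address" (EADDRNOTAVAIL): the worker is trying to bind to an IP address that is not assigned to any interface on the machine, so retrying other ports can never succeed. A minimal Python sketch (illustrative, not Spark code) shows that the failure depends on the IP, not the port:

```python
import socket

def can_bind(ip: str) -> bool:
    """Try to bind a listening socket on the given IP, letting the OS pick the port."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((ip, 0))  # port 0: any free port
        return True
    except OSError:  # e.g. EADDRNOTAVAIL, "Cannot assign requested address"
        return False
    finally:
        s.close()

print(can_bind("127.0.0.1"))  # loopback is always assigned to the host
print(can_bind("192.0.2.1"))  # TEST-NET-1 address, not assigned to this host
```

Binding to `127.0.0.1` succeeds on any port, while binding to an address the host does not own fails on every port, which is exactly why Spark gives up after 16 retries.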
The Spark configuration file spark-env.sh is as follows:
- export JAVA_HOME=/usr/jdk1.8
- export SCALA_HOME=/usr/hadoop/scala-2.11.4
- export HADOOP_HOME=/usr/hadoop/hadoop2.7.3
- export HADOOP_CONF_DIR=/usr/hadoop/hadoop2.7.3/etc/hadoop
- export SPARK_MASTER_IP=192.168.9.200
- export SPARK_WORKER_MEMORY=1g
- export SPARK_WORKER_CORES=1
- export SPARK_HOME=/usr/hadoop/spark-2.0.2
After adding SPARK_LOCAL_IP at the end, all three Workers started normally.
Add:
- export SPARK_LOCAL_IP=127.0.0.1
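Setting SPARK_LOCAL_IP=127.0.0.1 works because the loopback address always exists on the host. The usual underlying cause is that the machine's hostname does not resolve to a local IP, so an alternative fix is to add a correct entry to /etc/hosts. A small hedged sketch (plain Python, not part of Spark) for checking name resolution:

```python
import socket

def resolve(host: str):
    """Return the IPv4 address a name resolves to, or None if it does not resolve."""
    try:
        return socket.gethostbyname(host)
    except socket.gaierror:
        return None

print(resolve("localhost"))           # loopback name should always resolve
print(resolve(socket.gethostname()))  # None here suggests /etc/hosts needs an entry
```

If the second call returns None (or an IP not assigned to the machine), Spark's bind will fail with the BindException above until SPARK_LOCAL_IP is set or /etc/hosts is corrected.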