When a job is submitted through YARN, the following error is reported:
Failed while trying to construct the redirect url to the log server. Log Server url may not be configured
java.lang.Exception: Unknown container. Container either has not started or has already completed or doesn't belong to this node at all.
The submit command was:
bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
--master yarn \
--deploy-mode cluster \
/home/hadoop/spark/spark-2.4.5-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.4.5.jar
Open http://centoshadoop1:8088/cluster/apps to check the job status:
Click the application ID application_1586223718291_0004 to open the next page:
Click the Logs link to open the log details; the error appears as follows:
Solution:
cd /home/hadoop/hadoop-ha/hadoop/hadoop-2.8.5/etc/hadoop and edit the corresponding Hadoop configuration files.
(1) This problem occurs because the historyserver service has not been started. It is disabled by default and runs as a standalone service. First, configure yarn-site.xml by adding the following:
<property>
    <name>yarn.log.server.url</name>
    <value>http://centoshadoop1:19888/jobhistory/logs</value> <!-- centoshadoop1 is the hostname mapping of the master node -->
</property>
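Note, as an assumption beyond the original steps: the redirect only finds logs for finished containers if YARN log aggregation is turned on; `yarn.log-aggregation-enable` is a standard YARN property that defaults to false. A sketch of the extra yarn-site.xml entry:

```xml
<!-- Assumption: log aggregation may also need to be enabled so that
     finished containers' logs are copied to HDFS for the log server. -->
<property>
    <name>yarn.log-aggregation-enable</name>
    <value>true</value>
</property>
```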
(2) Then add the following to mapred-site.xml; the 19888 port must match the one used in yarn-site.xml:
<property>
    <name>mapreduce.jobhistory.address</name>
    <value>centoshadoop1:10020</value>
</property>
<property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>centoshadoop1:19888</value>
</property>
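A quick sanity check that the two ports agree can be scripted. This is a minimal sketch that works on the literal values from the snippets above; on a real cluster you would read them out of the XML files instead:

```shell
# Values copied from the two config snippets above.
yarn_log_server_url="http://centoshadoop1:19888/jobhistory/logs"
jobhistory_webapp="centoshadoop1:19888"

# Extract the port number from each setting.
yarn_port=$(echo "$yarn_log_server_url" | sed -E 's#^.*:([0-9]+)/.*$#\1#')
webapp_port=${jobhistory_webapp##*:}

if [ "$yarn_port" = "$webapp_port" ]; then
    echo "ports match: $yarn_port"
else
    echo "port mismatch: $yarn_port vs $webapp_port" >&2
    exit 1
fi
```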
(3) Distribute the configuration files to the other machines in the cluster:
scp -r yarn-site.xml hadoop@centoshadoop2:/home/hadoop/hadoop-ha/hadoop/hadoop-2.8.5/etc/hadoop
scp -r mapred-site.xml hadoop@centoshadoop2:/home/hadoop/hadoop-ha/hadoop/hadoop-2.8.5/etc/hadoop
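With more than one worker, the two scp commands are easy to loop. A sketch follows; the node list is hypothetical, and the loop is shown as a dry run that only prints the commands:

```shell
# Hypothetical worker list -- replace with your cluster's hostnames.
NODES="centoshadoop2 centoshadoop3"
CONF_DIR=/home/hadoop/hadoop-ha/hadoop/hadoop-2.8.5/etc/hadoop

for node in $NODES; do
    for f in yarn-site.xml mapred-site.xml; do
        # Dry run: remove 'echo' to actually copy the file.
        echo scp "$CONF_DIR/$f" "hadoop@$node:$CONF_DIR/"
    done
done
```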
(4) On the master node, start the historyserver with:
cd /home/hadoop/hadoop-ha/hadoop/hadoop-2.8.5
sbin/mr-jobhistory-daemon.sh start historyserver
The log output is:
starting historyserver, logging to /home/hadoop/hadoop-ha/hadoop/hadoop-2.8.5/logs/mapred-hadoop-historyserver-centoshadoop1.out
[hadoop@centoshadoop1 hadoop-2.8.5]$ jps
14051 DataNode
14548 NodeManager
56292 HMaster
56452 HRegionServer
14421 ResourceManager
13926 NameNode
25737 JobHistoryServer ---- this is the JVM process we just started
14284 JournalNode
8190 RunJar
4399 DFSZKFailoverController
9839 Master
25855 Jps
Visit http://centoshadoop1:19888/jobhistory/ and the page looks like this:
(5) Resubmit the job and click through to the logs, as shown below:
(6) Click "here" to view the full log details.