Viewing logs for Spark on YARN in cluster mode

[hadoop@hadoop001 shell]$ yarn logs -applicationId application_1420997455428_0005
15/01/12 04:34:51 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
/tmp/logs/hadoop/logs/application_1420997455428_0005 does not exist.
Log aggregation has not completed or is not enabled.
The cause is that log aggregation is not enabled, so the container logs were never uploaded to HDFS.
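Without aggregation, container logs only exist on each NodeManager's local disk. A quick way to inspect them directly on the node, as a sketch that assumes the default yarn.nodemanager.log-dirs location under the Hadoop log directory (the exact local path on your install may differ):

# On the NodeManager host, list the local container logs for this application
ls /home/hadoop/app/hadoop-2.6.0-cdh5.7.0/logs/userlogs/application_1420997455428_0005
# Each container subdirectory holds stdout/stderr files that can be read with cat or less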

The fix is as follows:

[hadoop@hadoop001 hadoop]$ vi yarn-site.xml
    <!-- Enable log aggregation -->
    <property>
        <name>yarn.log-aggregation-enable</name>
        <value>true</value>
    </property>
    <!-- HDFS directory where aggregated logs are stored; /tmp/logs lives on HDFS, and you can point this at a custom directory you create yourself -->
    <property>
        <name>yarn.nodemanager.remote-app-log-dir</name>
        <value>/tmp/logs</value>
    </property>
    <!-- How long to retain aggregated logs, in seconds; expired logs are removed automatically -->
    <property>
        <name>yarn.log-aggregation.retain-seconds</name>
        <value>3600</value>
    </property>
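If you point yarn.nodemanager.remote-app-log-dir at a directory of your own, a minimal sketch for preparing it on HDFS (the path and permissions below are only an example; adjust them for your cluster):

# Create the aggregation directory on HDFS and make it writable for job users
hdfs dfs -mkdir -p /tmp/logs
hdfs dfs -chmod -R 1777 /tmp/logs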

After updating yarn-site.xml, restart YARN on the nodes.
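A minimal restart sketch, assuming the stock Hadoop sbin scripts and that HADOOP_HOME is set (on a managed cluster the restart procedure may differ):

# Restart the YARN daemons so the new yarn-site.xml takes effect
$HADOOP_HOME/sbin/stop-yarn.sh
$HADOOP_HOME/sbin/start-yarn.sh
# Note: already-finished applications may need to be re-run before their logs show up aggregated

With aggregation enabled, running the same command again now returns the container logs: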

[hadoop@hadoop001 shell]$ yarn logs -applicationId application_1420997455428_0005
15/01/12 09:53:43 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
Container: container_1421027534751_0001_01_000003 on hadoop001_40205
======================================================================
LogType:stderr
Log Upload Time:Mon Jan 12 09:53:04 +0800 2015
LogLength:7912
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/tmp/hadoop-hadoop/nm-local-dir/usercache/hadoop/filecache/10/__spark_libs__7968601071120350016.zip/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/01/12 09:52:58 INFO executor.CoarseGrainedExecutorBackend: Started daemon with process name: 31008@hadoop001
15/01/12 09:52:58 INFO util.SignalUtils: Registered signal handler for TERM
15/01/12 09:52:58 INFO util.SignalUtils: Registered signal handler for HUP
15/01/12 09:52:58 INFO util.SignalUtils: Registered signal handler for INT
15/01/12 09:52:59 INFO spark.SecurityManager: Changing view acls to: hadoop
15/01/12 09:52:59 INFO spark.SecurityManager: Changing modify acls to: hadoop
15/01/12 09:52:59 INFO spark.SecurityManager: Changing view acls groups to: 
15/01/12 09:52:59 INFO spark.SecurityManager: Changing modify acls groups to: 

Notes: 1. View the logs of a given application (a sketch for looking up application IDs follows these notes)

yarn logs -applicationId application_1420997455428_0005

2. Check the status of a given application

yarn application -status application_1420997455428_0005

3. Kill a given application
(Killing the job from the web UI or with a plain kill in the terminal is not enough; the job may keep running, so use the following command to fully stop the application.)

yarn application -kill application_1420997455428_0005
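If the application ID is not at hand, it can be looked up with the yarn application subcommand; a small sketch:

# List running applications and their IDs
yarn application -list
# Include finished/failed/killed applications as well
yarn application -list -appStates FINISHED,FAILED,KILLED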

Reposted from blog.csdn.net/qq_42694416/article/details/85252277