Analyzing and Fixing Spark Log Error Messages: log4j and SLF4J

Copyright notice: this is an original post by the author and may not be reposted without permission. https://blog.csdn.net/anitinaj/article/details/80902083
Spark log error messages
Exception details (a problem that took a long time to resolve):
1. log4j error: the interface org.apache.log4j.Appender and the class org.apache.log4j.ConsoleAppender were loaded by different class loaders, so a ConsoleAppender instance cannot be assigned to an Appender variable; as a result, SparkContext initialization fails.
log4j:ERROR A "org.apache.log4j.ConsoleAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [sun.misc.Launcher$AppClassLoader@55f96302] whereas object of type
log4j:ERROR "org.apache.log4j.ConsoleAppender" was loaded by [org.apache.spark.util.ChildFirstURLClassLoader@687080dc].
log4j:ERROR Could not instantiate appender named "console".
log4j:ERROR A "org.apache.hadoop.log.metrics.EventCounter" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [sun.misc.Launcher$AppClassLoader@55f96302] whereas object of type
log4j:ERROR "org.apache.hadoop.log.metrics.EventCounter" was loaded by [org.apache.spark.util.ChildFirstURLClassLoader@687080dc].
log4j:ERROR Could not instantiate appender named "EventCounter".
log4j:ERROR A "org.apache.log4j.ConsoleAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [sun.misc.Launcher$AppClassLoader@55f96302] whereas object of type
log4j:ERROR "org.apache.log4j.ConsoleAppender" was loaded by [org.apache.spark.util.ChildFirstURLClassLoader@687080dc].
log4j:ERROR Could not instantiate appender named "console".
log4j:ERROR A "org.apache.log4j.varia.NullAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [sun.misc.Launcher$AppClassLoader@55f96302] whereas object of type
log4j:ERROR "org.apache.log4j.varia.NullAppender" was loaded by [org.apache.spark.util.ChildFirstURLClassLoader@687080dc].
log4j:ERROR Could not instantiate appender named "NullAppender".
log4j:ERROR A "org.apache.log4j.ConsoleAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [sun.misc.Launcher$AppClassLoader@55f96302] whereas object of type
log4j:ERROR "org.apache.log4j.ConsoleAppender" was loaded by [org.apache.spark.util.ChildFirstURLClassLoader@687080dc].
log4j:ERROR Could not instantiate appender named "console".
log4j:ERROR A "org.apache.log4j.varia.NullAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [sun.misc.Launcher$AppClassLoader@55f96302] whereas object of type
log4j:ERROR "org.apache.log4j.varia.NullAppender" was loaded by [org.apache.spark.util.ChildFirstURLClassLoader@687080dc].
log4j:ERROR Could not instantiate appender named "NullAppender".
log4j:ERROR A "org.apache.log4j.varia.NullAppender" object is not assignable to a "org.apache.log4j.Appender" variable.
log4j:ERROR The class "org.apache.log4j.Appender" was loaded by
log4j:ERROR [sun.misc.Launcher$AppClassLoader@55f96302] whereas object of type
log4j:ERROR "org.apache.log4j.varia.NullAppender" was loaded by [org.apache.spark.util.ChildFirstURLClassLoader@687080dc].
log4j:ERROR Could not instantiate appender named "NullAppender".
18/06/26 18:29:08 ERROR spark.SparkContext: Error initializing SparkContext.

2. The SLF4J binding actually loaded does not match the type SLF4J resolved against, because multiple bindings are present and were loaded by different class loaders
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/spark-1.6.1/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in thread "main" java.lang.LinkageError: loader constraint violation: when resolving method "org.slf4j.impl.StaticLoggerBinder.getLoggerFactory()Lorg/slf4j/ILoggerFactory;" the class loader (instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current class, org/slf4j/LoggerFactory, and the class loader (instance of sun/misc/Launcher$AppClassLoader) for the method's defining class, org/slf4j/impl/StaticLoggerBinder, have different Class objects for the type org/slf4j/ILoggerFactory used in the signature
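One way to confirm the duplicate binding is to check which of the jars named in the warning actually contain `StaticLoggerBinder`. A quick check, using the jar paths reported in the log above (adjust them to your installation):

```shell
# List the SLF4J binding class inside each jar named in the SLF4J warning.
# Both jars carrying org/slf4j/impl/StaticLoggerBinder.class confirms the
# "multiple bindings" situation.
unzip -l /usr/local/spark-1.6.1/lib/spark-assembly-1.6.1-hadoop2.6.0.jar \
    | grep StaticLoggerBinder
unzip -l /usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar \
    | grep StaticLoggerBinder
```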

Root cause analysis:
1. Multiple log4j jars are on the classpath; the version bundled with the application conflicts with the version in the cluster environment, which triggers the errors above.
2. Likewise, the SLF4J jar the application loads at startup does not match the SLF4J binding shipped with the cluster.
3. How the conflict comes about:
        a. The application itself bundles Spark-related dependencies.
        b. The job is submitted with the options below, so the application's own log4j is initialized first and then clashes with the cluster's log4j jars and configuration; log4j ends up being initialized more than once, and initialization fails:
        --conf spark.executor.userClassPathFirst=true \
        --conf spark.driver.userClassPathFirst=true \
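With the userClassPathFirst flags dropped, the driver and executors fall back to the cluster's own log4j/SLF4J jars. A minimal submit command might look like this (the class name, jar path, and master URL are placeholders, not from the original post):

```shell
# Submit without spark.{driver,executor}.userClassPathFirst, so the cluster's
# log4j/SLF4J jars take precedence and the application's provided-scoped
# copies are never loaded by a child-first class loader.
spark-submit \
    --class com.example.MyApp \
    --master yarn \
    my-app.jar
```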

Solution:
1. Explicitly declare the three jars org.slf4j:slf4j-api, org.slf4j:slf4j-log4j12, and log4j:log4j, and exclude slf4j-api and log4j from slf4j-log4j12.
2. At the same time, keep the slf4j and log4j jars out of the fat jar (mark them provided), and do the same for the bundled Spark dependencies.
For example:
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.16</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.16</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
    </exclusions>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
    <scope>provided</scope>
</dependency>
...
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.6.1</version>
</dependency>
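After adjusting the POM, you can verify that no second copy of slf4j/log4j would still end up in the fat jar; `mvn dependency:tree` shows which artifacts pull them in and in what scope (run in the project root):

```shell
# Show every slf4j/log4j artifact in the resolved dependency tree.
# Any entry still in compile scope would be packaged into the fat jar
# and reintroduce the class-loader conflict.
mvn dependency:tree -Dincludes='org.slf4j:*,log4j:*'
```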
