What a Pitfall! A netty Conflict in Spark Streaming

I was recently writing a Spark Streaming program that processes socket data. The code passed review without issues, but running it locally always failed with the following error:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/10/12 09:59:35 INFO SparkContext: Running Spark version 2.4.3
19/10/12 09:59:35 INFO SparkContext: Submitted application: networkWordCount
19/10/12 09:59:35 INFO SecurityManager: Changing view acls to: wp200
19/10/12 09:59:35 INFO SecurityManager: Changing modify acls to: wp200
19/10/12 09:59:35 INFO SecurityManager: Changing view acls groups to: 
19/10/12 09:59:35 INFO SecurityManager: Changing modify acls groups to: 
19/10/12 09:59:35 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(wp200); groups with view permissions: Set(); users  with modify permissions: Set(wp200); groups with modify permissions: Set()
Exception in thread "main" java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.defaultUseCacheForAllThreads()Z
	at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:120)
	at org.apache.spark.network.server.TransportServer.init(TransportServer.java:98)
	at org.apache.spark.network.server.TransportServer.<init>(TransportServer.java:75)
	at org.apache.spark.network.TransportContext.createServer(TransportContext.java:114)
	at org.apache.spark.rpc.netty.NettyRpcEnv.startServer(NettyRpcEnv.scala:119)
	at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:465)
	at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:464)
	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2269)
	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2261)
	at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:469)
	at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
	at com.github.skywp.train.NetworkWordCount$.main(NetworkWordCount.scala:15)
	at com.github.skywp.train.NetworkWordCount.main(NetworkWordCount.scala)
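For context, the job itself was a standard socket word count. The original source is not shown here, but based on the app name and the stack trace it would look roughly like the following minimal sketch (the host, port, and batch interval are my assumptions):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Minimal sketch of the socket word-count job from the stack trace.
    // Creating the StreamingContext starts Spark's netty-based RPC layer,
    // which is exactly where the NoSuchMethodError above is thrown.
    object NetworkWordCount {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setMaster("local[2]").setAppName("networkWordCount")
        val ssc = new StreamingContext(conf, Seconds(5))

        val lines = ssc.socketTextStream("localhost", 9999)
        val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
        counts.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }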

Original pom.xml:

        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>

        <!-- Hadoop dependency -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.5</version>
        </dependency>
        <!-- HBase server dependency -->
        <!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase-server -->
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-server</artifactId>
            <version>1.3.5</version>
        </dependency>
        <!-- HBase client dependency -->
        <!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase-client -->
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>1.3.5</version>
        </dependency>

        <!-- Spark Streaming dependency -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>${spark.version}</version>
            <!--<exclusions>-->
            <!--<exclusion>-->
            <!--<groupId>io.netty</groupId>-->
            <!--<artifactId>netty-all</artifactId>-->
            <!--</exclusion>-->
            <!--</exclusions>-->
        </dependency>


        <!-- Spark Streaming + Flume integration dependencies -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-flume_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-flume-sink_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>3.5</version>
        </dependency>

        <!-- Spark SQL dependency -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>

        <dependency>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>jackson-module-scala_2.11</artifactId>
            <version>2.6.5</version>
        </dependency>

        <dependency>
            <groupId>net.jpountz.lz4</groupId>
            <artifactId>lz4</artifactId>
            <version>1.3.0</version>
        </dependency>

        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.38</version>
        </dependency>

        <dependency>
            <groupId>org.apache.flume.flume-ng-clients</groupId>
            <artifactId>flume-ng-log4jappender</artifactId>
            <version>1.6.0</version>
        </dependency>

Troubleshooting:

My first attempt was to exclude netty-all from spark-streaming (the commented-out exclusion in the pom above), but that did not work.
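Before reaching for version pins, it helps to find out which netty jar actually wins on the classpath. mvn dependency:tree -Dincludes=io.netty shows Maven's mediation result at build time; the snippet below (my addition, not from the original post) asks the running JVM directly:

    // Diagnostic sketch: print the jar that the conflicting class was
    // loaded from, i.e. whichever netty-all won Maven's dependency mediation.
    object WhichNetty {
      def main(args: Array[String]): Unit = {
        val location = classOf[io.netty.buffer.PooledByteBufAllocator]
          .getProtectionDomain.getCodeSource.getLocation
        println(s"PooledByteBufAllocator loaded from: $location")
      }
    }

If the printed path points at a netty-all 4.0.x jar, the method Spark needs cannot be there.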

Solution:

It took a long hunt to find the cause: the netty-all version Maven actually resolved onto the classpath was too old and simply did not have this method. PooledByteBufAllocator.defaultUseCacheForAllThreads() only exists from netty 4.1 onward, which is the line Spark 2.4.x is built against, but dependency mediation was picking an older netty-all dragged in transitively (in this pom, most likely by the HBase 1.3.5 artifacts), so Spark's call failed at runtime.

Declaring a sufficiently new netty-all directly in the pom fixed it. A direct dependency sits at depth zero, so Maven's "nearest wins" mediation prefers it over every transitive copy; 4.1.17.Final also matches the netty version Spark 2.4.x expects, making it a safe pin.

        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
            <version>4.1.17.Final</version>
        </dependency>
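As a quick sanity check (again my addition, not from the original post), you can call the exact static method from the stack trace once the new jar is in place; it exists from netty 4.1 onward, so it should now return a boolean instead of throwing:

    import io.netty.buffer.PooledByteBufAllocator

    // Calls the very method the NoSuchMethodError complained about.
    object NettyCheck {
      def main(args: Array[String]): Unit = {
        println(PooledByteBufAllocator.defaultUseCacheForAllThreads())
      }
    }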
Reposted from blog.csdn.net/u011536031/article/details/102515075