Spark startup fails with java.lang.ClassNotFoundException: parquet.hadoop.ParquetOutputCommitter

Starting Spark reports java.lang.ClassNotFoundException: parquet.hadoop.ParquetOutputCommitter.
My installation is hadoop-2.6.0-cdh5.12.1 with spark-1.6.0-cdh5.12.1.

The fix: download the jar corresponding to the Maven dependency below, put it on Spark's startup classpath (see the sketch after the dependency), and restart Spark.

<dependency>
    <groupId>com.twitter</groupId>
    <artifactId>parquet-hadoop</artifactId>
    <version>1.4.3</version>
</dependency>
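
One possible way to do this is sketched below. It is an assumption-based example: the original post only says to put the jar on Spark's startup classpath, so the Maven Central download URL, the lib/ target directory, and the extraClassPath settings are my choices; the install path is taken from the shell prompt in the log further down.

# Assumed Spark install directory (taken from the shell prompt in the log below)
SPARK_HOME=~/install-soft/bigdata/spark-1.6.0-cdh5.12.1

# Fetch the jar from Maven Central (URL assumed from the standard repository
# layout for com.twitter:parquet-hadoop:1.4.3)
wget -P "$SPARK_HOME/lib" \
  https://repo1.maven.org/maven2/com/twitter/parquet-hadoop/1.4.3/parquet-hadoop-1.4.3.jar

# Put the jar on the driver and executor classpaths; on Spark 1.x, exporting
# SPARK_CLASSPATH in conf/spark-env.sh would be another option
cat >> "$SPARK_HOME/conf/spark-defaults.conf" <<EOF
spark.driver.extraClassPath   $SPARK_HOME/lib/parquet-hadoop-1.4.3.jar
spark.executor.extraClassPath $SPARK_HOME/lib/parquet-hadoop-1.4.3.jar
EOF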

Then run

bin/spark-shell

and the shell starts up normally:

levin@Levin-PC:~/install-soft/bigdata/spark-1.6.0-cdh5.12.1/bin$ ./spark-shell 
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_144)
Type in expressions to have them evaluated.
Type :help for more information.
17/09/19 16:40:28 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/09/19 16:40:29 WARN util.Utils: Your hostname, Levin-PC resolves to a loopback address: 127.0.1.1; using 10.236.103.148 instead (on interface enp0s3)
17/09/19 16:40:29 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Spark context available as sc (master = local[*], app id = local-1505810430958).
SQL context available as sqlContext.
scala> 
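
As an extra check that the Parquet committer class is now resolvable, a quick Parquet round trip can be piped into the shell. This is just a minimal smoke test; the /tmp output path is arbitrary and not from the original post.

bin/spark-shell <<'SCALA'
// write a tiny DataFrame as Parquet (this exercises the output committer),
// then read it back and count the rows
val df = sqlContext.range(0, 10)
df.write.mode("overwrite").parquet("/tmp/parquet-smoke-test")
println(sqlContext.read.parquet("/tmp/parquet-smoke-test").count())
SCALA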

Reposted from blog.csdn.net/q2365921/article/details/78032060