Spark 2 SQL dynamic partition error

The error message was:

ERROR yarn.ApplicationMaster: User class threw exception: org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions created is 3464, which is more than 1000. To solve this try to set hive.exec.max.dynamic.partitions to at least 3464.;
org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: Number of dynamic partitions created is 3464, which is more than 1000. To solve this try to set hive.exec.max.dynamic.partitions to at least 3464.;
	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:107)
	at org.apache.spark.sql.hive.HiveExternalCatalog.loadDynamicPartitions(HiveExternalCatalog.scala:829)
	at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult$lzycompute(InsertIntoHiveTable.scala:319)
	at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult(InsertIntoHiveTable.scala:221)
	at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.doExecute(InsertIntoHiveTable.scala:413)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
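For context, the error is thrown by the kind of dynamic-partition insert sketched below (table and column names here are invented for illustration): when the SELECT produces more distinct values of the partition column than `hive.exec.max.dynamic.partitions` allows (1000 by default), Hive rejects the load.

```scala
// Hypothetical sketch of a statement that triggers the error.
// Only the shape matters: the partition column (dt) is not given a fixed
// value, so Hive creates one partition per distinct dt in the result set.
hiveContext.sql(
  """
    |INSERT OVERWRITE TABLE target_table PARTITION (dt)
    |SELECT id, name, dt   -- partition column listed last
    |FROM source_table
  """.stripMargin)
```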

After some searching, I added the following Spark code:

hiveContext.sql("set hive.exec.dynamic.partition=true")
hiveContext.sql("set hive.exec.dynamic.partition.mode=nonstrict")
hiveContext.sql("SET hive.exec.max.dynamic.partitions=100000")
hiveContext.sql("SET hive.exec.max.dynamic.partitions.pernode=100000")

It still failed with the same error.
https://github.com/apache/spark/pull/17223
According to hints found online (see the pull request above), since Spark 2.x these Hive configuration values can no longer be modified from code at runtime.
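Depending on the Spark version, another workaround sometimes reported for this situation is to supply the settings when the session is built, so they reach the Hive client configuration instead of being set at runtime. Whether this takes effect varies by version, so treat it as a sketch rather than a guaranteed fix:

```scala
import org.apache.spark.sql.SparkSession

// Sketch: pass the Hive settings at session build time so they are part of
// the Hive client configuration (runtime SET no longer affects them in 2.x).
val spark = SparkSession.builder()
  .appName("dynamic-partition-insert")
  .config("hive.exec.dynamic.partition", "true")
  .config("hive.exec.dynamic.partition.mode", "nonstrict")
  .config("hive.exec.max.dynamic.partitions", "100000")
  .config("hive.exec.max.dynamic.partitions.pernode", "100000")
  .enableHiveSupport()
  .getOrCreate()
```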

So I tried modifying hive-site.xml instead, adding the following properties:

<property>
    <name>hive.exec.dynamic.partition</name>
    <value>true</value>
</property>
<property>
    <name>hive.exec.dynamic.partition.mode</name>
    <value>nonstrict</value>
</property>
<property>
    <name>hive.exec.max.dynamic.partitions</name>
    <value>100000</value>
</property>
<property>
    <name>hive.exec.max.dynamic.partitions.pernode</name>
    <value>100000</value>
</property>
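After updating the hive-site.xml that the Spark application actually reads (typically the copy under `$SPARK_HOME/conf`) and restarting the job, the effective value can be checked from the session itself; in Spark SQL, a `SET` statement without a value prints the current setting:

```scala
// Check that the raised limit was picked up from hive-site.xml.
spark.sql("SET hive.exec.max.dynamic.partitions").show(truncate = false)
```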

The effort paid off: the job finally ran successfully!

Reposted from blog.csdn.net/lhxsir/article/details/100155322