[Original] Uncle's Experience Sharing (84): spark sql SET hive.exec.max.dynamic.partitions has no effect

spark 2.4

 

The following statement was executed in spark-sql:

set hive.exec.max.dynamic.partitions=10000;
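
A dynamic-partition insert was then run. The failing statement is not shown in the post; for illustration only (table and column names below are made up), it would look roughly like:

insert overwrite table target_db.target_table partition (dt)
select col1, col2, dt from source_db.source_table;  -- one partition per distinct dt; more than 1000 distinct values hits the limit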

It still failed with the following error:

org.apache.hadoop.hive.ql.metadata.HiveException:
Number of dynamic partitions created is 1001, which is more than 1000.
To solve this try to set hive.exec.max.dynamic.partitions to at least 1001.

The default value of hive.exec.max.dynamic.partitions is 1000, and the modification above does not take effect.

 

The reason is the following:

The `HiveClient` does not know the new value. There is no way to change `hive.exec.max.dynamic.partitions` of the `HiveClient` with the `SET` command.

The root cause is that `hive` parameters are passed to the `HiveClient` when it is created. So the workaround is to use `--hiveconf` when starting `spark-shell`.
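
In other words, SET only updates the Spark session configuration; the HiveConf held by the already-created `HiveClient` is not touched. A quick way to see the mismatch (a sketch, not from the original post):

set hive.exec.max.dynamic.partitions=10000;
set hive.exec.max.dynamic.partitions;  -- the session now reports 10000
-- but a dynamic-partition insert still fails at the 1000 limit,
-- because the HiveClient was created earlier with its own HiveConf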

 

The workaround is to pass the setting with --hiveconf when starting spark-sql:

spark-sql --hiveconf hive.exec.max.dynamic.partitions=10000
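
Another commonly used form, which also applies to spark-shell and spark-submit, is to prefix the Hive parameter with spark.hadoop. so that it is copied into the Hadoop configuration from which the HiveClient's HiveConf is built (this alternative is an addition here, not from the original post):

spark-shell --conf spark.hadoop.hive.exec.max.dynamic.partitions=10000

spark-submit accepts the same --conf option.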

 

Reference:

https://issues.apache.org/jira/browse/SPARK-19881


Origin: https://www.cnblogs.com/barneywill/p/11618898.html