一、Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf$ConfVars
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf$ConfVars
at org.apache.hive.hcatalog.common.HCatConstants.<clinit>(HCatConstants.java:74)
at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:297)
at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:783)
at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf$ConfVars
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 14 more
Cause: Sqoop's classpath is missing several Hive jars. Copy them from Hive's lib directory:
cp $HIVE_HOME/lib/hive-shims* $SQOOP_HOME/lib/
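On some Hive versions the shims jars alone are not enough: the missing class `org.apache.hadoop.hive.conf.HiveConf` ships in the hive-common jar, so copying that as well may be needed. A minimal sketch (the directory layout and jar version below are hypothetical stand-ins so the snippet is runnable anywhere; on a real cluster `$HIVE_HOME` and `$SQOOP_HOME` point at the actual installations):

```shell
# Hypothetical stand-ins for the real installation directories.
HIVE_HOME=$(mktemp -d)   # stand-in for the real Hive install dir
SQOOP_HOME=$(mktemp -d)  # stand-in for the real Sqoop install dir
mkdir -p "$HIVE_HOME/lib" "$SQOOP_HOME/lib"
# Fake jars standing in for the real ones (version number is hypothetical).
touch "$HIVE_HOME/lib/hive-shims-2.3.7.jar" \
      "$HIVE_HOME/lib/hive-common-2.3.7.jar"  # hive-common holds HiveConf$ConfVars

# The actual fix: copy the Hive jars into Sqoop's lib directory.
cp "$HIVE_HOME"/lib/hive-shims*.jar \
   "$HIVE_HOME"/lib/hive-common*.jar "$SQOOP_HOME/lib/"
ls "$SQOOP_HOME/lib/"
```

After copying, re-run the failing sqoop command; the NoClassDefFoundError should be gone.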
二、ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
Fix: edit sqoop-env.sh and add the line below ($HIVE_HOME is the absolute path of the Hive installation; if the Hive environment variables are already set, it can be written like this):
export HCAT_HOME=$HIVE_HOME/hcatalog
三、ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: HCat exited with status 1
The INFO lines above the exception give the path of an error log file, but searching that log turns up no error at all, which is very annoying, and the problem remains unsolved.
If anyone knows the cause and the fix, please share it. Thanks!
------------------------------------------------------------------------------------------------------------------------------------------------
I ran this command to import into Hive, creating an ORC partitioned table on the fly:
sqoop import \
--connect jdbc:mysql://agent:3306/intelligentCoal \
--username root \
--password 123456 \
--table t_user \
--driver com.mysql.jdbc.Driver \
--hcatalog-database intelligentCoal \
--create-hcatalog-table \
--hcatalog-table t_user_orc \
--hcatalog-partition-keys event_month \
--hcatalog-partition-values 202010 \
--hcatalog-storage-stanza 'stored as orc tblproperties ("orc.compress"="SNAPPY")' \
-m 1
-------------------------------------------
As a workaround, I created the ORC partitioned table first; with --create-hcatalog-table removed, the command below imports successfully:
sqoop import \
--connect jdbc:mysql://agent:3306/intelligentCoal \
--username root \
--password 123456 \
--table t_user \
--driver com.mysql.jdbc.Driver \
--hcatalog-database intelligentCoal \
--hcatalog-table t_user_orc \
--hcatalog-partition-keys event_month \
--hcatalog-partition-values 202010 \
--hcatalog-storage-stanza 'stored as orc tblproperties ("orc.compress"="SNAPPY")' \
-m 1
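For reference, the pre-created table looked roughly like this (the column list is hypothetical; only the database, table name, partition key, and ORC storage clause mirror the sqoop options above):

```sql
-- Hypothetical columns; partition key and storage match the sqoop command.
CREATE TABLE intelligentCoal.t_user_orc (
  id   INT,
  name STRING
)
PARTITIONED BY (event_month STRING)
STORED AS ORC
TBLPROPERTIES ("orc.compress"="SNAPPY");
```

With the table already in place, Sqoop only has to write into the event_month=202010 partition.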
四、Caused by: java.sql.SQLIntegrityConstraintViolationException: ORA-01400: cannot insert NULL
This appears when Sqoop exports from Hive to Oracle:
org.apache.sqoop.mapreduce.AsyncSqlOutputFormat: Got exception in update thread: java.sql.SQLIntegrityConstraintViolationException: ORA-01400: cannot insert NULL into ("ICI"."TAB_TASK"."TASK_FUNC")
The TASK_FUNC column of the TAB_TASK table has a NOT NULL constraint, but the data being exported contains NULLs for it.
Either drop the NOT NULL constraint on task_func, or fill in the missing values for that column before exporting.
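The first option can be expressed directly in Oracle (schema, table, and column names are taken from the ORA-01400 message above):

```sql
-- Drop the NOT NULL constraint so rows with a NULL task_func can be loaded.
ALTER TABLE ICI.TAB_TASK MODIFY (TASK_FUNC NULL);
```

Alternatively, keep the constraint and replace the NULL task_func values on the Hive side (for example with NVL in a staging query) before re-running the export.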