[Exception] "Specified key was too long; max key length is 767 bytes", and Hive data loss caused by reformatting HDFS

A very common exception may appear in Hive:
Specified key was too long; max key length is 767 bytes
This is a MySQL/Hive character set issue. Change the character set of the Hive metastore database (mysql> alter database ruozedata character set latin1;), then restart Hive and MySQL.
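The fix above can be sketched as the following MySQL session. The database name ruozedata is taken from the example; substitute the name of your own metastore database. This is a server-side command, not a script you run locally:

```sql
-- Switch the Hive metastore database to latin1 so index keys
-- stay within MySQL's 767-byte limit (database name 'ruozedata'
-- is the example's; use your own metastore database name).
ALTER DATABASE ruozedata CHARACTER SET latin1;
```

After running it, restart Hive and MySQL as described above.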


Because HDFS was reformatted, all of Hive's data was lost, and Hive can no longer connect to its metadata. The solution is:
1) Drop the Hive metastore database in MySQL (note: the database Hive created, not MySQL's default databases).
2) Modify javax.jdo.option.ConnectionURL in hive-site.xml so the metadata is stored in another MySQL database, ruozedata1:
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/ruozedata1?createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8</value>
</property>
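One easy mistake in step 2 is the XML escaping: inside hive-site.xml, the literal & that separates the two JDBC parameters must be written as &amp;. A small Python sketch (using the example's URL) shows what the escaped value element should contain:

```python
from xml.sax.saxutils import escape

# Raw JDBC URL for the new metastore database (ruozedata1),
# as it would be passed to the MySQL JDBC driver.
raw_url = ("jdbc:mysql://localhost:3306/ruozedata1"
           "?createDatabaseIfNotExist=true&characterEncoding=UTF-8")

# hive-site.xml is XML, so '&' must be escaped to '&amp;'
# when the URL is placed inside the <value> element.
xml_value = escape(raw_url)
print(xml_value)
# createDatabaseIfNotExist=true&amp;characterEncoding=UTF-8 at the end
```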

3) Create a ruozedata1 database in mysql.
4) Start Hive: hive
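Steps 3 and 4 can be sketched as the shell commands below. The -u root -p credentials are an assumption for illustration; use your own MySQL account. Note that with createDatabaseIfNotExist=true in the URL, step 3 is technically optional, since the JDBC driver can create the database on first connect:

```shell
# Step 3: create the new metastore database in MySQL
# (credentials here are placeholders, not from the original post).
mysql -u root -p -e "CREATE DATABASE ruozedata1;"

# Step 4: start Hive; it will initialize its metadata tables
# in ruozedata1 on first use.
hive
```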

This situation generally does not occur in production, because you would not normally delete all of the data on HDFS.


Origin www.cnblogs.com/huomei/p/12103651.html