Fixing garbled Chinese comments and table data in a Hadoop cluster's Hive database

Recently, after creating a table with DDL in the test environment, we found that Chinese comments and table data appeared garbled, as shown below.

[Screenshot: querying the metadata shows the garbled Chinese comments]

The cause: Hive's metastore database defaults to the latin1 character set, so Chinese text is corrupted when it is written to the metadata tables.
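You can confirm this by checking the metastore database's character set in MySQL. A minimal check, assuming the metastore database is named `hive` (adjust to your setup):

```sql
-- Show the character set the metastore database was created with
SHOW CREATE DATABASE hive;
-- A result containing "DEFAULT CHARACTER SET latin1" confirms the problem
```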

The solution is as follows:

Set the MySQL encoding in my.cnf.

Under [client], add:

default-character-set=utf8

Under [mysqld], add:

character-set-server=utf8

init_connect='SET NAMES utf8'

Note: on MySQL 5.5 and later, [mysqld] no longer accepts default-character-set (the server will fail to start); character-set-server is the correct option there. Restart MySQL for the changes to take effect.
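After restarting MySQL, it is worth verifying the settings and, since the config change does not convert tables that were already created as latin1, converting the metastore's comment-bearing columns. A hedged sketch, assuming the default Hive metastore schema (table and column names `COLUMNS_V2.COMMENT` and `TABLE_PARAMS.PARAM_VALUE` are from the standard schema; verify against your metastore version before running):

```sql
-- Verify the server-side settings took effect
SHOW VARIABLES LIKE 'character_set%';

-- Convert existing latin1 metadata columns so new Chinese comments store correctly
ALTER TABLE COLUMNS_V2 MODIFY COLUMN COMMENT VARCHAR(256) CHARACTER SET utf8;
ALTER TABLE TABLE_PARAMS MODIFY COLUMN PARAM_VALUE MEDIUMTEXT CHARACTER SET utf8;
```

Comments that were already written while the column was latin1 remain garbled; recreate those tables or update the comments after the conversion.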


Origin blog.csdn.net/u010438126/article/details/131777356