Common errors and solutions

1. Creating a Hive table in Hue fails because the NameNode is stuck in safe mode.

Solution: force the NameNode out of safe mode:

# hadoop dfsadmin -safemode leave

http://www.linkedin.com/groups/Creating-table-in-Hive-getting-4547204.S.225243871

2. A Hadoop process fails with a Java heap space error (OutOfMemoryError).

Solution: edit hadoop-env.sh and raise the heap limit:

# The maximum amount of heap to use, in MB. Default is 1000.
export HADOOP_HEAPSIZE=2000
#export HADOOP_NAMENODE_INIT_HEAPSIZE=""


3. Hadoop fails to parse job.xml with a character reference error ("Character reference "&# ...") when \001 (Ctrl-A) is used as the output delimiter.

https://hadoopified.wordpress.com/2011/06/24/unicode-charactersctrl-g-or-ctrl-a-as-textoutputformat-hadoop-delimiter/

Another hack would be to provide the delimiter through an XML resource file. The XML version needs to be declared as 1.1, since 1.0 fails to recognize the special Unicode characters: the XML 1.0 spec explicitly omits most of the non-printing characters in the range 0x00 to 0x1F.
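A quick way to see why the resource file must declare XML 1.1: a 1.0 parser rejects a character reference to the BEL character outright. A minimal Python sketch (Python's built-in expat-based parser is used here purely to illustrate the 1.0 restriction):

```python
# XML 1.0 forbids most control characters in the 0x00-0x1F range (only
# tab, LF, and CR are legal), so a character reference like &#7; is
# rejected by a 1.0 parser before the value ever reaches Hadoop.
import xml.etree.ElementTree as ET

doc = '<?xml version="1.0"?><value>&#7;</value>'
try:
    ET.fromstring(doc)
    print("parsed")
except ET.ParseError as err:
    print("rejected:", err)
```

Running this prints the "rejected" branch, which is the same class of failure Hadoop reports for job.xml above.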

Name: mapred.textoutputformat.separator
Value: \u0007

 

<?xml version="1.1"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hadoop.user</name>
    <value>${user.name}</value>
  </property>
  <property>
    <name>mapred.textoutputformat.separator</name>
    <value>\u0007</value>
  </property>
</configuration>

job.xml

This file is never created explicitly by the user. The MapReduce application creates a JobConf, which is serialized into job.xml when the job is submitted.
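The separator configured above is what TextOutputFormat places between key and value on each output line. A minimal Python stand-in for that formatting (format_record is a hypothetical helper for illustration; the real implementation is Java in Hadoop itself):

```python
# Hypothetical stand-in for how Hadoop's TextOutputFormat joins each
# key/value pair: key + separator + value, one record per line. The
# real class reads the separator from the
# mapred.textoutputformat.separator property (default "\t").
def format_record(key, value, separator="\t"):
    return f"{key}{separator}{value}"

# With the Ctrl-G (\u0007) delimiter configured above, a downstream
# consumer can split the line back into fields unambiguously even when
# the values themselves contain tabs or commas:
record = format_record("user42", "2014-05-20", separator="\u0007")
assert record.split("\u0007") == ["user42", "2014-05-20"]
```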


Reposted from ylzhj02.iteye.com/blog/2071088