Error when uploading data: KeyValue size too large

The exception is as follows:

Exception in thread "main" java.lang.IllegalArgumentException: KeyValue size too large
    at org.apache.hadoop.hbase.client.HTable.validatePut(HTable.java:1545)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.validatePut(BufferedMutatorImpl.java:175)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.mutate(BufferedMutatorImpl.java:146)
    at mil.nga.giat.geowave.datastore.hbase.operations.HBaseWriter.writeMutations(HBaseWriter.java:94)
    at mil.nga.giat.geowave.datastore.hbase.operations.HBaseWriter.write(HBaseWriter.java:88)
    at mil.nga.giat.geowave.datastore.hbase.operations.HBaseWriter.write(HBaseWriter.java:81)
    at mil.nga.giat.geowave.core.store.base.BaseIndexWriter.write(BaseIndexWriter.java:99)
    at mil.nga.giat.geowave.core.store.base.BaseIndexWriter.write(BaseIndexWriter.java:72)
    at mil.nga.giat.geowave.core.store.index.writer.IndexCompositeWriter.write(IndexCompositeWriter.java:49)
    at com.cwgis.importFeatureHBase.importData(importFeatureHBase.java:120)
    at com.cwgis.App.import_db(App.java:115)
    at com.cwgis.App.main(App.java:51)
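
For context, the HBase client itself enforces this limit before anything is sent to the RegionServer: HTable.validatePut compares the size of each cell in the Put against hbase.client.keyvalue.maxsize (roughly 10 MB by default) and throws IllegalArgumentException when it is exceeded. Below is a minimal sketch that reproduces the same error outside GeoWave, assuming a plain HBase 1.x client and a hypothetical table test_table with column family cf (both names are illustrative only):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.BufferedMutator;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class KeyValueTooLargeDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             // "test_table" is a hypothetical table used only for this demo
             BufferedMutator mutator = conn.getBufferedMutator(TableName.valueOf("test_table"))) {
            // A 20 MB value exceeds the default hbase.client.keyvalue.maxsize,
            // so the client rejects the Put before it ever reaches the cluster.
            byte[] bigValue = new byte[20 * 1024 * 1024];
            Put put = new Put(Bytes.toBytes("row1"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), bigValue);
            mutator.mutate(put); // throws IllegalArgumentException: KeyValue size too large
        }
    }
}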

The root cause is the hbase.client.keyvalue.maxsize setting (and its server-side counterpart).

My solution:
On the cluster, add the following to hbase-site.xml:

<!-- set the maximum KeyValue size to 500 MB (500 * 1024 * 1024 = 524288000 bytes) -->
<property>
  <name>hbase.client.keyvalue.maxsize</name>
  <value>524288000</value>
</property>
<property>
  <name>hbase.server.keyvalue.maxsize</name>
  <value>524288000</value>
</property>

Restart the HBase cluster.
On the client side, also edit the hbase-default.xml packaged inside hbase-shaded-client-1.4.6.jar and hbase-shaded-server-1.4.6.jar in the application's lib directory, setting the same parameters:

<property>
  <name>hbase.client.keyvalue.maxsize</name>
  <value>524288000</value>
</property>
<property>
  <name>hbase.server.keyvalue.maxsize</name>
  <value>524288000</value>
</property>
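
If you control how the HBase Connection is created in your own code, a less invasive alternative to editing the shaded jars is to override the limit programmatically on the client Configuration. This is only a minimal sketch of that approach; whether GeoWave's data store actually picks up such a Configuration depends on how it is wired in your application, so the jar edit above may still be necessary:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class LargeKeyValueClient {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Raise the client-side limit to 500 MB; the server-side
        // hbase.server.keyvalue.maxsize must still be raised in the
        // cluster's hbase-site.xml as described above.
        conf.setLong("hbase.client.keyvalue.maxsize", 524288000L);
        try (Connection conn = ConnectionFactory.createConnection(conf)) {
            // ... create a BufferedMutator or Table from conn and write as usual ...
        }
    }
}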

---- the end ----


Reposted from blog.csdn.net/hsg77/article/details/88314905