Installing 64-bit Hadoop on Windows


After finishing the configuration, running start-all produced:
FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Ljava/lang/String;Ljava/lang/String;I)Ljava/io/FileDescriptor;
        at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.createDescriptor0(Native Method)
        at org.apache.hadoop.io.nativeio.SharedFileDescriptorFactory.create(SharedFileDescriptorFactory.java:87)
        at org.apache.hadoop.hdfs.server.datanode.ShortCircuitRegistry.<init>(ShortCircuitRegistry.java:169)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(DataNode.java:586)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:773)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:292)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1893)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1780)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1827)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2003)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2027)

The same error also appears if you start 64-bit Hadoop with a 32-bit JDK: the JVM cannot link Hadoop's 64-bit native library, so the native method lookup fails. Switching to a 64-bit JDK resolves it.
Reference: http://www.cnblogs.com/fanfanfantasy/p/4123412.html
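Assuming a HotSpot-based JDK (where the non-standard `sun.arch.data.model` property is available), a quick sketch for checking the bitness of the JVM you are actually launching Hadoop with:

```java
// Prints the running JVM's bitness. If this reports "32" while the
// Hadoop native library (hadoop.dll) is 64-bit, the UnsatisfiedLinkError
// above is expected: a 32-bit JVM cannot link a 64-bit native library.
public class JvmBitness {
    public static void main(String[] args) {
        // "sun.arch.data.model" is "32" or "64" on HotSpot JVMs;
        // it may be absent on other JVM implementations.
        System.out.println("data model: " + System.getProperty("sun.arch.data.model"));
        // "os.arch" is standard and reflects the JVM's architecture,
        // e.g. "x86" for a 32-bit JVM or "amd64" for a 64-bit one.
        System.out.println("os.arch:    " + System.getProperty("os.arch"));
    }
}
```

Make sure `JAVA_HOME` (and the `java` on your `PATH`) points at the 64-bit JDK before rerunning start-all.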

Reposted from blog.csdn.net/sinat_32258909/article/details/78279278