Cannot create directory /user/hive. Name node is in safe mode. The reported blocks 1 needs additional 4 blocks to reach the threshold 0.9990 of total blocks 5.

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create directory /user/hive. Name node is in safe mode.
The reported blocks 1 needs additional 4 blocks to reach the threshold 0.9990 of total blocks 5.
The number of live datanodes 1 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1335)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3871)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:634)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2217)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2213)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2213)

    at org.apache.hadoop.ipc.Client.call(Client.java:1476)
    at org.apache.hadoop.ipc.Client.call(Client.java:1413)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1880)
    at com.yc.hdfs.MyFileSystem.createDirctor(MyFileSystem.java:80)
    at com.yc.hdfs.MyFileSystem.main(MyFileSystem.java:151)

The exception above was thrown when calling the HDFS Java API to create a directory; it means the Hadoop NameNode is in safe mode.
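
For reference, here is a minimal sketch of the kind of call that triggers this exception. The NameNode address and class name are assumptions for illustration, not the original code:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class MkdirExample {
    public static void main(String[] args) throws Exception {
        // hdfs://localhost:9000 is an assumed NameNode address; adjust for your cluster.
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), new Configuration());
        // While the NameNode is in safe mode, mkdirs fails with the
        // RemoteException(SafeModeException) shown in the stack trace above.
        fs.mkdirs(new Path("/user/hive"));
        fs.close();
    }
}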

The Hadoop web UI gives the following explanation:

Security is off.

Safe mode is ON. The reported blocks 1 needs additional 4 blocks to reach the threshold 0.9990 of total blocks 5. The number of live datanodes 1 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.

22 files and directories, 5 blocks = 27 total filesystem object(s).

Heap Memory used 139.39 MB of 201.5 MB Heap Memory. Max Heap Memory is 889 MB.

Non Heap Memory used 51.56 MB of 52.58 MB Committed Non Heap Memory. Max Non Heap Memory is -1 B.

The root cause of the error is that Hadoop has entered safe mode. In safe mode HDFS is read-only: files can be read and downloaded, but nothing can be created, modified, or deleted. The NameNode stays in safe mode until the DataNodes have reported enough blocks to satisfy the configured threshold (dfs.namenode.safemode.threshold-pct, 0.999 by default), which is the threshold 0.9990 mentioned in the message above.
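
If you want to detect this state from Java before attempting a write, a sketch like the following should work (again assuming hdfs://localhost:9000 as the NameNode address):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.HdfsConstants.SafeModeAction;

public class SafeModeCheck {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), new Configuration());
        DistributedFileSystem dfs = (DistributedFileSystem) fs;
        // SAFEMODE_GET only queries the current state; it does not change it.
        boolean inSafeMode = dfs.setSafeMode(SafeModeAction.SAFEMODE_GET);
        System.out.println("NameNode in safe mode: " + inSafeMode);
        fs.close();
    }
}

The command-line equivalent is hdfs dfsadmin -safemode get.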

The NameNode typically gets stuck in safe mode like this after an abnormal shutdown, for example when the machine suddenly loses power or crashes, leaving some blocks unreported or incomplete.

Because my setup is pseudo-distributed, there is only one DataNode and no extra replicas to recover the missing blocks from, so I simply forced the NameNode out of safe mode and then deleted the incomplete files:

hdfs dfsadmin -safemode leave
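
With no replicas available, the blocks that were never fully reported cannot be recovered after leaving safe mode. One standard way to locate and remove the resulting corrupt files is:

hdfs fsck / -delete

Note that -delete permanently removes the corrupt files, which is acceptable here only because this is a pseudo-distributed test setup.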

Origin blog.csdn.net/weixin_40126236/article/details/86260649