Cannot create directory /user/hive. Name node is in safe mode.

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create directory /user/hive. Name node is in safe mode.
The reported blocks 1 needs additional 4 blocks to reach the threshold 0.9990 of total blocks 5.
The number of live datanodes 1 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkNameNodeSafeMode(FSNamesystem.java:1335)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3871)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:634)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2217)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2213)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2213)

    at org.apache.hadoop.ipc.Client.call(Client.java:1476)
    at org.apache.hadoop.ipc.Client.call(Client.java:1413)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy10.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1880)
    at com.yc.hdfs.MyFileSystem.createDirctor(MyFileSystem.java:80)
    at com.yc.hdfs.MyFileSystem.main(MyFileSystem.java:151)

This error was thrown when calling the Java API against HDFS to create a directory: Hadoop had entered safe mode.

The Hadoop web UI gives the following explanation:

Security is off.

Safe mode is ON. The reported blocks 1 needs additional 4 blocks to reach the threshold 0.9990 of total blocks 5. The number of live datanodes 1 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.

22 files and directories, 5 blocks = 27 total filesystem object(s).

Heap Memory used 139.39 MB of 201.5 MB Heap Memory. Max Heap Memory is 889 MB.

Non Heap Memory used 51.56 MB of 52.58 MB Committed Non Heap Memory. Max Non Heap Memory is -1 B.
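The "needs additional 4 blocks" figure follows directly from the numbers in the message: 5 total blocks at a 0.9990 threshold means 5 blocks must be reported, and only 1 has been. A minimal sketch of that arithmetic (hypothetical; Hadoop's exact rounding may differ between versions, this assumes a ceiling):

```shell
# Reproduce the safe-mode threshold arithmetic from the NameNode message
total=5          # total blocks in the filesystem
reported=1       # blocks reported so far by datanodes
threshold=0.9990 # dfs.namenode.safemode.threshold-pct
needed=$(awk -v t="$total" -v th="$threshold" -v r="$reported" \
  'BEGIN { goal = int(t * th); if (goal < t * th) goal++; print goal - r }')
echo "needs additional $needed blocks"   # needs additional 4 blocks
```

Once the datanodes report enough blocks to reach that goal, the NameNode leaves safe mode on its own.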

The root cause of the error is that Hadoop has entered safe mode. In safe mode the filesystem is effectively read-only: you can read and download files, but you cannot create, modify, or delete anything.
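You can confirm the safe-mode state from the command line before doing anything destructive; these are standard `hdfs dfsadmin` subcommands (they require a running cluster, so no output is shown here):

```shell
# Query the current safe-mode state (read-only, safe to run anytime)
hdfs dfsadmin -safemode get

# Or block until the NameNode exits safe mode on its own
hdfs dfsadmin -safemode wait
```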

The likely reason it entered safe mode is that the machine lost power or crashed unexpectedly.

Since mine is a pseudo-distributed setup with no replicas to recover from, I simply forced the NameNode out of safe mode and then deleted the incomplete files:

hdfs dfsadmin -safemode leave
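After forcing the NameNode out of safe mode, the files with missing blocks are still corrupt. One way to find and remove them is `hdfs fsck`; since `-delete` is irreversible, list the damage first (commands require a running cluster):

```shell
# List the files whose blocks are corrupt or missing
hdfs fsck / -list-corruptFileBlocks

# Delete the corrupted files (irreversible)
hdfs fsck / -delete
```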


Reposted from blog.csdn.net/weixin_40126236/article/details/86260649