HDFS Java API Troubleshooting

The official documentation:
https://hadoop.apache.org/docs/r2.9.2/hadoop-project-dist/hadoop-common/SingleCluster.html

Configure passwordless SSH login, so the NameNode and DataNode can communicate:

ssh-keygen -t rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys

Verify that ssh now logs in without asking for a password; after logging in, run exit to leave the session:

ssh localhost
exit


etc/hadoop/core-site.xml

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://192.168.3.127:8020</value>
    </property>
</configuration>
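
With fs.defaultFS in place, a client built on the HDFS Java API can reach the NameNode. A minimal sketch (the address is the value configured above; the class and variable names are just for illustration):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HdfsConnectionCheck {
    public static void main(String[] args) throws Exception {
        // Same address as fs.defaultFS in core-site.xml.
        Configuration configuration = new Configuration();
        FileSystem fileSystem = FileSystem.get(
                new URI("hdfs://192.168.3.127:8020"), configuration);

        // Print the client's home directory to confirm the connection works.
        System.out.println(fileSystem.getHomeDirectory());
        fileSystem.close();
    }
}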


etc/hadoop/hdfs-site.xml

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>

    <property>
        <name>dfs.name.dir</name>
        <value>file:/home/hdfs/name</value>
<description>Where the NameNode stores HDFS namespace metadata</description>
    </property>

    <property>
        <name>dfs.data.dir</name>
        <value>file:/home/hdfs/data</value>
<description>Physical location where the DataNode stores data blocks</description>
    </property>
</configuration>
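
With both files edited, format the NameNode and start HDFS, as described in the single-cluster guide linked above:

bin/hdfs namenode -format
sbin/start-dfs.sh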


Open the required ports (8020: NameNode RPC, 50010: DataNode data transfer, 50070: NameNode web UI):

firewall-cmd --add-port=8020/tcp --permanent
firewall-cmd --add-port=50010/tcp --permanent
firewall-cmd --add-port=50070/tcp --permanent
firewall-cmd --reload
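
To confirm, you can list the ports that are currently open:

firewall-cmd --list-ports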

 

1. java.lang.IllegalArgumentException: URI has an authority component
This error is thrown while running `bin/hdfs namenode -format`.
Check that hdfs-site.xml is configured correctly: the directory values must use the single-slash form file:/home/hdfs/name, because a double slash (file://home/hdfs/name) makes the URI parser treat "home" as an authority component.

<property>
    <name>dfs.name.dir</name>
    <value>file:/home/hdfs/name</value>
    <description>Where the NameNode stores HDFS namespace metadata</description>
</property>

<property>
    <name>dfs.data.dir</name>
    <value>file:/home/hdfs/data</value>
    <description>Physical location where the DataNode stores data blocks</description>
</property>

 

2. java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
Extract hadoop-2.9.2.tar.gz to D:\app\ and set hadoop.home.dir before any Hadoop classes are loaded:

System.setProperty("hadoop.home.dir", "D:\\app\\hadoop-2.9.2");
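
Alternatively, as the exception message suggests, setting a HADOOP_HOME environment variable that points to the same directory before the JVM starts has the same effect.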

 

3. java.io.FileNotFoundException: Could not locate Hadoop executable
Download winutils.exe and place it under {HADOOP_HOME}\bin, i.e. D:\app\hadoop-2.9.2\bin\winutils.exe.

 

4. Permission denied: user=xxx, access=WRITE, inode="/":root:supergroup:drwxr-xr-x

/**
 * Fix the access error by specifying the Linux user name of the remote Hadoop machine.
 */
private static final String USER = "root";

FileSystem fileSystem = FileSystem.get(new URI(HDFS_PATH), configuration, USER);
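
If running as root is not desirable, another workaround is to loosen the permissions of the target directory on the server side (a quick fix for development, not suitable for production):

bin/hdfs dfs -chmod -R 777 /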

5. java.net.ConnectException: Connection timed out: no further information, together with org.apache.hadoop.ipc.RemoteException: File /hello-hadoop.md could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.

# Open the DataNode port
firewall-cmd --add-port=50010/tcp --permanent
firewall-cmd --reload
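
If the port is open but the error persists (for example, the client is on another network and cannot reach the DataNode's internal IP), one more thing to try is the standard HDFS client property that makes the client connect to DataNodes by hostname; here, configuration is assumed to be the client's Configuration instance:

// Connect to DataNodes via the hostnames the NameNode returns, not internal IPs.
configuration.set("dfs.client.use.datanode.hostname", "true");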

 

6. No FileSystem for scheme "hdfs"

The HDFS client classes are missing from the classpath. Add the following Maven dependencies:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${org.apache.hadoop.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${org.apache.hadoop.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${org.apache.hadoop.version}</version>
</dependency>
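
Putting the pieces together, here is a minimal client sketch that writes a file through the HDFS Java API. The address, user, and paths are the ones assumed in the sections above; adjust them to your environment:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsClientDemo {

    private static final String HDFS_PATH = "hdfs://192.168.3.127:8020";
    private static final String USER = "root";

    public static void main(String[] args) throws Exception {
        // Item 2: point the client at the unpacked Hadoop distribution (Windows only).
        System.setProperty("hadoop.home.dir", "D:\\app\\hadoop-2.9.2");

        Configuration configuration = new Configuration();
        // Item 4: pass the remote Linux user name to avoid the permission error.
        FileSystem fileSystem = FileSystem.get(new URI(HDFS_PATH), configuration, USER);

        // Write a small file; this is the call that fails as in item 5
        // when the DataNode port 50010 is blocked.
        try (FSDataOutputStream out = fileSystem.create(new Path("/hello-hadoop.md"))) {
            out.writeUTF("hello hadoop");
        }

        fileSystem.close();
    }
}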

 

If you have questions, please leave a comment.

Technical exchange group: 282 575 808

--------------------------------------

Disclaimer: this is an original article; reproduction without permission is prohibited!

--------------------------------------
