Accessing HDFS in Hadoop via an absolute path and via a full URI

If you want to access HDFS on the Hadoop server both locally through an absolute path such as "/user/hadoop" and through a full URI such as "hdfs://master.domain.com:9000/user/hadoop", you first need to configure core-site.xml:

<property>  
    <name>fs.defaultFS</name>  
    <value>hdfs://master.domain.com:9000</value>  
</property>  
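As a rough illustration (this is not Hadoop's actual client code, and the helper names are hypothetical), the following Python sketch shows how a bare path like "/user/hadoop" is qualified against the fs.defaultFS value read from core-site.xml:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# A minimal core-site.xml, matching the snippet above.
CORE_SITE = """<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master.domain.com:9000</value>
  </property>
</configuration>"""

def default_fs(xml_text):
    # Find the fs.defaultFS property value in a core-site.xml document.
    for prop in ET.fromstring(xml_text).iter("property"):
        if prop.findtext("name") == "fs.defaultFS":
            return prop.findtext("value")
    return None

def qualify(path, fs_uri):
    # A bare absolute path inherits scheme and authority from fs.defaultFS.
    u = urlparse(fs_uri)
    return f"{u.scheme}://{u.netloc}{path}"

print(qualify("/user/hadoop", default_fs(CORE_SITE)))
# -> hdfs://master.domain.com:9000/user/hadoop
```

This is why "hadoop fs -ls /user/hadoop" works once fs.defaultFS is set: the client fills in the scheme and authority for you.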

The above configuration only lets you browse HDFS with path-style commands such as "hadoop fs -ls /user/hadoop"; for the full-URI form to work, you also need to modify hdfs-site.xml:

<property>  
    <name>dfs.namenode.rpc-address</name>  
    <value>master.domain.com:9000</value>  
</property>  
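To see why the authority in the URI must line up with dfs.namenode.rpc-address, here is a simplified sketch (assumed logic, not Hadoop's real validation) that splits a full HDFS URI into the NameNode authority and the file path and compares the authority against the configured RPC address:

```python
from urllib.parse import urlparse

RPC_ADDRESS = "master.domain.com:9000"  # dfs.namenode.rpc-address

def split_hdfs_uri(uri):
    # Split a full HDFS URI into (NameNode authority, file path).
    u = urlparse(uri)
    if u.scheme != "hdfs":
        raise ValueError("not an hdfs:// URI")
    return u.netloc, u.path

authority, path = split_hdfs_uri("hdfs://master.domain.com:9000/user/hadoop")
print(authority == RPC_ADDRESS, path)
# -> True /user/hadoop
```

If the authority does not match the address the NameNode answers on, the client's RPC connection simply fails, which is the symptom you see before dfs.namenode.rpc-address is configured.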

At this point, if your servers sit in a machine room and the "master.domain.com" domain name resolves to the intranet IP of the NameNode on all machines in the Hadoop cluster, then the above configuration will only let you access HDFS via "hdfs://master.domain.com:9000/user/hadoop" from the intranet machines where the cluster runs. If you also need to access HDFS from outside the machine room, add the following to hdfs-site.xml:

<property>  
    <name>dfs.namenode.rpc-bind-host</name>  
    <value>0.0.0.0</value>  
</property> 
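The interaction between the two settings can be sketched as follows (a minimal illustration of the bind-address selection, not the NameNode's actual code): when dfs.namenode.rpc-bind-host is set, it overrides the host part of dfs.namenode.rpc-address for binding, while the port still comes from dfs.namenode.rpc-address.

```python
def bind_address(rpc_address, rpc_bind_host=None):
    # The NameNode advertises rpc_address to clients, but binds its RPC
    # server to rpc_bind_host when that property is set.
    host, port = rpc_address.rsplit(":", 1)
    if rpc_bind_host:
        host = rpc_bind_host
    return host, int(port)

print(bind_address("master.domain.com:9000"))
# -> ('master.domain.com', 9000)
print(bind_address("master.domain.com:9000", "0.0.0.0"))
# -> ('0.0.0.0', 9000)
```

Binding to "0.0.0.0" means listening on all interfaces, which is what makes the service reachable on both the intranet and external network cards.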

The bind host "0.0.0.0" overrides the host part of "dfs.namenode.rpc-address" above for binding purposes. Restart Hadoop, and the NameNode will listen on port 9000 of both the internal and external network interfaces.
