Hadoop port reference

HTTP (web UI) ports:

Daemon | Default Port | Configuration Parameter
HDFS Namenode | 50070 | dfs.http.address
Datanodes | 50075 | dfs.datanode.http.address
Secondarynamenode | 50090 | dfs.secondary.http.address
Backup/Checkpoint node * | 50105 | dfs.backup.http.address
MapReduce Jobtracker | 50030 | mapred.job.tracker.http.address
Tasktrackers | 50060 | mapred.task.tracker.http.address

* Replaces the secondarynamenode in 0.21.
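
Any of these web UI addresses can be overridden in the corresponding *-site.xml file. A minimal sketch, assuming a 0.20/1.x-style configuration (the 0.0.0.0 bind address is the usual default; the alternate port 50071 is purely illustrative, not a Hadoop default):

```xml
<!-- hdfs-site.xml: move the namenode web UI off the default 50070 -->
<!-- (the port 50071 is illustrative only) -->
<property>
  <name>dfs.http.address</name>
  <value>0.0.0.0:50071</value>
</property>

<!-- mapred-site.xml: jobtracker web UI, default value shown -->
<property>
  <name>mapred.job.tracker.http.address</name>
  <value>0.0.0.0:50030</value>
</property>
```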


Hadoop daemon IPC/service ports:

Daemon | Default Port | Configuration Parameter | Protocol | Used for
Namenode | 8020 | fs.default.name ① | IPC: ClientProtocol | Filesystem metadata operations
Datanode | 50010 | dfs.datanode.address | Custom Hadoop Xceiver: DataNode and DFSClient | DFS data transfer
Datanode | 50020 | dfs.datanode.ipc.address | IPC: InterDatanodeProtocol, ClientDatanodeProtocol, ClientProtocol | Block metadata operations and recovery
Backupnode | 50100 | dfs.backup.address | Same as namenode | HDFS metadata operations
Jobtracker | Ill-defined ② | mapred.job.tracker | IPC: JobSubmissionProtocol, InterTrackerProtocol | Job submission, tasktracker heartbeats
Tasktracker | 127.0.0.1:0 ③ | mapred.task.tracker.report.address | IPC: TaskUmbilicalProtocol | Communicating with child jobs

① This is the port part of hdfs://host:8020/.
② The default is not well-defined; common values are 8021, 9001, or 8012. See MAPREDUCE-566.
③ Binds to an unused local port.
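
As with the HTTP ports, these RPC endpoints are set in the *-site.xml files; the hostname:port value is where footnotes ① and ② become concrete. A minimal sketch, again assuming a 0.20/1.x-style configuration (the hostnames below are placeholders, and 8021 is only one of the commonly used jobtracker ports):

```xml
<!-- core-site.xml: fs.default.name carries the namenode RPC port (footnote ①) -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://namenode.example.com:8020/</value>
</property>

<!-- mapred-site.xml: mapred.job.tracker has no firm default port (footnote ②);
     8021 shown here as one common choice -->
<property>
  <name>mapred.job.tracker</name>
  <value>jobtracker.example.com:8021</value>
</property>
```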


Excerpted from: http://www.cloudera.com/blog/2009/08/hadoop-default-ports-quick-reference/

-- end --


Reposted from heipark.iteye.com/blog/1199297