Basic HDFS shell operations


Common HDFS commands

HDFS shell commands use the prefix hdfs dfs, which is equivalent to hadoop fs; use whichever form you prefer.
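For example, the following two commands are equivalent and both list the HDFS root directory (the path / is only an illustration):

hdfs dfs -ls /
hadoop fs -ls /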

Command overview

View files recursively

hadoop fs -ls -R /path

Create a multi-level directory

hadoop fs -mkdir -p /abc/mm/ff

Upload multiple files at the same time

hadoop fs -put abc1.txt abc2.txt /abc

This uploads abc1.txt and abc2.txt to /abc. Two files are uploaded here, but you can list as many files as you need; HDFS treats the last argument as the destination directory on HDFS.
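As a sketch, assuming a1.txt, a2.txt and a3.txt are local files, the following uploads all three into /abc in one command:

hadoop fs -put a1.txt a2.txt a3.txt /abc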

Download multiple files and merge

hdfs dfs -getmerge /abc/abc1.txt /abc/abc2.txt /home/hadoop/abc3.txt

The command used here is getmerge, followed by the files to download. Two files are specified here, but more can be listed; every argument except the last must be a path on HDFS, and the last argument is the local destination. The downloaded abc1.txt and abc2.txt are automatically merged into abc3.txt.
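getmerge also accepts a directory as the source. As a sketch, assuming /abc still contains the two files above, the following merges everything under /abc into one local file (the local path merged.txt is only an illustration):

hdfs dfs -getmerge /abc /home/hadoop/merged.txt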

Move file

hadoop fs -mv /abc/abc1.txt /abc/mm
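mv can also rename a file when the destination is a new file path rather than an existing directory. For example (the new name is only an illustration):

hadoop fs -mv /abc/abc2.txt /abc/abc2_new.txt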


Copy file

hadoop fs -cp /abc/mm/123.txt /abc


Other common commands

hadoop fs -tail -f <file>     # follow a file as it grows (like tail -f)

hadoop fs -chmod <permissions> <file>     # change a file's permissions
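As a sketch, assuming the /abc files from the examples above still exist, the following follows abc2.txt as it grows and sets 123.txt to mode 644 (owner read/write, others read):

hadoop fs -tail -f /abc/abc2.txt
hadoop fs -chmod 644 /abc/123.txt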