Common file operation commands in HDFS

1. -cat
Usage: hadoop fs -cat URI
Description: Output the contents of the file at the given path to stdout.
Example: hadoop fs -cat hdfs://host1:port1/file
hadoop fs -cat file:///file3

2. -copyFromLocal
Usage: hadoop fs -copyFromLocal <localsrc> URI
Description: Copy a local file to HDFS.
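Example (the paths here are illustrative, not from the original): hadoop fs -copyFromLocal localfile.txt /user/file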

3. -copyToLocal
Usage: hadoop fs -copyToLocal URI <localdst>
Description: Copy a file from HDFS to the local file system.
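Example (illustrative paths): hadoop fs -copyToLocal /user/file localfile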

4. -cp
Usage: hadoop fs -cp URI [URI ...] <dest>
Description: Copy files from the source path to the destination path. Multiple source paths may be given, in which case the destination must be a directory.
Example: hadoop fs -cp /user/file /user/files
hadoop fs -cp /user/file1 /user/files /user/dir

5. -du
Usage: hadoop fs -du URI
Description: Display the sizes of all files in a directory, or the size of a single file when a file is specified.
Example: hadoop fs -du /user/dir1
hadoop fs -du hdfs://host:port/user/file

6. -dus
Usage: hadoop fs -dus <args>
Description: Display a summary (total size) of the target file or directory.
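Example (illustrative path): hadoop fs -dus /user/dir1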

7. -expunge
Usage: hadoop fs -expunge
Description: Empty the trash (recycle bin).

8. -get
Usage: hadoop fs -get <src> <localdst>
Description: Copy files from HDFS to the local file system.
Example: hadoop fs -get /user/file localfile
hadoop fs -get hdfs://host:port/file localfile

9. -ls
Usage: hadoop fs -ls <args>
Description: List a file or directory. For a file, information is returned in the following format:
file name <number of replicas> file size modification date modification time permissions user ID group ID
Example: hadoop fs -ls /user/file
hadoop fs -ls hdfs://host:port/user/dir

10. -lsr
Usage: hadoop fs -lsr <args>
Description: Recursive version of -ls; lists a directory and all of its contents recursively.
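Example (illustrative path): hadoop fs -lsr /user/dir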

11. -mkdir
Usage: hadoop fs -mkdir <paths>
Description: Create the given directory, creating any missing parent directories along the path (like mkdir -p).
Example: hadoop fs -mkdir /user/dir1/dir2/dir3/file
hadoop fs -mkdir hdfs://host:port/user/dir

12. -mv
Usage: hadoop fs -mv URI [URI ...] <dest>
Description: Move files from the source path to the destination path. Multiple source paths may be given, in which case the destination must be a directory; moving files across different file systems is not allowed.
Example: hadoop fs -mv /user/file1 /user/file2
hadoop fs -mv hdfs://host:port/file1 hdfs://host:port/file2

13. -put
Usage: hadoop fs -put <localsrc> <dst>
Description: Copy one or more source paths from the local file system to the destination file system.
Example: hadoop fs -put localfile /user/file
hadoop fs -put localfile hdfs://host:port/user/file

14. -rm
Usage: hadoop fs -rm URI [URI ...]
Description: Delete the specified files. Directories with content are not removed by this command; see -rmr for recursive deletion.
Example: hadoop fs -rm hdfs://host:port/file

15. -rmr
Usage: hadoop fs -rmr URI
Description: Recursive version of -rm; deletes the specified directory and all of its contents.
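Example (illustrative path): hadoop fs -rmr /user/dir1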

16. -setrep
Usage: hadoop fs -setrep [-R] <path>
Description: Change the replication factor of a file. With -R, the change is applied recursively to a directory tree.
Example: hadoop fs -setrep -w 3 -R /user/file

17. -test
Usage: hadoop fs -test -[ezd] URI
Description: Check a file with one of the following options:
-e check whether the file exists; returns 0 if it does;
-z check whether the file is zero length; returns 0 if it is;
-d check whether the path is a directory; returns 1 if it is, otherwise 0.
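Example, checking whether a file exists (illustrative path): hadoop fs -test -e /user/file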

18. -text
Usage: hadoop fs -text <src>
Description: Take a source file and output it in text format. The allowed formats are zip and TextRecordInputStream.
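Example (illustrative path): hadoop fs -text /user/file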
The commands above are the command-line operations used on files in HDFS; they behave much like ordinary file-system commands. For example, to copy a file from the local file system into HDFS, you can run:
$ hadoop fs -copyFromLocal /user/localFile.txt sample.txt
This invokes the fs command with the -copyFromLocal subcommand, copying the local file localFile.txt into the HDFS file system running on localhost as sample.txt.
