Hadoop Common Commands

Examples of commonly used Hadoop filesystem commands.

1. View the contents of the specified directory: hadoop fs -ls [directory]
[root@cdh01 tmp]# hadoop fs -ls -h /tmp
Found 2 items
drwxrwxrwx - hdfs supergroup 0 2016-01-21 10:24 /tmp/.cloudera_health_monitoring_canary_files
drwx-wx-wx - hive supergroup 0 2016-01-21 10:02 /tmp/hive
[root@cdh01 tmp]# hadoop fs -ls -h /
Found 2 items
drwxrwxrwx - hdfs supergroup 0 2016-01-21 10:02 /tmp
drwxrwxr-x - hdfs supergroup 0 2016-01-21 10:01 /user

2. Upload a local folder to HDFS: hadoop fs -put [local directory] [HDFS directory]
[root@cdh01 /]# mkdir test_put_dir #Create a directory
[root@cdh01 /]# chown hdfs:hadoop test_put_dir #Change ownership to the hdfs user
[root@cdh01 /]# su hdfs #Switch to the hdfs user
[hdfs@cdh01 /]$ ls
bin boot dev dfs dfs_bak etc home lib lib64 lost+found media misc mnt net opt proc root sbin selinux srv sys test_put_dir tmp usr var wawa.txt wbwb.txt wyp.txt
[hdfs@cdh01 /]$ hadoop fs -put test_put_dir /
[hdfs@cdh01 /]$ hadoop fs -ls /
Found 4 items
drwxr-xr-x - hdfs supergroup 0 2016-01-21 11:07 /hff
drwxr-xr-x - hdfs supergroup 0 2016-01-21 15:25 /test_put_dir
drwxrwxrwt - hdfs supergroup 0 2016-01-21 10:39 /tmp
drwxr-xr-x - hdfs supergroup 0 2016-01-21 10:39 /user

3. Create a new directory in HDFS: hadoop fs -mkdir [directory path]
[root@cdh01 /]# su hdfs
[hdfs@cdh01 /]$ hadoop fs -mkdir /hff

4. Create an empty file in an HDFS directory with the touchz command: hadoop fs -touchz [file path]
[hdfs@cdh01 /]$ hadoop fs -touchz /test_put_dir/test_new_file.txt
[hdfs@cdh01 /]$ hadoop fs -ls /test_put_dir
Found 1 items
-rw-r--r-- 3 hdfs supergroup 0 2016-01-21 15:29 /test_put_dir/test_new_file.txt

5. Upload a local file to HDFS: hadoop fs -put [local path] [HDFS directory]
[hdfs@cdh01 /]$ hadoop fs -put wyp.txt /hff #Plain HDFS path
[hdfs@cdh01 /]$ hadoop fs -put wyp.txt hdfs://cdh01.cap.com:8020/hff #Full URI including the NameNode address
Note: the file wyp.txt sits in the local / (root) directory, which looks like this:
bin dfs_bak lib64 mnt root sys var
boot etc lost+found net sbin test_put_dir wawa2.txt
dev home media opt selinux tmp wbwb.txt
dfs lib misc proc srv usr wyp.txt

6. Print the contents of an existing file: hadoop fs -cat [file path]
[hdfs@cdh01 /]$ hadoop fs -cat /hff/wawa.txt
1 Zhang San male 135
2 Liu Li female 235
3 Wang Wu male 335

7. Rename a file or directory on HDFS: hadoop fs -mv [old name] [new name]
[hdfs@cdh01 /]$ hadoop fs -mv /tmp /tmp_bak #Rename the folder

8. Download a file from HDFS into an existing local directory: hadoop fs -get [HDFS path] [local directory]
[hdfs@cdh01 /]$ hadoop fs -get /hff/wawa.txt /test_put_dir
[hdfs@cdh01 /]$ ls -l /test_put_dir/
total 4
-rw-r--r-- 1 hdfs hdfs 42 Jan 21 15:39 wawa.txt

9. Delete the specified file on HDFS: hadoop fs -rm [file path]
[hdfs@cdh01 /]$ hadoop fs -ls /test_put_dir/
Found 2 items
-rw-r--r-- 3 hdfs supergroup 0 2016-01-21 15:41 /test_put_dir/new2.txt
-rw-r--r-- 3 hdfs supergroup 0 2016-01-21 15:29 /test_put_dir/test_new_file.txt
[hdfs@cdh01 /]$ hadoop fs -rm /test_put_dir/new2.txt
16/01/21 15:42:24 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 1440 minutes, Emptier interval = 0 minutes.
Moved: 'hdfs://cdh01.cap.com:8020/test_put_dir/new2.txt' to trash at: hdfs://cdh01.cap.com:8020/user/hdfs/.Trash/Current
[hdfs@cdh01 /]$ hadoop fs -ls /test_put_dir/
Found 1 items
-rw-r--r-- 3 hdfs supergroup 0 2016-01-21 15:29 /test_put_dir/test_new_file.txt

10. Delete the specified folder on HDFS, including its subdirectories: hadoop fs -rm -r [directory path] (the older -rmr form shown below is deprecated but equivalent)
[hdfs@cdh01 /]$ hadoop fs -rmr /test_put_dir
16/01/21 15:50:59 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 1440 minutes, Emptier interval = 0 minutes.
Moved: 'hdfs://cdh01.cap.com:8020/test_put_dir' to trash at: hdfs://cdh01.cap.com:8020/user/hdfs/.Trash/Current
[hdfs@cdh01 /]$ hadoop fs -ls /
Found 3 items
drwxr-xr-x - hdfs supergroup 0 2016-01-21 11:07 /hff
drwxrwxrwt - hdfs supergroup 0 2016-01-21 10:39 /tmp
drwxr-xr-x - hdfs supergroup 0 2016-01-21 15:42 /user

11. Merge all files under an HDFS directory into a single local file: hadoop fs -getmerge [HDFS directory] [local file] (the older hadoop dfs form is deprecated)
hadoop fs -getmerge /user /home/t
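Conceptually, -getmerge concatenates every file under the source directory, in name order, into one local file. A minimal local-only sketch of the same effect, using plain cat on hypothetical part files (no cluster required; the paths below are made up for illustration):

```shell
# Simulate an HDFS output directory with two part files.
mkdir -p /tmp/getmerge_demo
printf 'line from part-00000\n' > /tmp/getmerge_demo/part-00000
printf 'line from part-00001\n' > /tmp/getmerge_demo/part-00001

# Shell globs expand in sorted order, mirroring how -getmerge
# concatenates part files into a single local file.
cat /tmp/getmerge_demo/part-* > /tmp/getmerge_demo_merged.txt
cat /tmp/getmerge_demo_merged.txt
```

This is handy for collecting the part-00000, part-00001, ... outputs of a MapReduce job into one readable file.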

12. Kill a running Hadoop job: hadoop job -kill [job-id] (in newer versions: mapred job -kill [job-id])
hadoop job -kill [job-id]
