Basic syntax:
hadoop fs -put <local file> <hdfs file>
The parent directory of the HDFS file must already exist, otherwise the command fails
hadoop fs -put <local file or dir> ... <hdfs dir>
The HDFS dir must already exist, otherwise the command fails
hadoop fs -put - <hdfs file>
Reads input from the keyboard (stdin) into the HDFS file; press Ctrl+D to end the input. The HDFS file must not already exist, otherwise the command fails
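The -put variants above can be sketched as follows. This is a minimal example, not a full reference; it assumes a running HDFS cluster, and the paths under /user/alice and the local file names are hypothetical.

```shell
# Copy a single local file; the parent directory /user/alice must already exist.
hadoop fs -put notes.txt /user/alice/notes.txt

# Copy several local files into an existing HDFS directory.
hadoop fs -put a.log b.log /user/alice/logs

# Read from stdin until Ctrl+D; fails if /user/alice/from_stdin.txt already exists.
hadoop fs -put - /user/alice/from_stdin.txt
```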
Test it yourself.

Basic syntax:
hadoop fs -rm <hdfs file> ...
hadoop fs -rm -r <hdfs dir> ...
Multiple files or directories can be deleted at a time
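A short sketch of batch deletion, again assuming a running cluster and hypothetical paths under /user/alice:

```shell
# Delete several HDFS files in one command.
hadoop fs -rm /user/alice/a.log /user/alice/b.log

# Recursively delete one or more HDFS directories.
hadoop fs -rm -r /user/alice/tmp /user/alice/old_logs
```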
Test it yourself.

Basic syntax:
hadoop fs -get <hdfs file> <local file or dir>
A local file with the same name as the HDFS file must not already exist, otherwise the command reports that the file exists; files without a name conflict are copied to the local filesystem
hadoop fs -get <hdfs file or dir> ... <local dir>
When copying multiple files or directories to the local filesystem, the destination must be a local directory
Note: if the user is not root, the local path should be under that user's home directory, otherwise there will be permission problems.
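The -get variants can be sketched as below; this assumes a running cluster, and the HDFS paths under /user/alice and the local names are hypothetical.

```shell
# Fetch a single file into the current directory, keeping its name.
hadoop fs -get /user/alice/notes.txt .

# Fetch under a different local name to avoid a name clash.
hadoop fs -get /user/alice/notes.txt notes_copy.txt

# Fetch multiple items at once; the destination must be a local directory.
hadoop fs -get /user/alice/a.log /user/alice/logs ./downloads
```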
Test it yourself:
Copy the file to your own directory:
hdfs dfs -get hdfs:///user/hive/lib/sab_risk_parse.py
ref:
1. https://segmentfault.com/a/1190000002672666