[Shell-HDFS] Using a shell script to determine whether HDFS files and directories exist

1) Purpose of this document

This article describes how to use a shell script to determine whether an HDFS file or directory exists.

The shell scripts test the following:

1. Test whether the path exists.

2. Test whether the directory exists.

3. Test whether the file exists.

4. Test whether the path size is greater than 0.

5. Test whether the path size is equal to 0.

2) Test principle

The hadoop fs -test command is used to determine whether a file or directory exists; only one test flag can be passed per invocation.

Usage: hadoop fs -test -[defsz] URI
-test: Only one test flag is allowed
Options:
-d: if the path is a directory, return 0.
-e: if the path exists, return 0.
-f: if the path is a file, return 0.
-s: if the path is not empty, return 0.
-z: if the file is zero length, return 0.

Parameter Description

  • -d: returns 0 if the path is a directory
  • -e: returns 0 if the path exists
  • -f: returns 0 if the path is a file
  • -s: returns 0 if the path is not empty (its size is greater than 0 bytes)
  • -z: returns 0 if the file is zero bytes in size, otherwise returns 1
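
As a quick illustration (a minimal sketch; the paths shown are the example paths used later in this article), the exit code can be checked directly on the command line:

hdfs dfs -test -e /user/wdtest         # does the path exist?
echo $?                                # prints 0 if it exists, 1 otherwise

hdfs dfs -test -d /user/wdtest/hosts   # is the path a directory?
echo $?                                # prints 1 here: the path is a file, not a directory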

3) Shell script test

Prepare a test directory, containing a file and a folder.

[root@bigdata60 ~]# hadoop fs -ls  /user/wdtest
Found 2 items
drwxr-xr-x   - fusionuser wdtest          0 2018-07-03 13:26  /user/wdtest/.fusion
-rw-r--r--   3 fusionuser wdtest        908 2018-07-06 22:37  /user/wdtest/hosts
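
If you need to set up a similar test directory yourself, a minimal sketch could look like the following (the directory in the listing above already existed in this environment; uploading /etc/hosts as the test file is an assumption made here for illustration):

hdfs dfs -mkdir -p /user/wdtest          # create the test directory (with parents)
hdfs dfs -put /etc/hosts /user/wdtest/   # upload a small local file as test data
hdfs dfs -ls /user/wdtest                # verify the contents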

3.1. Test whether the path exists

1. Write the test script as follows.

#!/bin/sh

path=$1

# -e: return 0 if the path exists
hdfs dfs -test -e "${path}"
if [ $? -eq 0 ]; then
   echo "Path exists!"
else
   echo "Path does not exist!"
fi

2. Execute the command ./etest.sh /user/wdtest to run the test script and verify the output.

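Based on the script's logic and the example directory shown earlier, the output looks roughly like this (the nonexistent path /user/wdtest/aa is included to show the else branch):

[root@bigdata60 ~]# ./etest.sh /user/wdtest
Path exists!
[root@bigdata60 ~]# ./etest.sh /user/wdtest/aa
Path does not exist!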

3.2. Test whether the directory exists

1. Write the test script as follows.

#!/bin/bash

path=$1

# -d: return 0 if the path is a directory
hdfs dfs -test -d "${path}"
if [ $? -eq 0 ]; then
  echo "Directory exists!"
else
  echo "Directory does not exist!"
fi

2. Execute the commands ./dtest.sh /user/wdtest and ./dtest.sh /user/wdtest/aa respectively to run the test script and verify the output.

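Given that /user/wdtest is a directory and /user/wdtest/aa does not exist, the expected output is roughly:

[root@bigdata60 ~]# ./dtest.sh /user/wdtest
Directory exists!
[root@bigdata60 ~]# ./dtest.sh /user/wdtest/aa
Directory does not exist!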

3.3. Test whether the file exists

1. Write the test script as follows.

#!/bin/sh

path=$1

# -f: return 0 if the path is a file
hdfs dfs -test -f "${path}"
if [ $? -eq 0 ]; then
   echo "File exists!"
else
   echo "File does not exist!"
fi

2. Execute the commands ./ftest.sh /user/wdtest/hosts and ./ftest.sh /user/wdtest/host respectively to run the test script and verify the output.

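Since /user/wdtest/hosts is a file and /user/wdtest/host does not exist, the expected output is roughly:

[root@bigdata60 ~]# ./ftest.sh /user/wdtest/hosts
File exists!
[root@bigdata60 ~]# ./ftest.sh /user/wdtest/host
File does not exist!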

3.4. Test whether the path size is greater than 0

#!/bin/sh

path=$1

# -s: return 0 if the path is not empty (size greater than zero bytes)
hdfs dfs -test -s "${path}"
if [ $? -eq 0 ]; then
  echo "Path size is greater than zero bytes!"
else
  echo "Path size is not greater than zero bytes!"
fi
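
For example, running this script (saved here under the assumed name stest.sh, which is not given in the original) against the 908-byte hosts file from the example directory would print:

[root@bigdata60 ~]# ./stest.sh /user/wdtest/hosts
Path size is greater than zero bytes!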

3.5. Test whether the path size is equal to 0

#!/bin/sh

path=$1

# -z: return 0 if the file is zero bytes in size
hdfs dfs -test -z "${path}"
if [ $? -eq 0 ]; then
  echo "Path is zero bytes in size!"
else
  echo "Path is not zero bytes in size!"
fi
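
Similarly, running this script (saved under the assumed name ztest.sh) against the same non-empty file would take the else branch:

[root@bigdata60 ~]# ./ztest.sh /user/wdtest/hosts
Path is not zero bytes in size!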

4) Summary

1. You can determine whether an HDFS file or directory exists with hadoop fs -test -[defsz] URI. Only one test flag can be passed at a time, so commands such as hadoop fs -test -df URI or hadoop fs -test -d -f URI are not allowed;

2. hadoop fs -test -[defsz] URI does not print its result; you need to read its exit code through the special variable $?;

3. After checking whether the HDFS file or directory exists, you can follow up with operations such as creating or deleting directories or files. There are also more advanced uses, which are not demonstrated here.
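
As a simple follow-up example (a minimal sketch, not taken from the tests above; the path /user/wdtest/newdir is purely illustrative), a directory can be created only when it does not already exist:

#!/bin/sh

path=/user/wdtest/newdir

# create the directory only if it is not already there
hdfs dfs -test -d "${path}"
if [ $? -ne 0 ]; then
  hdfs dfs -mkdir -p "${path}"
  echo "Created ${path}"
else
  echo "${path} already exists, nothing to do"
fi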
