Introduction to and summary of the wget command

  References:

  https://www.cnblogs.com/ftl1012/p/9265699.html

  https://www.cnblogs.com/lsdb/p/7171779.html

  wget and curl are both commonly used download tools on Linux. The difference is that curl lets you customize request parameters, so it is better at simulating web requests, while wget supports FTP and recursive downloads, so it is better at downloading whole sets of documents. By analogy: curl is like a browser, while wget is like a download manager such as Thunder.

  I only came into contact with wget because of a networking experiment. I had heard of it before, but knew only that it was a download tool, not how to use it or what it could do. Below I summarize the wget commands I have learned and commonly use, for future reference and for deeper study when the need arises.

  First, some of the most practical commands:

wget -c URL            # resume an interrupted download
wget -b URL            # download in the background
wget -i list           # download every URL listed in the file "list"
wget -x -r -A "*.txt" URL    # recursively download all .txt files under the target directory, recreating the directory structure locally
wget -nd -r -A "*.txt" URL   # recursively download all .txt files, without recreating the directory structure locally
wget -x -r -R "*.txt" URL    # recursively download everything except files with the given suffix, recreating the directory structure
wget -nd -r -R "*.txt" URL   # recursively download everything except files with the given suffix, without recreating the directory structure
wget -m URL            # mirror the target site, i.e. make a complete copy
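As a sketch, several of these flags can be combined in one invocation. The URL below is a hypothetical placeholder; the script only builds and prints the command, since actually running it needs network access:

```shell
#!/bin/sh
# Hypothetical URL; substitute a real file you want to fetch.
URL="https://example.com/big-file.iso"

# Combine several of the flags above: resume (-c), run in the
# background (-b), retry up to 5 times (-t 5), 30-second timeout (-T 30).
CMD="wget -c -b -t 5 -T 30 $URL"
echo "$CMD"
# Running it would write progress to ./wget-log; left commented out
# because it needs network access:
# $CMD
```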

  A note here on the concept of recursive download, which is a very important feature. We often hear about recursive search and recursive download. "Recursive" here means the current directory and all of its subdirectories. Without a recursive download, only the files directly in the current directory are downloaded; subdirectories, and the files inside them, are ignored.
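The distinction can be illustrated locally, without any network access, by counting files in a small directory tree the way a non-recursive versus a recursive download would see them (the tree below is invented for the demo):

```shell
#!/bin/sh
# Build a small tree that stands in for a site's directory structure.
tmp=$(mktemp -d)
mkdir -p "$tmp/site/sub1/sub2"
touch "$tmp/site/a.txt" "$tmp/site/sub1/b.txt" "$tmp/site/sub1/sub2/c.txt"

# Non-recursive view: only files directly in the top directory.
top_only=$(find "$tmp/site" -maxdepth 1 -type f | wc -l)

# Recursive view (what wget -r walks): the directory and all subdirectories.
all=$(find "$tmp/site" -type f | wc -l)

echo "top-level files: $top_only, all files: $all"
rm -rf "$tmp"
```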

  The following lists some common parameters of the wget command:

Parameter

Meaning

--help

Display wget's online help. This table lists only some of the parameters; for fuller detail on each, consult the online help

-A

Accept only the specified file types; for example, -A "*.gif" downloads only GIF images. Multiple accepted patterns are separated with ","

‐b

Let wget runs in the background to write the log file in the current directory "wget-log" file

-t [number of times]

Number of connection attempts when wget cannot establish a connection to the server. For example, "-t 120" tries 120 times; "-t 0" retries indefinitely until the connection succeeds

-c

Resume an interrupted HTTP download. This is a very useful setting, especially for large files: if the download is unexpectedly interrupted, the transfer continues from where it left off once the connection is restored, instead of starting over from scratch

-T [number of seconds]

Timeout. "-T 120" means that if the remote server sends no data for 120 seconds, wget gives up and tries to connect again. On a fast network this can be set shorter

-w [number of seconds]

Seconds to wait between two attempts; for example, "-w 100" waits 100 seconds between retries

-Q [bytes]

Limit the total download quota; for example, "-Q2k" means no more than 2 KB in total, "-Q3m" no more than 3 MB

-nd

Do not recreate the directory structure; all files downloaded from the server are placed directly in the current (or specified) directory

-x

The opposite of "-nd": for example, "wget -x http://abc" creates an "abc" subdirectory in the current directory and then rebuilds the server's directory structure level by level until all files have been transferred

-nH

Do not create a top-level directory named after the target host's domain; the target host's directory structure is placed directly under the current directory

--http-user=xxx

--http-passwd=xxx

If the web server requires a username and password, set them with these two options

-i download_list

Download all the URLs listed in the file "download_list", one URL per line
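A quick sketch of preparing such a list file (the URLs are hypothetical placeholders; the actual wget call is left commented out because it needs network access):

```shell
#!/bin/sh
# Create a URL list for wget -i, one URL per line.
cat > download_list <<'EOF'
https://example.com/file1.txt
https://example.com/file2.txt
https://example.com/file3.txt
EOF

# wget would read the file and fetch each URL in turn:
# wget -i download_list
N=$(wc -l < download_list)
echo "URLs listed: $N"
```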

-k

Convert the links in downloaded documents to point to the local copies, so the pages can be browsed offline

--proxy-user=xxx

--proxy-passwd=xxx

If the proxy server requires a username and password, use these two options

-r

--recursive

Recursive download (see the note on recursion above)

-R

Reject the specified file types; for example, -R "*.gif" skips GIF images. Multiple rejected patterns are separated with ","
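The patterns that -A and -R take are shell-style globs matched against file names. A small local illustration of that matching, using the shell's own `case` globbing as a stand-in (no network needed):

```shell
#!/bin/sh
# matches PATTERN NAME -> prints yes/no, using shell glob matching,
# the same style of pattern wget's -A/-R lists apply to file names.
matches() {
  case "$2" in
    $1) echo yes ;;
    *)  echo no ;;
  esac
}

r1=$(matches "*.gif" "logo.gif")   # would be accepted by -A "*.gif"
r2=$(matches "*.gif" "notes.txt")  # would not match
echo "$r1 $r2"
```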

-l [depth]

Maximum recursion depth on the remote server. For example, "-l 5" downloads only directories and files within 5 levels of the starting directory

-m

Mirror option, used when making a complete mirror of a site; it turns on recursion, timestamping, and infinite depth (equivalent to -r -N -l inf --no-remove-listing)

-np

Do not ascend to the parent directory: download only the specified directory of the target site and its subdirectories
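Putting several of the table's options together, a polite site-mirroring command might look like the following sketch (the site URL is a hypothetical placeholder; the script only builds and prints the command, since running it needs network access):

```shell
#!/bin/sh
# Flags drawn from the table above:
#   -m    mirror the site
#   -np   do not ascend to the parent directory
#   -k    convert links for offline browsing
#   -w 2  wait 2 seconds between requests, to be polite to the server
SITE="https://example.com/docs/"
CMD="wget -m -np -k -w 2 $SITE"
echo "$CMD"
# $CMD   # needs network access, so left commented out
```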

 


Origin www.cnblogs.com/chester-cs/p/11762575.html