wget command - download files over the network

The name wget is an abbreviation of "web get"; the command downloads files over the network from a specified URL.

wget supports common protocols such as HTTP, HTTPS, and FTP, and can download files directly from the command line.

Unlike the curl command, wget is used only to download files.

The syntax of the wget command is as follows:

wget [OPTIONS] URL

Common options are as follows:

Option                    Meaning
-V                        Show version information
-h                        Show help information
-b                        Go to the background immediately after startup
-c                        Resume an interrupted download from the breakpoint
-O FILENAME               Save the download under the local filename FILENAME
-t N                      Set the number of retries to N (an integer)
-q                        Quiet mode
-i FILENAME               Read the URLs to download from the file FILENAME
-r                        Recursive download
-l DEPTH                  (lowercase L) Maximum recursion depth; 0 or inf means unlimited
--limit-rate=RATE         Limit the download speed to at most RATE
--ftp-user=USERNAME       Username for logging in to the FTP server
--ftp-password=PASSWORD   Password for logging in to the FTP server
--mirror                  Mirror a site
-np                       Do not ascend to the parent directory; commonly used with recursive downloads
-N                        Only fetch files that are newer than the local copy
-P DIRNAME                Save files to the directory DIRNAME
--no-check-certificate    Do not verify the server's certificate; useful when an HTTPS server's certificate cannot be verified (e.g. self-signed)

Example demonstration:

1. Download the specified file

# Without any options: download and save under the original filename in the current directory
[root@myEuler ~]# wget http://192.168.18.18/k8s/docker-20.10.23.tgz
--2023-03-11 08:36:48--  http://192.168.18.18/k8s/docker-20.10.23.tgz
Connecting to 192.168.18.18:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 65976837 (63M) [application/octet-stream]
Saving to: "docker-20.10.23.tgz"

...part of the output omitted...

2. Download and rename the file

# Use the -O option to save the download under a different filename
[root@myEuler ~]# wget -O docker http://192.168.18.18/k8s/docker-20.10.23.tgz
--2023-03-11 08:38:37--  http://192.168.18.18/k8s/docker-20.10.23.tgz
Connecting to 192.168.18.18:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 65976837 (63M) [application/octet-stream]
Saving to: "docker"
...part of the output omitted...

3. Rate-limited download

# Use the --limit-rate=RATE option to cap the download speed
[root@myEuler ~]# wget --limit-rate=218k -O docker2 http://192.168.18.18/k8s/docker-20.10.23.tgz
--2023-03-11 08:41:16--  http://192.168.18.18/k8s/docker-20.10.23.tgz
Connecting to 192.168.18.18:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 65976837 (63M) [application/octet-stream]
Saving to: "docker2"

docker2              5%[==>                        ]   3.56M   218KB/s  eta 4m 39s

4. Put the download task in the background

[root@myEuler ~]# wget -b http://192.168.18.18/k8s/docker-20.10.23.tgz
Continuing in background, pid 34327.
Output will be written to "wget-log".

5. Resume an interrupted download

Resuming interrupted downloads is especially useful for large files. If a download is interrupted part-way through, for example because of a network failure, the -c option resumes it from where it stopped instead of starting over from the beginning.

[root@myEuler ~]# wget -c http://192.168.18.18/images/openEuler-22.03-LTS-SP1-everything-x86_64-dvd.iso
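The interrupt-and-resume workflow can be sketched as follows. The URL is the placeholder server from the example above; `-T 5 -t 1` (5-second timeout, single try) and `|| true` are added only so the sketch fails fast and keeps going if that demo host is unreachable.

```shell
# Hypothetical ISO URL from the example above.
URL=http://192.168.18.18/images/openEuler-22.03-LTS-SP1-everything-x86_64-dvd.iso

# First attempt: gets interrupted part-way (Ctrl-C, network drop, ...).
wget -T 5 -t 1 "$URL" || true

# Retry with -c: wget asks the server for the remaining byte range and
# appends to the existing partial file instead of restarting from byte 0.
wget -c -T 5 -t 1 "$URL" || true
```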

6. Download files from FTP server

If the FTP server allows anonymous access, the download works exactly as it does over HTTP. If a user account is required, supply it with the --ftp-user and --ftp-password options.

[root@myEuler ~]# wget --ftp-user=zhangsan --ftp-password=Mima1234! ftp://192.168.218.115/docker
--2023-03-11 09:16:49--  ftp://192.168.218.115/docker
           => "docker"
Connecting to 192.168.218.115:21... connected.
Logging in as zhangsan ... Logged in!
...remaining output omitted...

7. Recursive download

To download a directory on the site together with all of its subdirectories and files, use the -r option, and specify the recursion depth with -l.

# Quiet recursive download with a recursion depth of 6; do not ascend to the parent directory; save the files to the data directory
[root@myEuler ~]#  wget -qrl 6 -np -P data/ http://192.168.18.18/images/harbor

8. Mirror a site

When you want to mirror a site, use the --mirror option. It is simply a recursive download with no limit on the recursion depth; in essence it is shorthand for -N -r -l inf --no-remove-listing.

[root@myEuler ~]# wget --mirror -P /data/ http://192.168.18.18/images/
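Spelled out, the mirror command above could equally be written in its long form. This is a sketch against the same placeholder server; `-T 5 -t 1` and `|| true` are added only so it fails fast when that host is unreachable.

```shell
# --mirror is shorthand for: -N (timestamping), -r (recursive),
# -l inf (unlimited depth), --no-remove-listing (keep FTP .listing files).
wget -N -r -l inf --no-remove-listing -T 5 -t 1 -P /data/ http://192.168.18.18/images/ || true
```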

However, if the target is a site containing a large number of web pages, the following options are also recommended:

--convert-links: after downloading a page, convert its links to local links
-L: follow relative links only, so that recursion does not wander off to other hosts the site happens to link to
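Putting those options together gives a mirror suitable for local browsing. Again a sketch against the placeholder server, with `-T 5 -t 1` and `|| true` added only so it fails fast if that host is unreachable.

```shell
# Mirror a web site for offline viewing: rewrite links after download
# (--convert-links) and follow only relative links (-L) so recursion
# stays within this site.
wget --mirror --convert-links -L -T 5 -t 1 -P /data/ http://192.168.18.18/images/ || true
```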

9. Read the URLs to download from a file

# List the URLs to download in a file, one per line
[root@myEuler ~]# cat download
http://nginx.org/download/nginx-1.22.1.tar.gz
https://webcdn.m.qq.com/spcmgr/download/QQ9.7.3.28946.exe
https://webcdn.m.qq.com/spcmgr/download/WeChatSetup_3.9.0.28.exe

# Because the list contains HTTPS URLs, it is advisable to add --no-check-certificate; otherwise wget may refuse to download when a certificate cannot be verified
[root@myEuler ~]# wget -i download --no-check-certificate
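The list file itself can be created however you like. A sketch building it with a heredoc, using the first two URLs from the example above; `-T 5 -t 1` and `|| true` are added only so the sketch does not hang if a host is unreachable.

```shell
# Build the URL list, one URL per line.
cat > download <<'EOF'
http://nginx.org/download/nginx-1.22.1.tar.gz
https://webcdn.m.qq.com/spcmgr/download/QQ9.7.3.28946.exe
EOF

# Feed the list to wget; --no-check-certificate skips certificate
# verification for the HTTPS entry.
wget -i download --no-check-certificate -T 5 -t 1 || true
```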


Origin blog.csdn.net/u013007181/article/details/129458534