A detailed explanation of the Linux download command wget

wget is the most commonly used download command on Linux. The general form is: wget + space + the URL of the file to download.

For example: # wget http://www.linuxsense.org/xxxx/xxx.tar.gz

The -c option deserves a brief mention, since it is also very common: it resumes a download from a breakpoint, so if a download is accidentally terminated, you can rerun the command with -c and pick up where it left off.

For example: # wget -c http://www.linuxsense.org/xxxx/xxx.tar.gz

The usage of wget is explained in detail below:

wget is a free tool for automatically downloading files from the web. It supports the HTTP, HTTPS and FTP protocols and can work through an HTTP proxy.

"Automatic download" means that wget can keep running in the background after the user logs out of the system. In other words, you can log in, start a wget download task, then log out, and wget will continue executing in the background until the task completes. This saves a great deal of trouble compared with most browsers, which need the user's continued attention while large amounts of data are being downloaded.

wget can follow the links on HTML pages and download them in order, creating a local copy of the remote server that completely reproduces the original site's directory structure. This is often referred to as "recursive downloading". While downloading recursively, wget obeys the Robot Exclusion standard (/robots.txt). wget can also rewrite links to point to the local files as it downloads, which makes offline browsing easier.

Wget is very stable and copes well with narrow bandwidth and unstable networks. If a download fails for network reasons, wget keeps retrying until the entire file has been fetched. If the server interrupts the download, it reconnects and continues from where it left off. This is useful for downloading large files from servers that limit connection time.
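This retry behaviour maps directly onto wget's options. As a minimal sketch (the URL is the sample host used later in this article, and the command is only printed, not executed over the network):

```shell
#!/bin/sh
# Compose the "keep trying until the file is complete" invocation described above.
# -c resumes a partial file, -t 0 retries without limit, -w 31 waits 31s between tries.
url="http://dsec.pku.edu.cn/BBC.avi"
cmd="wget -c -t 0 -w 31 $url"
printf '%s\n' "$cmd"   # print the command instead of hitting the network
```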

Common uses of wget

The usage format of wget

Usage: wget [OPTION]… [URL]…

* Use wget to do site mirroring:

wget -r -p -np -k http://dsec.pku.edu.cn/~usr_name/

# or

wget -m http://www.tldp.org/LDP/abs/html/

* Resume a partially finished download over an unstable network, or download during idle periods

wget -t 0 -w 31 -c http://dsec.pku.edu.cn/BBC.avi -o down.log &

# Or read the list of files to download from filelist

wget -t 0 -w 31 -c -B ftp://dsec.pku.edu.cn/linuxsoft -i filelist.txt -o down.log &

The commands above are also handy for downloading when the network is relatively idle. My own habit: in Mozilla, I copy URLs that are inconvenient to download right away and paste them into the file filelist.txt, then run the second command above before logging out for the night.
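The filelist workflow can be sketched as follows (the package names are made up for illustration, and the actual network call is left commented out):

```shell
#!/bin/sh
# Sketch of the filelist workflow: queue URLs in a file, then feed it to wget -i.
# The package names below are illustrative, not real files.
cat > filelist.txt <<'EOF'
ftp://dsec.pku.edu.cn/linuxsoft/pkg-one.tar.gz
ftp://dsec.pku.edu.cn/linuxsoft/pkg-two.tar.gz
EOF
n=$(wc -l < filelist.txt)        # number of queued downloads
# The real run would then be:
#   wget -t 0 -w 31 -c -i filelist.txt -o down.log &
rm -f filelist.txt
echo "$n"
```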

* Download using proxy

wget -Y on -p -k https://sourceforge.net/projects/wvware/

The proxy can be set in an environment variable or in the wgetrc file

# Set the proxy in the environment variable

export http_proxy=http://211.90.168.94:8080/

# Set proxy in ~/.wgetrc

http_proxy = http://proxy.yoyodyne.com:18023/

ftp_proxy = http://proxy.yoyodyne.com:18023/
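Putting the environment-variable route into a runnable sketch (proxy.example.com is a placeholder, not a real server):

```shell
#!/bin/sh
# wget reads the http_proxy / ftp_proxy environment variables.
# proxy.example.com is a placeholder host, not a real proxy.
export http_proxy="http://proxy.example.com:8080/"
export ftp_proxy="http://proxy.example.com:8080/"
printf '%s\n' "$http_proxy"
```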

A categorized list of wget options

* Startup

-V, --version display wget version and exit

-h, --help print syntax help

-b, --background switch to background execution after startup

-e, --execute=COMMAND execute a command in `.wgetrc' format; see /etc/wgetrc or ~/.wgetrc for the wgetrc format

* Record and input files

-o, --output-file=FILE write log output to FILE

-a, --append-output=FILE append log output to FILE

-d, --debug print debug output

-q, --quiet quiet mode (no output)


Downloading files over HTTP from the Linux command line
Post by mrchen, 2010-5-23, Views: 101

Incidentally, for files on an FTP server you can also use the ftp command, then fetch the file with its get subcommand.

For those who prefer the command line and want efficient, high-speed downloads, command-line download tools are recommended. They are not only simple to use, but most of them are also fast and efficient, and they are especially suited to downloading files in bulk. These tools are introduced in detail below.

Wget

Wget is a very common command-line download tool, included by default in most Linux distributions. If it is not installed, download the latest version from http://www.gnu.org/software/wget/wget.html and compile and install it with the following commands:

    #tar zxvf wget-1.9.1.tar.gz
    #cd wget-1.9.1
    #./configure
    #make
    #make install

Its usage is very simple; Wget takes the following form: #wget [option] [download address]

1. Common parameters of Wget

◆-b: download in the background (by default, Wget saves files to the current directory).

◆-O: save the download under the file name you specify.

◆-P: save files into the specified directory, creating it first if necessary.

◆-t: number of connection attempts, i.e. how many times Wget retries when it cannot establish a connection to the server.

◆-c: resume an interrupted download from the last breakpoint once the connection is restored.

◆-r: download recursively.

Beyond these common functions, Wget also supports HTTP and FTP proxies; just edit its configuration file "/etc/wgetrc". Specifically, open the file in an editor such as vi, remove the # in front of "http_proxy" and "ftp_proxy", append the address of the corresponding proxy server to each line, then save and exit. Wget can also download an entire website, such as the whole Man manual center, with a single command: #wget -r -p -np -k http://man.chinaunix.net

Here, -r means download recursively; -p means download all the files needed to display a complete web page, such as images; -np means do not ascend to the parent directory; and -k converts absolute links into relative ones.
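The four mirroring options can be assembled step by step, with each flag annotated, as in this small sketch (the command is printed rather than run):

```shell
#!/bin/sh
# Assemble the four mirroring options explained above, one per line.
opts="-r"          # recursive download
opts="$opts -p"    # also fetch images etc. needed to display each page
opts="$opts -np"   # do not ascend to the parent directory
opts="$opts -k"    # rewrite absolute links to relative ones for offline use
cmd="wget $opts http://man.chinaunix.net"
printf '%s\n' "$cmd"
```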

Prozilla

Prozilla is another very popular command-line download tool; it supports multi-threaded downloads and resuming interrupted downloads. The latest 1.3.7.4 package can be downloaded from http://prozilla.genesys.ro/. After downloading, install it with the following commands:

    #tar zxvf prozilla-1.3.7.4.tar.gz
    #cd prozilla-1.3.7.4
    #./configure
    #make
    #make install

The Prozilla command takes the following form: #proz [options] [download address] The commonly used options are:

◆-k=n: download with n threads. Without this option, Prozilla defaults to 4 download threads.

◆-P, --directory-prefix=DIR: Specifies to save the downloaded files in the DIR/ directory.

◆-r, --resume: continue downloading an unfinished file.

To download with a specific number of threads, use for example: #proz -k=5 http://64.12.204.21/pub/mozilla.org/firefox/releases/1.0/linux-i686/zh-CN/firefox-1.0.installer.tar.gz This downloads the file with 5 threads and saves it to the current directory. Like Wget, Prozilla can resume interrupted downloads: after an interruption, re-enter the command above, and when the resume prompt appears, press the R key to continue.

MyGet

MyGet is designed as an extensible multi-threaded download tool with a rich set of options; it supports protocols such as HTTP, FTP, HTTPS, MMS and RTSP. Download its latest version, 0.1.0, from http://myget.sourceforge.net/release/myget-0.1.0.tar.bz2 and install it with the following commands:

    #tar jxvf myget-0.1.0.tar.bz2
    #cd myget-0.1.0
    #./configure
    #make
    #make install

The format of the MyGet command is as follows: #mytget [option] [download URL] Commonly used options:

◆-d [directory]: where to store the downloaded file locally; the default is the current directory.

◆-f [file]: Specifies the download file name.

◆-h: help option.

◆-n [Number of threads]: The number of download threads, the default is 4.

◆-x [proxy server address]: set the proxy server address, e.g. "-x http://user:password@host:port". A typical MyGet invocation looks like: #mytget -d /root/ -n 10 http://lumaqq.linuxsir.org/download/patch/lumaqq_2004t_patch_2005.07.21.00.00.zip

Linuxdown

Linuxdown is a command-line multi-threaded download tool that supports up to 30 download threads. Download the latest 1.1.0 version from https://gro.clinux.org/frs/download.php/1015/linuxdown-1.0.0.tar.gz, then compile and install it with the following commands:

    #tar zxvf linuxdown-1.1.0.tar.gz
    #cd dandelion/
    #make
    #make install

Linuxdown takes the following form: #linuxdown [download address] [options] [number of threads] Note that both the download address and the options must be enclosed in double quotes, and the number of threads cannot exceed 30. A typical download looks like: #linuxdown "http://lumaqq.linuxsir.org/download/patch/lumaqq_2004t_patch_2005.07.21.00.00.zip" 30
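The quoting requirement is a general shell issue, not something specific to Linuxdown: an unquoted '&' in a URL would make the shell background the command at that point. A tiny sketch with a hypothetical URL:

```shell
#!/bin/sh
# Why the quotes matter: an unquoted '&' backgrounds the command,
# and '?' can trigger filename globbing. The URL below is hypothetical.
url='http://example.com/get?file=a.zip&token=42'
printf '%s\n' "$url"   # quoted, so the URL survives intact
```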

Curl

Curl is another good command-line download tool for Linux: it is compact and fast, its only drawback being that it does not support multi-threaded downloads. Download the latest version from http://curl.haxx.se/download/curl-7.14.0.tar.gz, then compile and install it with the following commands:

    #tar zxvf curl-7.14.0.tar.gz
    #cd curl-7.14.0/
    #./configure
    #make
    #make test
    #make install

Curl takes the following form: #curl [options] [download URL]

A typical Curl download looks like this: #curl -O http://10.1.27.10/~kennycx/tools/lumaqq_2004-linux_gtk2_x86_with_jre.tar.gz This downloads the file and saves it to the current directory. Although Curl does not support multi-threaded downloads, it can download several files at once, or just a part of a file. For example, #curl -r 0-199 http://www.netscape.com/ fetches the first 200 bytes of the page.

Using a proxy with Curl is also easy: #curl -x 10.1.27.10:1022 ftp://ftp.funet.fi/README downloads the document through a proxy server at address 10.1.27.10, port 1022. If the proxy server requires authentication, supply a valid account and password in place of user:passwd: #curl -U user:passwd -x 10.1.27.10:1022 ftp://ftp.funet.fi/README
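What `curl -r 0-199` asks the server for can be illustrated locally: the range 0-199 covers exactly the first 200 bytes of the resource. The sketch below mimics that slice on a throwaway local file (file names are illustrative):

```shell
#!/bin/sh
# Local illustration of what `curl -r 0-199 URL` retrieves:
# bytes 0 through 199 of a resource, i.e. its first 200 bytes.
printf 'A%.0s' $(seq 1 1000) > whole.bin   # 1000-byte stand-in for the remote file
head -c 200 whole.bin > slice.bin          # the same span the range 0-199 covers
size=$(wc -c < slice.bin)
rm -f whole.bin slice.bin
echo "$size"
```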

Axel

Axel is a command-line multi-threaded download tool that supports resuming interrupted downloads; its speed is often several times that of Wget. It can be downloaded from http://www.linuxfans.org/nuke/modules.php?name=Site_Downloads&op=mydown&did=1697. After downloading, compile and install it with the following commands:

    #tar zxvf axel-1.0a.tar.gz
    #cd axel-1.0a/
    #./configure
    #make
    #make install

The basic usage is: #axel [options] [download directory] [download address] A typical download looks like: #axel -n 10 -o /home/kennycx/ http://10.1.27.10/~kennycx/tools/lumaqq_2004-linux_gtk2_x86_with_jre.tar.gz This downloads the file with 10 threads and saves it to the /home/kennycx/ directory.

 


Origin blog.csdn.net/weixin_59539033/article/details/127566937