Downloading files on the Linux command line

wget command

The wget command downloads files from a specified URL. wget is robust on unstable networks and adapts well even when bandwidth is very narrow: if a download fails because of a network problem, wget keeps retrying until the entire file has been fetched. If the server interrupts the transfer, wget reconnects and resumes from where it left off. This makes it well suited to downloading large files from slow or unreliable servers.

Syntax

wget [options] [parameters]

Options

-a <log file>: append messages about the run to the specified log file;
-A <suffixes>: comma-separated list of file extensions to accept for download;
-b: run wget in the background;
-B <URL>: set the base URL for relative links;
-c: resume an interrupted download;
-C <flag>: turn server-side caching on or off (on/off, default is on);
-d: debug output;
-D <domain list>: comma-separated list of domains to follow;
-e <command>: execute a command as if it were part of the ".wgetrc" file;
-h: display help information;
-i <file>: read the URLs to download from the specified file;
-I <directory list>: comma-separated list of directories to download;
-L: follow relative links only;
-r: download recursively;
-nc: do not overwrite an existing file of the same name;
-nv: show errors and basic progress only, without the full detailed output;
-q: quiet mode, no output;
-nH: do not create host-prefixed directories;
-v: show the detailed download process (the default);
-V: display version information;
--passive-ftp: use passive (PASV) mode when connecting to FTP servers;
--follow-ftp: follow FTP links found in HTML files.

Parameters

URL: the address to download from.

Examples

Use wget to download a single file

wget http://www.linuxde.net/testfile.zip

This example downloads a file from the network and saves it in the current directory. During the download wget shows a progress bar that includes the percentage completed, the bytes downloaded so far, the current download speed, and the estimated time remaining.

Download and save the file with a different name

wget -O wordpress.zip http://www.linuxde.net/download.aspx?id=1080

By default wget names the saved file after everything that follows the last / in the URL, which usually produces the wrong file name for dynamically generated links.

Wrong: the following command saves the file under the name download?id=1:

wget http://www.linuxde.net/download?id=1

Even if the downloaded file is a zip archive, it is still named download?id=1.
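The naming rule can be previewed with plain shell parameter expansion; this is only an illustration of how the default name is derived, not something wget itself runs:

```shell
# Everything after the last '/' in the URL, query string included,
# becomes wget's default output file name.
url='http://www.linuxde.net/download.aspx?id=1080'
echo "${url##*/}"   # prints: download.aspx?id=1080
```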

Correct: to solve this problem, use the -O parameter to specify a file name:

wget -O wordpress.zip http://www.linuxde.net/download.aspx?id=1080

Limit the download speed

wget --limit-rate=300k http://www.linuxde.net/testfile.zip

By default wget uses all available bandwidth. When you are downloading a large file but still need bandwidth for other transfers, limit the rate with --limit-rate.

Resume an interrupted download

wget -c http://www.linuxde.net/testfile.zip

wget -c resumes an interrupted download. This is very helpful when a large download is suddenly cut off by a network problem: you can continue the transfer instead of downloading the whole file again. Add the -c parameter whenever you need to resume.

Background downloading using wget

wget -b http://www.linuxde.net/testfile.zip

Continuing in background, pid 1840.
Output will be written to `wget-log'.

For very large files, use the -b parameter to download in the background. You can check the download progress with:

tail -f wget-log

Disguise the user agent

wget --user-agent="Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16" http://www.linuxde.net/testfile.zip

Some sites refuse download requests when they detect that the agent is not a browser. You can disguise it with the --user-agent parameter.

Test a download link

When you schedule a download, you should first test that the link is valid. Add the --spider parameter to check it:

wget --spider URL

If the link is valid, wget displays:

Spider mode enabled. Check if remote file exists.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.

This ensures that the download can run at the scheduled time. With a broken link you get the following error instead:

wget --spider url
Spider mode enabled. Check if remote file exists.
HTTP request sent, awaiting response... 404 Not Found
Remote file does not exist -- broken link!!!

You can use the --spider parameter in situations such as:

  • checking a link before a scheduled download
  • checking at intervals whether a site is available
  • checking a site's pages for dead links
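The scheduled-download case can be sketched as a small wrapper that starts the transfer only when the --spider check succeeds (the function name is illustrative, not part of wget):

```shell
# Run the --spider check first; download only if the link is alive.
check_and_download() {
  url="$1"
  if wget --spider -q "$url"; then
    wget -q "$url"
  else
    echo "broken link: $url" >&2
    return 1
  fi
}

# Example: check_and_download http://www.linuxde.net/testfile.zip
```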

Increase the number of retries

wget --tries=40 URL

Network problems or very large files can make a download fail even after retries. By default wget retries the connection 20 times; if necessary, use --tries to raise that limit.

Download multiple files

wget -i filelist.txt

First, save the download links in a file:

cat > filelist.txt
url1
url2
url3
url4

Then download them all by passing this file to the -i parameter.
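The list file is plain text with one URL per line; a minimal sketch (the file names are placeholders):

```shell
# Build filelist.txt with one download link per line.
printf '%s\n' \
  'http://www.linuxde.net/file1.zip' \
  'http://www.linuxde.net/file2.zip' \
  'http://www.linuxde.net/file3.zip' > filelist.txt

wc -l < filelist.txt    # 3 links in the list
# wget -i filelist.txt  # then fetch every URL it contains
```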

Mirror a website

wget --mirror -p --convert-links -P ./LOCAL URL

Download an entire website to a local directory.

  • --mirror: enable mirroring.
  • -p: download all the files needed to display the HTML pages properly.
  • --convert-links: after the download, convert the links for local viewing.
  • -P ./LOCAL: save all files and directories under the specified local directory.

Filter out a format when downloading

wget --reject=gif url

Use this command when you download a website but do not want its images.

To download information into the log file

wget -o download.log URL

If you want the download messages written to a log file instead of shown on the terminal, use the -o option.

Limit the total download size

wget -Q5m -i filelist.txt

Use this to stop downloading once the total exceeds 5 MB. Note: the quota has no effect on a single-file download; it only applies to recursive downloads and URL lists.

Download only a specified file format

wget -r -A.pdf url

You can use this in situations such as:

  • downloading all the images from a website;
  • downloading all the videos from a website;
  • downloading all the PDF files from a website.

FTP Download

wget ftp-url
wget --ftp-user=USERNAME --ftp-password=PASSWORD url

wget can download from FTP links as well.

Anonymous FTP download:

wget ftp-url

FTP download with username and password authentication:

wget --ftp-user=USERNAME --ftp-password=PASSWORD url

Origin www.cnblogs.com/wsy0202/p/12484496.html