Install wget on CentOS 7

1. The wget command in detail:

wget is a command-line tool for downloading files on Linux. It is open-source software, originally written for Linux by Hrvoje Niksic and later ported to many platforms, including Windows.

It is an essential tool for Linux users, especially network administrators, who often need to download software or restore backups from a remote server. With shared virtual hosting, the usual workaround is to download a file from the remote server to a local disk and then upload it to the host with an FTP client, which wastes time and effort. On a Linux VPS, wget downloads the file directly to the server, with no intermediate upload. The tool is small but complete: it supports resuming interrupted downloads, handles both FTP and HTTP, works through proxy servers, and is simple to configure. The examples below illustrate how to use it.

Install

yum install -y wget
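Once yum finishes, we can confirm that the binary is on the PATH and check its version (on CentOS 7 the packaged release is 1.14, matching the help output below):

```shell
[root@network test]# wget --version | head -n 1
GNU Wget 1.14 built on linux-gnu.
```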


2. View the help manual

wget --help


GNU Wget 1.14, a non-interactive network file downloader.
Usage: wget [options]... [URL]...

Arguments required for long options are also required for short options.

Startup:
-V, --version Display Wget version information and exit.
-h, --help Print this help.
-b, --background Go to the background after startup.
-e, --execute=COMMAND Run a ".wgetrc" style command.

Logging and input files:
-o, --output-file=FILE write logging information to FILE.
-a, --append-output=FILE Append information to FILE.
-d, --debug Print extensive debug information.
-q, --quiet Quiet mode (no information output).
-v, --verbose Verbose output (this is the default).
-nv, --no-verbose Turn off verbose output, but do not enter quiet mode.
--report-speed=TYPE Output bandwidth as TYPE. TYPE can be bits.
-i, --input-file=FILE Download URLs in local or external FILE.
-F, --force-html Treat input files as HTML files.
-B, --base=URL Resolve links in the HTML input file (specified by -i -F) relative to URL.
--config=FILE Specify config file to use.

Download:
-t, --tries=NUMBER Set the number of retries to NUMBER (0 means unlimited).
--retry-connrefused Retry even if the connection is refused.
-O, --output-document=FILE Write document to FILE.
-nc, --no-clobber Skip downloads that would write to existing files (overwriting them).
-c, --continue resume downloading files.
--progress=TYPE Select the progress bar type.
-N, --timestamping only get files newer than local files.
--no-use-server-timestamps Do not use timestamps on the server for local files.
-S, --server-response Print server response.
--spider does not download any files.
-T, --timeout=SECONDS Set all timeouts to SECONDS seconds.
--dns-timeout=SECS Set DNS lookup timeout to SECS seconds.
--connect-timeout=SECS Set connection timeout to SECS seconds.
--read-timeout=SECS Set read timeout to SECS seconds.
-w, --wait=SECONDS wait for SECONDS seconds.
--waitretry=SECONDS Wait 1…SECONDS seconds between retries to get files.
--random-wait When fetching multiple files, wait a random interval of 0.5*WAIT…1.5*WAIT seconds between downloads.
--no-proxy disables the use of proxies.
-Q, --quota=NUMBER Set fetch quota to NUMBER bytes.
--bind-address=ADDRESS bind to ADDRESS (hostname or IP) on localhost.
--limit-rate=RATE Limit download rate to RATE.
--no-dns-cache Turn off caching of DNS lookups.
--restrict-file-names=OS Restrict characters in filenames to those allowed by the OS.
--ignore-case Ignore case when matching files/directories.
-4, --inet4-only Only connect to IPv4 addresses.
-6, --inet6-only Only connect to IPv6 addresses.
--prefer-family=FAMILY First connect to the address of the specified protocol.
FAMILY is IPv6, IPv4 or none.
--user=USER Set username to USER for both ftp and http.
--password=PASS Set the password for both ftp and http to PASS.
--ask-password Prompt for a password.
--no-iri Turn off IRI support.
--local-encoding=ENC Use ENC as the local encoding for IRIs (Internationalized Resource Identifiers).
--remote-encoding=ENC Use ENC as the default remote encoding.
--unlink remove file before clobber.

Directories:
-nd, --no-directories Do not create directories.
-x, --force-directories Force creation of directories.
-nH, --no-host-directories Do not create host directories.
--protocol-directories Use protocol names in directories.
-P, --directory-prefix=PREFIX save file as PREFIX/...
--cut-dirs=NUMBER ignore NUMBER directory levels in remote directory.

HTTP options:
--http-user=USER Set the http user name to USER.
--http-password=PASS Set http password to PASS.
--no-cache Do not cache data on the server.
--default-page=NAME Change default page
(default page is usually "index.html").
-E, --adjust-extension Save HTML/CSS document with appropriate extension.
--ignore-length Ignore the 'Content-Length' field of the header.
--header=STRING Insert STRING in the header.
--max-redirect Maximum redirection allowed per page.
--proxy-user=USER Use USER as proxy username.
--proxy-password=PASS Use PASS as proxy password.
--referer=URL Include 'Referer: URL' in the HTTP request header.
--save-headers Save HTTP headers to a file.
-U, --user-agent=AGENT Identify as AGENT instead of Wget/VERSION.
--no-http-keep-alive disable HTTP keep-alive (permanent connection).
--no-cookies do not use cookies.
--load-cookies=FILE Load cookies from FILE before the session starts.
--save-cookies=FILE save cookies to FILE after session ends.
--keep-session-cookies load and save session (non-persistent) cookies.
--post-data=STRING Use POST method; send STRING as data.
--post-file=FILE Use POST method; send FILE content.
--content-disposition Allow Content-Disposition header when local filename is selected
(experimental).
--content-on-error output the received content on server errors.
--auth-no-challenge Send Basic HTTP authentication information
without first waiting for the server's challenge.

HTTPS (SSL/TLS) options:
--secure-protocol=PR choose secure protocol, one of auto, SSLv2,
SSLv3, TLSv1, TLSv1_1 and TLSv1_2.
--no-check-certificate Do not verify the server's certificate.
--certificate=FILE Client certificate file.
--certificate-type=TYPE client certificate type, PEM or DER.
--private-key=FILE private key file.
--private-key-type=TYPE private key file type, PEM or DER.
--ca-certificate=FILE File with set of CA certificates.
--ca-directory=DIR Directory to keep hash list of CA certificates.
--random-file=FILE File with random data for SSL PRNG generation.
--egd-file=FILE File to name EGD sockets with random data.

FTP options:
--ftp-user=USER Set the ftp user name to USER.
--ftp-password=PASS Set ftp password to PASS.
--no-remove-listing Do not remove '.listing' files.
--no-glob Do not use wildcard expansion in FTP filenames.
--no-passive-ftp Disable "passive" transfer mode.
--preserve-permissions preserve remote file permissions.
--retr-symlinks When recursing directories, get linked files (not directories).

WARC options:
--warc-file=FILENAME save request/response data to a .warc.gz file.
--warc-header=STRING insert STRING into the warcinfo record.
--warc-max-size=NUMBER set maximum size of WARC files to NUMBER.
--warc-cdx write CDX index files.
--warc-dedup=FILENAME do not store records listed in this CDX file.
--no-warc-compression do not compress WARC files with GZIP.
--no-warc-digests do not calculate SHA1 digests.
--no-warc-keep-log do not store the log file in a WARC record.
--warc-tempdir=DIRECTORY location for temporary files created by the
WARC writer.

Recursive download:
-r, --recursive specifies recursive download.
-l, --level=NUMBER Maximum recursion depth (inf or 0 means unlimited, that is, all downloads).
--delete-after Delete local files after download is complete.
-k, --convert-links Make links in downloaded HTML or CSS point to local files.
--backups=N before writing file X, rotate up to N backup files.
-K, --backup-converted Backup file X as X.orig before converting it.
-m, --mirror Shortcut for -N -r -l inf --no-remove-listing.
-p, --page-requisites Download all elements such as images used to display HTML pages.
--strict-comments Process HTML comments in strict mode (SGML).

Accept/reject recursively:
-A, --accept=LIST Comma-separated list of acceptable extensions.
-R, --reject=LIST Comma-separated list of extensions to reject.
--accept-regex=REGEX regex matching accepted URLs.
--reject-regex=REGEX regex matching rejected URLs.
--regex-type=TYPE regex type (posix|pcre).
-D, --domains=LIST Comma-separated list of accepted domains.
--exclude-domains=LIST Comma separated list of domains to exclude.
--follow-ftp Follow FTP links in HTML documents.
--follow-tags=LIST Comma-separated list of HTML tags to follow.
--ignore-tags=LIST Comma separated list of HTML tags to ignore.
-H, --span-hosts Span to external hosts when recursing.
-L, --relative Only follow relative links.
-I, --include-directories=LIST List of allowed directories.
--trust-server-names Use the name specified by the last component
of the redirection URL.
-X, --exclude-directories=LIST List of excluded directories.
-np, --no-parent Do not trace back to parent directories.
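Many of the options above combine naturally. As a sketch (the urls.txt file and its contents are hypothetical), -i reads URLs from a file, --limit-rate caps the bandwidth, -w pauses between files, and -t limits retries:

```shell
[root@network test]# cat urls.txt
http://cn.wordpress.org/wordpress-4.9.4-zh_CN.tar.gz
http://cn.wordpress.org/wordpress-4.9.3-zh_CN.tar.gz
[root@network test]# wget -i urls.txt --limit-rate=300k -w 2 -t 3
```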

3. Use wget to download a single file

The following example downloads a file from the network and saves it in the current directory.

During the download, wget displays a progress bar showing the completion percentage, bytes downloaded, current download speed, and estimated time remaining.

wget http://cn.wordpress.org/wordpress-4.9.4-zh_CN.tar.gz

4. Use wget -O to download and save with a different filename

[root@network test]# wget https://cn.wordpress.org/wordpress-4.9.4-zh_CN.tar.gz
[root@network test]# ls
wordpress-4.9.4-zh_CN.tar.gz

We can use the -O option to save the download under a different filename:

wget -O wordpress.tar.gz http://cn.wordpress.org/wordpress-4.9.4-zh_CN.tar.gz
wordpress.tar.gz

5. Use wget -c to resume an interrupted download

Use wget -c to resume downloading a file that was interrupted:

This is very helpful when a large download is cut off by network problems: we can resume it instead of downloading the whole file again.

wget -c https://cn.wordpress.org/wordpress-4.9.4-zh_CN.tar.gz
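On a very unreliable connection, -c combines well with the retry options from the help output: -t 0 retries without limit, and --retry-connrefused keeps trying even when the server refuses the connection.

```shell
wget -c -t 0 --retry-connrefused https://cn.wordpress.org/wordpress-4.9.4-zh_CN.tar.gz
```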

6. Use wget -b to download in the background

For very large files, we can use the -b option to download in the background.

[root@network test]# wget -b https://cn.wordpress.org/wordpress-4.9.4-zh_CN.tar.gz
Continuing in background, pid 1463.
Output will be written to 'wget-log'.
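Since a background download writes its messages to wget-log, we can follow its progress with tail, and stop it if needed by killing the PID that wget printed (1463 above):

```shell
[root@network test]# tail -f wget-log    # Ctrl-C stops tail, not the download
[root@network test]# kill 1463           # stops the background download itself
```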


Source: blog.csdn.net/u014212540/article/details/127887433