Detailed explanation of Linux curl command

Command: curl

In Linux, curl is a command-line tool for transferring data to or from a server using URL syntax. It is a very powerful HTTP command-line tool that supports both uploading and downloading files; although it is a comprehensive transfer tool, it is traditionally referred to as a download tool.

Syntax: # curl [option] [url]

Common parameters:

-A/--user-agent <string> Set user agent to send to server
-b/--cookie <name=string/file> cookie string, or file to read cookies from
-c/--cookie-jar <file> Write cookies to this file after the operation
-C/--continue-at <offset> Resume the transfer at the given offset
-D/--dump-header <file> Write the received header information to this file
-e/--referer Set the source (Referer) URL
-f/--fail Fail silently on HTTP errors instead of outputting the error page
-o/--output Write output to this file
-O/--remote-name Write output to a local file named after the remote file
-r/--range <range> Retrieve byte range from HTTP/1.1 or FTP server
-s/--silent silent mode. don't output anything
-T/--upload-file <file> upload file
-u/--user <user[:password]> set server user and password
-w/--write-out [format] what to output after completion
-x/--proxy <host[:port]> use HTTP proxy on given port
-#/--progress-bar progress bar showing current delivery status
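To give a feel for how several of these options combine, here is a minimal sketch (the host is just the article's example and the output filename page.html is arbitrary): silence the progress output, send a custom User-Agent, save the body to a file, and print the status code afterwards.

# curl -s -A "Mozilla/5.0" -o page.html -w "%{http_code}\n" http://www.linux.com

Each of these options is explained individually in the examples that follow.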

Example:
1. Basic usage

# curl http://www.linux.com

After execution, the HTML of www.linux.com is displayed on the screen.
Ps: Since Linux servers often have no desktop installed, and therefore no browser, this method is frequently used to test whether a server can reach a website.

2. Save the visited webpage
2.1: Use the redirection function of linux to save

# curl http://www.linux.com >> linux.html

2.2: You can use curl's built-in option: -o (lowercase) to save web pages

$ curl -o linux.html http://www.linux.com

After the command completes, a progress display like the following is shown. If it reaches 100%, the page was saved successfully.

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 79684    0 79684    0     0  3437k      0 --:--:-- --:--:-- --:--:-- 7781k

2.3: You can use curl's built-in option: -O (uppercase)
to save a file from the web page. Note that the URL here must point to a specific file, otherwise nothing will be fetched.

# curl -O http://www.linux.com/hello.sh

3. Test the return value of the webpage

# curl -o /dev/null -s -w %{http_code} www.linux.com

Ps: This is very commonly used in scripts to test whether a website is up.
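A sketch of that pattern in a script, assuming we simply check for a 200 status (the URL and the check are illustrative only):

#!/bin/bash
# Grab only the HTTP status code; the body goes to /dev/null and -s hides progress
code=$(curl -o /dev/null -s -w %{http_code} http://www.linux.com)
if [ "$code" -eq 200 ]; then
    echo "website is up"
else
    echo "website returned status $code"
fi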

4. Specify the proxy server and its port
Often you need to go through a proxy server to access the Internet (for example, when a proxy is required to get online, or when your IP address has been blocked by a site because of your curl requests). Fortunately, curl supports setting a proxy with the built-in option -x.

# curl -x 192.168.100.100:1080 http://www.linux.com
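If the proxy itself requires authentication, the -U/--proxy-user option (listed with the other parameters below) can be combined with -x; a sketch with placeholder credentials:

# curl -U proxyuser:proxypasswd -x 192.168.100.100:1080 http://www.linux.com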

5. Cookies
Some websites use cookies to record session information. Browsers such as Chrome handle cookie information automatically, and curl can handle it just as easily by adding the relevant options.
5.1: Save the cookie information from the HTTP response. Built-in option: -c (lowercase)

# curl -c cookiec.txt  http://www.linux.com

After execution, the cookie information is stored in cookiec.txt

5.2: Save the header information in the http response. Built-in option: -D

# curl -D cookied.txt http://www.linux.com

After execution, the cookie information is stored in cookied.txt

Note: The cookie file produced by -c (lowercase) is different from the one produced by -D: -c writes cookies in curl's cookie-jar format, while -D dumps the raw response headers (which include the Set-Cookie lines).


5.3: Using cookies
Many websites check your cookie information to decide whether you are visiting them according to their rules, so we need to send the previously saved cookie information. Built-in option: -b

# curl -b cookiec.txt http://www.linux.com
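The -b and -c options can also be combined, so that one cookie file is read for the request and updated from the response, keeping a session alive across requests; a sketch (the /member path is hypothetical):

# curl -b cookiec.txt -c cookiec.txt http://www.linux.com/member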

6. Mimic browsers
Some websites require a specific browser, or even a specific version, to access them. curl's built-in option -A lets us specify the User-Agent with which to visit the website.

# curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" http://www.linux.com

This way, the server will think it is being accessed by IE 8.0.

7. Fake referer (anti-hotlinking)
Many servers check the HTTP Referer to control access. For example, if you first visit the home page and then open the mailbox page from it, the Referer of the mailbox request is the address of the home page you just visited.
curl's built-in option -e lets us set the Referer ourselves.

# curl -e "www.linux.com" http://mail.linux.com

This will make the server think you are clicking on a link from www.linux.com

8. Download files
8.1: Use curl to download files.
#Use built-in option: -o (lowercase)

# curl -o dodo1.jpg http://www.linux.com/dodo1.JPG

#Use built-in option: -O (uppercase)

# curl -O http://www.linux.com/dodo1.JPG

This will save the file locally with the name on the server

8.2: Cyclic download
Sometimes the images to download all share the same name except for a number at the end

# curl -O http://www.linux.com/dodo[1-5].JPG

This will save all dodo1, dodo2, dodo3, dodo4, and dodo5

8.3: Download Rename

# curl -O http://www.linux.com/{hello,bb}/dodo[1-5].JPG

Because the file names under both hello and bb are dodo1 through dodo5, the files downloaded second would overwrite those downloaded first, so the downloaded files need to be renamed.

# curl -o "#1_dodo#2.JPG" http://www.linux.com/{hello,bb}/dodo[1-5].JPG

Here #1 expands to the matched value of {hello,bb} and #2 to the matched number from [1-5] (the quotes keep the shell from treating # as a comment), so the file downloaded from hello/dodo1.JPG is saved as hello_dodo1.JPG, and so on for the others, which effectively prevents the files from being overwritten.

8.4: Download in chunks
Sometimes the file to download is quite large; in that case we can download it in pieces. Use built-in option: -r

# curl -r 0-100 -o dodo1_part1.JPG http://www.linux.com/dodo1.JPG
# curl -r 100-200 -o dodo1_part2.JPG http://www.linux.com/dodo1.JPG
# curl -r 200- -o dodo1_part3.JPG http://www.linux.com/dodo1.JPG
# cat dodo1_part* > dodo1.JPG

So you can view the content of dodo1.JPG

8.5: Download files through FTP
curl can download files over FTP, and it provides two syntaxes for supplying the username and password

# curl -O -u username:password ftp://www.linux.com/dodo1.JPG
# curl -O ftp://username:[email protected]/dodo1.JPG

8.6: Show download progress bar

# curl -# -O http://www.linux.com/dodo1.JPG

8.7: Do not display download progress information

# curl -s -O http://www.linux.com/dodo1.JPG

9. Resume from a breakpoint
In Windows, we can use software such as Thunder to resume interrupted downloads. curl can achieve the same effect with the built-in option -C.
If the connection drops while downloading dodo1.JPG, you can resume the download as follows (the standalone "-" tells curl to work out the resume offset automatically)

# curl -C - -O http://www.linux.com/dodo1.JPG

10. Uploading files
curl can not only download files but also upload them. This is done with the built-in option -T.

# curl -T dodo1.JPG -u username:password ftp://www.linux.com/img/

This uploads the file dodo1.JPG to the ftp server
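-T also works over HTTP, where curl issues a PUT request instead; a sketch, assuming the server exposes a hypothetical /upload/ path that accepts PUT:

# curl -T dodo1.JPG http://www.linux.com/upload/

Because the URL ends with a slash, curl appends the local file name to it, so the file is stored as /upload/dodo1.JPG.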

11. Display fetch errors

# curl -f http://www.linux.com/error
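With -f, curl exits with a non-zero status (22) on an HTTP error instead of printing the server's error page, which makes it easy to branch on in a script; a sketch (the /error path is simply the article's example):

# curl -f -s -o /dev/null http://www.linux.com/error || echo "fetch failed"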

Other parameters (reproduced here in translation):

-a/--append append to target file when uploading file
--anyauth can use "any" authentication method
--basic use HTTP basic authentication
-B/--use-ascii use ASCII text transfer
-d/--data <data> Send data with an HTTP POST (see the example after this list)
--data-ascii <data> Post the data as ASCII
--data-binary <data> Post the data as binary
--negotiate use HTTP Negotiate (SPNEGO) authentication
--digest use HTTP Digest authentication
--disable-eprt disable use of EPRT or LPRT
--disable-epsv disable use of EPSV
--egd-file <file> Set EGD socket path for random data (SSL)
--tcp-nodelay use TCP_NODELAY option
-E/--cert <cert[:passwd]> Client certificate file and password (SSL)
--cert-type <type> Certificate file type (DER/PEM/ENG) (SSL)
--key <key> Private key filename (SSL)
--key-type <type> Private key file type (DER/PEM/ENG) (SSL)
--pass <pass> private key password (SSL)
--engine <eng> Encryption engine to use (SSL). "--engine list" for list
--cacert <file> CA certificate (SSL)
--capath <directory> CA certificate directory (made using c_rehash) to verify the peer against (SSL)
--ciphers <list> SSL ciphers to use
--compressed request a compressed response (using deflate or gzip)
--connect-timeout <seconds> maximum time allowed for the connection to be established
--create-dirs create the necessary local directory hierarchy
--crlf convert LF to CRLF when uploading
--ftp-create-dirs Create remote directories if they don't exist
--ftp-method [multicwd/nocwd/singlecwd] Control the use of CWD
--ftp-pasv use PASV/EPSV instead of port
--ftp-skip-pasv-ip When using PASV, ignore the IP address
--ftp-ssl Attempt to use SSL/TLS for ftp data transfer
--ftp-ssl-reqd require SSL/TLS for ftp data transfer
-F/--form <name=content> Simulate an HTTP form submission
--form-string <name=string> Simulate an HTTP form submission (value used literally)
-g/--globoff disable use of {} and [] for url sequences and ranges
-G/--get send data as get
-h/--help help
-H/--header <line> Custom header information passed to the server
--ignore-content-length ignore the HTTP Content-Length header
-i/--include include protocol header information when outputting
-I/--head only show document information
-j/--junk-session-cookies ignore session cookies when reading cookies from a file
--interface <interface> use the specified network interface/address
--krb4 <level> use krb4 with specified security level
-k/--insecure allow SSL sites without certificates
-K/--config <file> Read configuration from the specified file
-l/--list-only list the file names in the ftp directory
--limit-rate <rate> set transfer speed
--local-port <num> Force a specific local port number
-m/--max-time <seconds> set maximum transfer time
--max-redirs <num> Set the maximum number of redirects to follow
--max-filesize <bytes> Set the maximum size of a file to download
-M/--manual show full manual
-n/--netrc Read username and password from netrc file
--netrc-optional use .netrc or URL to override -n
--ntlm use HTTP NTLM authentication
-N/--no-buffer disable buffered output
-p/--proxytunnel tunnel through the HTTP proxy (using CONNECT)
--proxy-anyauth choose any proxy authentication method
--proxy-basic Use basic authentication on proxy
--proxy-digest Use Digest authentication on the proxy
--proxy-ntlm use ntlm authentication on proxy
-P/--ftp-port <address> Use active mode (PORT) with the given address instead of PASV
-Q/--quote <cmd> Send command to server before file transfer
--random-file <file> File to read random data from (SSL)
-R/--remote-time keep remote file time when generating files locally
--retry <num> The number of times to retry when there is a problem with the transmission
--retry-delay <seconds> Set the retry interval when there is a problem with the transmission
--retry-max-time <seconds> Set the maximum retry time when there is a problem with the transmission
-S/--show-error show errors
--socks4 <host[:port]> proxy the given host and port with socks4
--socks5 <host[:port]> proxy the given host and port with socks5
-t/--telnet-option <OPT=val> Telnet option setting
--trace <file> Write a debug trace to the specified file
--trace-ascii <file> Like --trace but without hex output
--trace-time add timestamp when tracing/verbose output
--url <URL>                    Specify a URL to work with
-U/--proxy-user <user[:password]> Set proxy username and password
-V/--version display version information
-X/--request <command> Specify the request command (method) to use
-y/--speed-time <seconds> How long the transfer must stay below the speed limit before it is aborted. Default is 30
-Y/--speed-limit <speed> Abort the transfer if it is slower than this speed (bytes per second) for speed-time seconds
-z/--time-cond <time> Transfer only files modified later (or earlier) than the given time
-0/--http1.0 use HTTP 1.0
-1/--tlsv1 use TLSv1 (SSL)
-2/--sslv2 use SSLv2 (SSL)
-3/--sslv3 use SSLv3 (SSL)
--3p-quote                      like -Q for the source URL for 3rd party transfer
--3p-url use url for third-party delivery
--3p-user use username and password for third-party transfer
-4/--ipv4 use IPv4
-6/--ipv6 use IPv6
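As a brief sketch of two of the options listed above, -d sends data in an HTTP POST body and -F simulates a multipart form submission; the URLs, field names, and paths here are purely illustrative:

# curl -d "user=dodo&passwd=123456" http://www.linux.com/login
# curl -F "file=@dodo1.JPG" -F "desc=test upload" http://www.linux.com/upload

The first sends the data url-encoded in the request body; the second builds a multipart/form-data request with the local file dodo1.JPG attached.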
