The Linux curl command

In Linux, curl is a command-line tool that transfers files using URL syntax; you could call it a very powerful command-line HTTP tool. It supports both uploading and downloading, making it an all-round transfer tool, although by convention it is usually thought of as a URL download tool.

Syntax: # curl [option] [url]

Common parameters:

-A/--user-agent <string>       Set the User-Agent string sent to the server
-b/--cookie <name=string/file> Cookie string, or file to read cookies from
-c/--cookie-jar <file>         Write cookies to this file after the operation
-C/--continue-at <offset>      Resume a transfer at the given offset
-D/--dump-header <file>        Write the received headers to this file
-e/--referer <URL>             Set the Referer header
-f/--fail                      Fail silently (no output) on HTTP errors
-o/--output <file>             Write output to this file instead of stdout
-O/--remote-name               Write output to a local file named like the remote file
-r/--range <range>             Retrieve only the given byte range from an HTTP/1.1 or FTP server
-s/--silent                    Silent mode: don't output anything
-T/--upload-file <file>        Upload this file
-u/--user <user[:password]>    Set server user and password
-w/--write-out <format>        What to output after completion
-x/--proxy <host[:port]>       Use the given HTTP proxy on the given port
-#/--progress-bar              Show transfer progress as a progress bar

Examples:
1. Basic usage

# curl http://www.linux.com

After it runs, the HTML of www.linux.com is printed to the screen.
PS: Linux servers are often installed without a desktop, which means no browser, so this method is frequently used to test whether a server can reach a given website.

2. Saving the visited page
2.1: Save it with shell redirection

# curl http://www.linux.com >> linux.html

2.2: Save the page with curl's built-in option -o (lowercase)

# curl -o linux.html http://www.linux.com

When the command completes, output like the following is displayed; 100% indicates the page was saved successfully:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 79684    0 79684    0     0  3437k      0 --:--:-- --:--:-- --:--:-- 7781k
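The effect of -o can be tried without a network by pointing curl at a local file:// URL; the /tmp paths below are made-up stand-ins for the example site:

```shell
# A local file:// URL stands in for http://www.linux.com so the
# example runs offline; the /tmp paths are arbitrary.
printf '<html>hello</html>' > /tmp/page.html

# Equivalent of: curl -o linux.html http://www.linux.com
curl -s -o /tmp/linux.html file:///tmp/page.html

cat /tmp/linux.html   # prints <html>hello</html>
```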

2.3: Save a file from a web page with curl's built-in option -O (uppercase).
Note that the URL here must point to a specific file, otherwise nothing is fetched.

# curl -O http://www.linux.com/hello.sh

3. Testing a page's return code

# curl -o /dev/null -s -w %{http_code} www.linux.com

PS: in scripts, this is a very common way to test whether a site is working.
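In a script, the idiom above is usually wrapped in a small function that branches on the status code; a minimal sketch, with the URL left as a placeholder:

```shell
#!/bin/sh
# Health-check sketch: fetch the URL, discard the body, keep only
# the HTTP status code, and branch on it.
check_site() {
    code=$(curl -o /dev/null -s -w '%{http_code}' "$1")
    if [ "$code" = "200" ]; then
        echo "up"
    else
        echo "down ($code)"
    fi
}

# e.g. check_site http://www.linux.com
```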

4. Specifying a proxy server and its port
You often need a proxy server to get online (for example when going through a corporate proxy, or when a site has blocked your IP address). Fortunately curl supports this with the built-in option -x.

# curl -x 192.168.100.100:1080 http://www.linux.com

5. Cookies
Some sites use cookies to record session information. A browser such as Chrome handles cookie information easily, but curl can handle cookies just as easily once you add the relevant options.
5.1: Save the cookie information from the HTTP response. Built-in option: -c (lowercase)

# curl -c cookiec.txt  http://www.linux.com

After the command runs, the cookie information is stored in cookiec.txt.

5.2: Save the header information from the HTTP response. Built-in option: -D

# curl -D cookied.txt http://www.linux.com

After the command runs, the header (and cookie) information is stored in cookied.txt.

Note: the cookies saved by -D are not in the same format as those generated by -c (lowercase).


5.3: Using cookies
Many sites monitor your cookie information to decide whether you are accessing them legitimately, so we need to send the saved cookie information. Built-in option: -b

# curl -b cookiec.txt http://www.linux.com

6. Imitating a browser
Some sites require a specific browser to access them, and some even require a specific version. curl's built-in option -A lets us specify the browser to present to the site.

# curl -A "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.0)" http://www.linux.com

The server will then believe you are visiting with IE 8.0.

7. Forging the Referer (anti-hotlinking)
Many servers check the HTTP Referer header to control access. For example: you first visit the home page and then visit the mailbox page from it; the Referer of the mailbox request is then the home-page address. If the server finds that the Referer of a mailbox request is not the home-page address, it concludes that the request is hotlinked.
curl's built-in option -e lets us set the Referer.

# curl -e "www.linux.com" http://mail.linux.com

This makes the server think you clicked through from a link on www.linux.com.

8. Downloading files
8.1: Download a single file with curl.
# Using the built-in option -o (lowercase):

# curl -o dodo1.jpg http://www.linux.com/dodo1.JPG

# Using the built-in option -O (uppercase):

# curl -O http://www.linux.com/dodo1.JPG

This saves the file locally under the name it has on the server.

8.2: Downloading in a loop
Sometimes the pictures you want to download share the same name apart from the trailing part.

# curl -O "http://www.linux.com/dodo[1-5].JPG"

This downloads dodo1 through dodo5 in one go.

8.3: Downloading with renaming

# curl -O "http://www.linux.com/{hello,bb}/dodo[1-5].JPG"

In both the hello and bb directories the files are named dodo1 through dodo5, so the second download would overwrite the first. The downloaded files therefore need to be renamed:

# curl -o "#1_#2.JPG" "http://www.linux.com/{hello,bb}/dodo[1-5].JPG"

This way hello/dodo1.JPG is saved as hello_dodo1.JPG, and so on for the other files, which effectively prevents them from being overwritten.
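The renaming variables can be tried offline too: curl's URL globbing and the #1/#2 substitutions are done client-side, so a file:// URL behaves the same way. The directories and file names below are made up:

```shell
# Set up two files with the same name in different directories.
mkdir -p /tmp/hello /tmp/bb
echo one > /tmp/hello/dodo1.JPG
echo two > /tmp/bb/dodo1.JPG

# #1 expands to the {hello,bb} match and #2 to the [1-1] match,
# so the two downloads no longer overwrite each other.
cd /tmp && curl -s -o '#1_#2.JPG' 'file:///tmp/{hello,bb}/dodo[1-1].JPG'

ls hello_1.JPG bb_1.JPG   # both renamed files now exist
```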

8.4: Downloading in segments
Sometimes the download is fairly large, and we can fetch it in segments. Use the built-in option -r:

# curl -r 0-100 -o dodo1_part1.JPG http://www.linux.com/dodo1.JPG
# curl -r 100-200 -o dodo1_part2.JPG http://www.linux.com/dodo1.JPG
# curl -r 200- -o dodo1_part3.JPG http://www.linux.com/dodo1.JPG
# cat dodo1_part* > dodo1.JPG

dodo1.JPG can then be viewed as one complete file.
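Byte ranges can be tried offline as well, since curl also honors -r for file:// URLs. The sample file below is made up, but the segments recombine the same way over HTTP:

```shell
# Build a small sample "download" and fetch it in two segments.
printf '0123456789' > /tmp/big.bin

curl -s -r 0-4 -o /tmp/part1 file:///tmp/big.bin   # bytes 0-4 (inclusive)
curl -s -r 5-  -o /tmp/part2 file:///tmp/big.bin   # byte 5 to the end
cat /tmp/part1 /tmp/part2 > /tmp/rejoined.bin

cmp /tmp/big.bin /tmp/rejoined.bin && echo "segments reassembled cleanly"
```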

8.5: Downloading files over FTP
curl can download over FTP and offers two syntaxes for it:

# curl -O -u username:password ftp://www.linux.com/dodo1.JPG
# curl -O ftp://username:password@www.linux.com/dodo1.JPG

8.6: Showing the download progress bar

# curl -# -O http://www.linux.com/dodo1.JPG

8.7: Hiding the download progress information

# curl -s -O http://www.linux.com/dodo1.JPG

9. Resuming interrupted downloads
On Windows we might use a download manager such as Thunder to resume downloads; curl's built-in option -C achieves the same effect.
If the download of dodo1.JPG is interrupted, it can be resumed like this:

# curl -C - -O http://www.linux.com/dodo1.JPG

10. Uploading files
curl can not only download files but also upload them. This is done with the built-in option -T:

# curl -T dodo1.JPG -u username:password ftp://www.linux.com/img/

This uploads dodo1.JPG to the FTP server.
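Without an FTP server at hand, -T can still be exercised: curl also accepts a file:// URL as an upload target, which copies the local file there. The paths below are made up:

```shell
# "Upload" a file to a file:// target instead of ftp:// so the
# example runs offline; -T works the same way in both cases.
printf 'image-bytes' > /tmp/dodo1.JPG

# Offline stand-in for: curl -T dodo1.JPG -u username:password ftp://www.linux.com/img/
curl -s -T /tmp/dodo1.JPG file:///tmp/uploaded.JPG

cmp /tmp/dodo1.JPG /tmp/uploaded.JPG && echo "upload matches source"
```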

11. Showing fetch errors
With the built-in option -f, curl exits with an error instead of printing the server's error page when the fetch fails:

# curl -f http://www.linux.com/error

Other options (translated; reproduced here):

-a/--append                    When uploading, append to the target file
--anyauth                      Use "any" authentication method
--basic                        Use HTTP Basic authentication
-B/--use-ascii                 Use ASCII/text transfer
-d/--data <data>               Send data with an HTTP POST
--data-ascii <data>            POST data as ASCII
--data-binary <data>           POST data as binary
--negotiate                    Use HTTP Negotiate authentication
--digest                       Use Digest authentication
--disable-eprt                 Disable EPRT or LPRT
--disable-epsv                 Disable EPSV
--egd-file <file>              EGD socket path for random data (SSL)
--tcp-nodelay                  Use the TCP_NODELAY option
-E/--cert <cert[:passwd]>      Client certificate file and password (SSL)
--cert-type <type>             Certificate file type (DER/PEM/ENG) (SSL)
--key <key>                    Private key file name (SSL)
--key-type <type>              Private key file type (DER/PEM/ENG) (SSL)
--pass <pass>                  Private key passphrase (SSL)
--engine <eng>                 Crypto engine to use (SSL); "--engine list" for a list
--cacert <file>                CA certificate to verify the peer against (SSL)
--capath <directory>           CA directory (made with c_rehash) to verify the peer against (SSL)
--ciphers <list>               SSL ciphers to use
--compressed                   Request a compressed response (using deflate or gzip)
--connect-timeout <seconds>    Maximum time allowed for the connection
--create-dirs                  Create the local directory hierarchy as needed
--crlf                         Convert LF to CRLF on upload
--ftp-create-dirs              Create remote directories if they do not exist
--ftp-method [multicwd/nocwd/singlecwd]  Control CWD usage
--ftp-pasv                     Use PASV/EPSV instead of PORT
--ftp-skip-pasv-ip             Ignore the IP address in the PASV response
--ftp-ssl                      Try SSL/TLS for the FTP transfer
--ftp-ssl-reqd                 Require SSL/TLS for the FTP transfer
-F/--form <name=content>       Submit HTTP form data (multipart POST)
--form-string <name=string>    Submit HTTP form data (literal string)
-g/--globoff                   Disable URL sequences and ranges using {} and []
-G/--get                       Send the data with an HTTP GET
-h/--help                      Help
-H/--header <line>             Custom header to pass to the server
--ignore-content-length        Ignore the HTTP Content-Length header
-i/--include                   Include protocol headers in the output
-I/--head                      Show document headers only
-j/--junk-session-cookies      Ignore session cookies when reading a cookie file
--interface <interface>        Use the specified network interface/address
--krb4 <level>                 Use krb4 with the specified security level
-k/--insecure                  Allow connections to SSL sites without valid certificates
-K/--config                    Read configuration from the specified file
-l/--list-only                 List only the file names of an FTP directory
--limit-rate <rate>            Limit the transfer speed
--local-port <num>             Force use of the given local port number
-m/--max-time <seconds>        Maximum time allowed for the transfer
--max-redirs <num>             Maximum number of redirects to follow
--max-filesize <bytes>         Maximum size of a file to download
-M/--manual                    Display the full manual
-n/--netrc                     Read user name and password from the .netrc file
--netrc-optional               Use .netrc or the URL; overrides -n
--ntlm                         Use HTTP NTLM authentication
-N/--no-buffer                 Disable output buffering
-p/--proxytunnel               Tunnel through the HTTP proxy
--proxy-anyauth                Pick any proxy authentication method
--proxy-basic                  Use Basic authentication on the proxy
--proxy-digest                 Use Digest authentication on the proxy
--proxy-ntlm                   Use NTLM authentication on the proxy
-P/--ftp-port <address>        Use PORT with the given address instead of PASV
-Q/--quote <cmd>               Send a command to the server before the transfer
--random-file <file>           File to read random data from (SSL)
-R/--remote-time               Give the local file the remote file's timestamp
--retry <num>                  Number of retries when the transfer has problems
--retry-delay <seconds>        Delay between retries when the transfer has problems
--retry-max-time <seconds>     Maximum time to keep retrying when the transfer has problems
-S/--show-error                Show errors
--socks4 <host[:port]>         Use a SOCKS4 proxy on the given host and port
--socks5 <host[:port]>         Use a SOCKS5 proxy on the given host and port
-t/--telnet-option <OPT=val>   Set a telnet option
--trace <file>                 Write a debug trace to the given file
--trace-ascii <file>           Like --trace but without hex output
--trace-time                   Add timestamps to trace/verbose output
--url <URL>                    URL to work with
-U/--proxy-user <user[:password]>  Set the proxy user and password
-V/--version                   Show version information
-X/--request <command>         Specify the request command to use
-y/--speed-time                Time over which the speed limit is measured before aborting; default 30
-Y/--speed-limit               Stop the transfer if slower than this for speed-time seconds
-z/--time-cond                 Transfer only if modified relative to the given time
-0/--http1.0                   Use HTTP 1.0
-1/--tlsv1                     Use TLSv1 (SSL)
-2/--sslv2                     Use SSLv2 (SSL)
-3/--sslv3                     Use SSLv3 (SSL)
--3p-quote                     Like -Q, for the source URL of a third-party transfer
--3p-url                       Source URL for a third-party transfer
--3p-user                      User and password for the source of a third-party transfer
-4/--ipv4                      Use IPv4
-6/--ipv6                      Use IPv6

Origin www.cnblogs.com/fusheng11711/p/10932411.html