http_load is a Linux-based performance testing tool. It runs multiple HTTP fetches in parallel to test the throughput and load capacity of a web server and the performance of web pages.
Advantages
1. Command-line based, simple and easy to use
2. Small and lightweight: less than 100 KB after decompression
3. Open source and free
Disadvantages
1. Suitable only for performance testing of web pages; it cannot exercise database access
2. Limited analysis of the test results
3. Runs only on Linux
Install
Enter the working directory: #cd /usr/local/
Download http_load: #wget http://www.acme.com/software/http_load/http_load-12mar2006.tar.gz
Unzip: #tar zxvf http_load-12mar2006.tar.gz
Enter the http_load directory: #cd http_load-12mar2006
Compile: #make
Install: #make install
If an error is reported: "cannot create regular file '/usr/local/man/man1': No such file or directory", run mkdir /usr/local/man and then make install again.
Parameters
-fetches (shorthand -f): the total number of fetches to perform
-rate (shorthand -r): the number of fetches started per second
-seconds (shorthand -s): the total duration of the test, in seconds
-parallel (shorthand -p): the number of concurrent fetches
urls is a text file listing the URLs to visit, one per line. It can be a single page.
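The urls file described above can be as simple as a few lines of plain text. A minimal sketch (the example.com URLs below are placeholders, not taken from this article):

```shell
# Build a minimal urls.txt for http_load; each line is one URL.
cat > urls.txt <<'EOF'
http://www.example.com/index.html
http://www.example.com/about.html
http://www.example.com/contact.html
EOF

# http_load picks URLs from this list at random during the run.
wc -l < urls.txt   # prints the number of URLs in the list (3 here)
```

A run such as http_load -parallel 5 -fetches 1000 urls.txt would then spread its 1000 fetches at random across these three pages.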
Application
Test a site's average number of fetches per second
http_load -parallel 5 -fetches 1000 urls.txt
This command uses 5 concurrent connections to fetch URLs chosen at random from the list in urls.txt, for a total of 1000 fetches. Result after running:
1000 fetches, 5 max parallel, 6e+06 bytes, in 58.1026 seconds
6000 mean bytes/connection
17.2109 fetches/sec, 103266 bytes/sec
msecs/connect: 0.403263 mean, 68.603 max, 0.194 min
msecs/first-response: 284.133 mean, 5410.13 max, 55.735 min
HTTP response codes:
code 200 — 1000
From the above results, the target website sustains only about 17 fetches per second, which is not particularly strong.
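The fetches/sec figure can be pulled out of the report with standard text tools. A sketch using awk, fed here with the sample report shown above (in practice you would pipe the live output of http_load into awk):

```shell
# Sample http_load report (copied from the run above).
report='1000 fetches, 5 max parallel, 6e+06 bytes, in 58.1026 seconds
6000 mean bytes/connection
17.2109 fetches/sec, 103266 bytes/sec
msecs/connect: 0.403263 mean, 68.603 max, 0.194 min'

# The fetches/sec line begins with the number we want; print its first field.
rate=$(printf '%s\n' "$report" | awk '/fetches\/sec/ {print $1}')
echo "$rate"   # 17.2109
```

As a sanity check, bytes/sec is roughly fetches/sec times mean bytes/connection: 17.2109 × 6000 ≈ 103,265, matching the reported 103266 bytes/sec up to rounding.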
Test whether the website can withstand the expected traffic pressure
http_load -rate 2 -seconds 300 urls.txt
This fetches the target URLs at a fixed rate of 2 per second for 300 seconds.
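Since -rate starts a fixed number of new fetches each second, the total number of requests in such a run is approximately the rate multiplied by the duration. A quick sanity check for the run above:

```shell
# Expected total fetches for: http_load -rate 2 -seconds 300 urls.txt
rate=2        # -rate 2: two new fetches started per second
seconds=300   # -seconds 300: run for five minutes
total=$((rate * seconds))
echo "$total"   # 600
```

So the 300-second run above issues about 600 requests in total.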
Note:
- urls.txt holds the list of URLs to visit, one per line
- Don't run this kind of test against a site that is already live; being overwhelmed is no fun