When developing the backend interface of a web page or application, you generally need to verify promptly that the interface can receive and return data correctly. For a single test, the Postman plug-in is a good HTTP request simulation tool.
However, Postman can only simulate a single request from a single client. For performance tests such as simulating concurrent access by multiple users, other tools are needed; the powerful JMeter automated testing tool is recommended here.
Apache JMeter is a Java-based stress-testing tool developed by the Apache Software Foundation. It was originally designed for testing web applications but has since expanded to other testing areas. It can be used to test both static and dynamic resources, such as static files, Java servlets, CGI scripts, Java objects, databases, and FTP servers. JMeter can simulate huge loads on a server, network, or object to test its robustness under different stress categories and analyze overall performance.
The following briefly introduces how to use JMeter for interface testing, taking the RESTful interface shown in the figure above as an example.
Get basic information about a specific gateway device.
Request
Method: GET
URI:/api/gateway/<gateway_id>
Parameters:
Add Userid and Token fields to the HTTP request header as user-authentication fields.
| Parameter | Required | Type | Description |
| --- | --- | --- | --- |
| gateway_id | true | int | Gateway ID, placed in the URL |
Response
Return value: gateway information in JSON format
Parameters:
| Parameter | Type | Description |
| --- | --- | --- |
| id | int | Gateway ID |
| name | string | Gateway name |
| mac | string | Gateway MAC address |
| fw_ver | string | Gateway firmware version number |
| sub_dev | unsigned int | Number of child devices |
| did | string | Gateway did |
| pscode | string | Gateway pscode |
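Before building the JMeter plan, the request shape above can be sanity-checked in code. Below is a minimal sketch using Python's standard library; the host, credentials, and sample response body are hypothetical placeholders, not values from the spec:

```python
import json
from urllib.request import Request

# Hypothetical values -- substitute your real host, gateway ID, and credentials.
HOST = "http://example.com/index.php"
GATEWAY_ID = 1

req = Request(f"{HOST}/api/gateway/{GATEWAY_ID}", method="GET")
req.add_header("Userid", "demo-user")   # authentication fields carried in the
req.add_header("Token", "demo-token")   # HTTP request header, per the spec above

print(req.full_url)            # http://example.com/index.php/api/gateway/1
print(req.get_header("Userid"))  # demo-user

# A response body in the documented shape would parse like this
# (the field values here are invented for illustration):
sample_body = ('{"id": 1, "name": "gw-1", "mac": "AA:BB:CC:DD:EE:FF", '
               '"fw_ver": "1.0.3", "sub_dev": 2, "did": "d-001", "pscode": "p-001"}')
gateway = json.loads(sample_body)
print(gateway["name"], gateway["sub_dev"])
```

Sending the request (e.g. with `urllib.request.urlopen`) should then return a JSON object with the fields listed in the table above.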
1. Start JMeter: run jmeter.bat (in the bin directory) as administrator to launch JMeter
2. Create a test plan:
When JMeter starts, it loads a test plan template by default. Save the test plan: change the name to Apitest, click Save, and choose a save path. Note that in the following steps, JMeter does not automatically save additions or changes to the .jmx file, so if you want to keep your test options after testing, save them manually from the "File" menu.
3. Add a thread group
Right-click the test plan "Apitest" node in the tree on the left, "Add" → "Threads" → "Thread Group"
After it is added, a "Thread Group" node appears under the "Apitest" node. You can rename the thread group as you like.
4. Add HTTP Request Defaults (used to configure common parameters; this is a configuration element, not an HTTP request):
Right-click the thread group, select "Add" → "Configuration Element" → "HTTP Request Defaults", click "HTTP Request Defaults"
After it is added, an "HTTP Request Defaults" node appears under the "Thread Group" node.
Here you can set the host address and other common parameters. For example, in our case every request path is prefixed with the host address plus index.php, which can be configured in "HTTP Request Defaults".
Fill in the name, server, and default request path, then save the test plan.
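The effect of HTTP Request Defaults can be pictured as a fallback: anything an individual sampler leaves blank is taken from the defaults. A rough sketch of that logic in Python, with hypothetical server and prefix values:

```python
# Hypothetical values standing in for what you would type into
# "HTTP Request Defaults" -- not taken from a real plan.
DEFAULTS = {"server": "example.com", "path_prefix": "/index.php"}

def effective_url(request_path, server=None):
    """Build the URL a sampler would hit, falling back to the defaults
    for any field the individual HTTP Request leaves blank."""
    host = server or DEFAULTS["server"]
    return "http://" + host + DEFAULTS["path_prefix"] + request_path

print(effective_url("/api/gateway/1"))
# -> http://example.com/index.php/api/gateway/1
print(effective_url("/api/gateway/1", server="other.host"))
# -> http://other.host/index.php/api/gateway/1
```

This is why the individual HTTP Request samplers added later only need to specify the path, not the full host address.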
5. Add an HTTP request header
This item is not required, but in our example Userid and Token are carried in the HTTP request header for user authentication.
Right-click "Apitest" and select "Add" → "Configuration Element" → "HTTP Header Manager"
6. Add an HTTP request
Right-click "Apitest" and select "Add" → "Sampler" → "HTTP Request"
After it is added, a new "HTTP Request" node appears, where you can fill in the specific request parameters.
Fill it in and save the test plan.
7. Add listeners:
Right-click the thread group and select "Add" → "Listener" → "XXXXXXXXX"
Many kinds of listeners can be added, and more than one can be used at a time. Here we add several commonly used ones: "Graph Results", "View Results Tree", and "Aggregate Report".
After they are added, the corresponding nodes appear under "Thread Group".
8. Run the test
Click the run button to execute.
You can then view the results in each listener.
Label : Each JMeter element (such as HTTP Request) has a Name attribute, and the value of the Name attribute is displayed here
#Samples : the number of requests made in this test. If you simulate 10 users and each user iterates 10 times, 100 is displayed here
Average : Average response time - by default, it is the average response time of a single Request. When the Transaction Controller is used, the average response time can also be displayed in units of Transactions
Median : the median response time; 50% of requests completed within this time
90% Line : the response time within which 90% of requests completed
Note: For the meaning of 50% and 90% concurrent users, please refer to the following
http://www.cnblogs.com/jackei/archive/2006/11/11/557972.html
Min : Minimum response time
Max : maximum response time
Error% : the number of failed requests in this test divided by the total number of requests
Throughput : Throughput - By default, it indicates the number of completed requests per second (Request per Second). When the Transaction Controller is used, it can also indicate the number of Transactions per Second similar to LoadRunner.
KB/Sec : The amount of data received from the server per second, equivalent to Throughput/Sec in LoadRunner
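To make the Aggregate Report columns concrete, the following sketch computes the same statistics from a handful of hypothetical per-request results. The times, success flags, byte counts, and test duration are invented for illustration, and the Median and 90% Line use a simplified nearest-rank percentile rather than JMeter's exact algorithm:

```python
# Hypothetical per-request results: (elapsed_ms, success, bytes_received)
results = [
    (100, True, 2048), (150, True, 2048), (120, True, 2048),
    (300, False, 512), (110, True, 2048), (130, True, 2048),
    (170, True, 2048), (140, True, 2048), (160, True, 2048),
    (125, True, 2048),
]

elapsed = sorted(r[0] for r in results)
n = len(elapsed)

average = sum(elapsed) / n                 # Average column
median = elapsed[n // 2]                   # Median: simplified 50th percentile
line_90 = elapsed[int(n * 0.9) - 1]        # 90% Line: simplified nearest-rank
error_pct = sum(1 for r in results if not r[1]) / n * 100  # Error%

test_duration_s = 2.0                      # assumed wall-clock duration
throughput = n / test_duration_s           # Throughput: requests per second
kb_per_sec = sum(r[2] for r in results) / 1024 / test_duration_s  # KB/Sec

print(f"avg={average} median={median} 90%={line_90} err={error_pct}% "
      f"tps={throughput} kb/s={kb_per_sec}")
```

With these invented numbers, the slow failed request pulls the average (150.5 ms) well above the median (140 ms), which is exactly why the report shows both.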
9. Modify parameters such as the number of threads in the thread group for stress testing
Click "Thread Groups" in the tree navigation on the left
Set the parameters above. The total number of simulated requests is: number of threads × loop count. Execute the test and check the results with the "Graph Results" listener.
Number of samples: The total number of requests sent to the server.
Latest Sample: The time the server took to respond to the most recent request.
Throughput: The number of requests the server processed per minute.
Average: The total runtime divided by the number of requests sent to the server.
Median: The response time below which half of the server responses fall and above which the other half fall.
Deviation: The variability of the server response times; a measure of dispersion, i.e., how spread out the data is.
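The Graph Results statistics can be reproduced from raw data. A small sketch with invented response times and an assumed run duration:

```python
import statistics

# Hypothetical response times (ms) collected from a test run,
# and an assumed total run time -- both invented for illustration.
samples_ms = [120, 135, 110, 160, 145, 130, 155, 125, 140, 150]
elapsed_s = 5.0

count = len(samples_ms)                      # number of samples
average = sum(samples_ms) / count            # average response time
median = statistics.median(samples_ms)       # half below, half above
deviation = statistics.pstdev(samples_ms)    # spread of response times
throughput_per_min = count / elapsed_s * 60  # requests handled per minute

print(count, average, median, round(deviation, 1), throughput_per_min)
```

A small deviation relative to the average means response times are stable; a large one means the server's behavior varies widely under load.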