JMeter Automated Testing Implementation

Foreword:

JMeter is currently one of the most popular testing tools. Based on it, we have built a complete automation solution, covering script creation and configuration, local configuration and execution, server-side configuration, and more, closing the loop on automated testing. We hope this fast and efficient approach addresses the common pain points of test automation. Enough preamble; let's get straight to practice:

1. Prepare automated test materials

    1. Development and execution tool: JMeter (download address: Apache JMeter - Download Apache JMeter)

    2. The development environment has been deployed and is ready;

    3. The test script is ready;

    4. The script runtime environment is ready (FAT or UAT);

2. Automated test passing standard

    1. Success: a status code of 200 is returned;

    2. Failure: 404, 500, etc. is returned;

    3. Each script additionally defines its own specific assertions;
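The baseline pass/fail rule above can be sketched in Python (a minimal illustration of the standard, not part of the JMeter setup itself):

```python
# Minimal sketch of the passing standard: 200 passes, 404/500/etc. fail.
def smoke_test_passed(status_code: int) -> bool:
    """Apply the passing standard to an HTTP status code."""
    return status_code == 200

# 200 is a pass; 404 and 500 are failures.
results = [smoke_test_passed(c) for c in (200, 404, 500)]
```

Script-specific assertions (point 3) are layered on top of this status check inside each .jmx file.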

3. Automated script storage

    The scripts are all stored in the GitLab repository (for the storage conventions of the scripts, see: GitLab Jmeter Test Package General Design Version 1.0).

Create a new folder in the root directory of the project; its name is the project name followed by the suffix "-test" (for example, a project named sso-support uses the folder sso-support-test).

4. Automated Test Script Rules

    1. Each script is named after the interface it tests

    2. Scripts are saved as files with the .jmx suffix

    3. The number of threads is set to 1 (a smoke test does not need multi-threaded concurrency)

    4. Each script must include an assertion, with the status check set to 200

5. Automated Test Script Steps

    1. Add a thread group, name the script after the interface, and save the file with the .jmx suffix

    2. Set the number of threads to 1 and leave the other settings at their defaults (as shown below)

    3. Set the loop count to 1 (no looping), leave the scheduler disabled, and leave the other parameters unset (as shown in the figure below)


    4. Configure the HTTP protocol options:

        a. Add an HTTP request: right-click the thread group, choose "Add" > "Sampler" > "HTTP Request":

        b. Set the request method to POST or GET (shown as POST here; set it according to the actual situation)

        c. Set the content encoding to UTF-8 (set it according to the actual situation)

        d. Fill in the request message in Body Data (set the content according to the actual situation)
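For reference, the sampler configured in steps b through d corresponds roughly to the following Python request (the URL and body are hypothetical placeholders; the request object is only built here, nothing is sent):

```python
import urllib.request

# Body Data, encoded as UTF-8 per the content-encoding setting above
body = '{"user": "demo"}'.encode("utf-8")  # hypothetical request message

req = urllib.request.Request(
    url="http://fat.example.com/api/login",  # hypothetical endpoint
    data=body,
    method="POST",  # request method, per step b
)
# urllib.request.urlopen(req) would actually send it.
```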


 5. Configure the assertion: right-click the thread group, choose "Add" > "Assertions", and select "Response Assertion" here. Note that other assertion types can be chosen according to the actual situation:

        a. Add a Response Assertion, as shown in the figure below;

        b. Set the field to test to the response code, as shown in the figure below;

        c. Set the matching rule to "Contains", as shown in the figure below;

        d. Enter 200 in the patterns to test, as shown in the figure below (the return code here is defined by the developers; set it according to the actual situation);
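For a plain pattern like 200, the "Contains" matching rule configured above amounts to a substring test on the response code; a minimal sketch:

```python
def contains_rule(field_value: str, pattern: str) -> bool:
    """JMeter-style 'Contains' rule: pass when the pattern occurs
    anywhere in the tested field (here, the response code)."""
    return pattern in field_value

ok = contains_rule("200", "200")   # a 200 response passes
bad = contains_rule("500", "200")  # a 500 response fails
```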


 6. Add an "HTTP Header Manager": right-click the test plan, choose "Add" > "Config Element" > "HTTP Header Manager", as shown in the figure below:

    Headers such as Content-Type can be added according to the actual situation; for example, here the value is set to application/json, as in the following example:
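The Header Manager applies one set of headers to every sampler in its scope; a rough Python analogue (endpoints hypothetical, requests built but not sent):

```python
import urllib.request

# Shared headers, as configured in the HTTP Header Manager
shared_headers = {"Content-Type": "application/json"}

reqs = [
    urllib.request.Request(
        "http://fat.example.com/api/login",  # hypothetical endpoint
        data=b"{}", method="POST", headers=dict(shared_headers)),
    urllib.request.Request(
        "http://fat.example.com/api/query",  # hypothetical endpoint
        data=b"{}", method="POST", headers=dict(shared_headers)),
]
# Every request carries the same Content-Type header.
```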

 

7. Add a "View Results Tree" listener: right-click the test plan, choose "Add" > "Listener", and select "View Results Tree":

Whether a run succeeds or fails, you can inspect the details in the results tree; view the response data to see the response message:

The figure below shows a run whose assertion failed; the assertion failure message is also described in detail:

 

8. Add a "View Results in Table" listener: right-click the test plan, choose "Add" > "Listener", and select "View Results in Table":

After it is added, each time the test plan runs you can view the results of all thread groups, as shown in the following figure:

 

9. Add an "Aggregate Report" listener: right-click the test plan, choose "Add" > "Listener", and select "Aggregate Report":

The Aggregate Report collects the key performance metrics of a test-plan run, such as Average and 90% Line. Once it is added, you can view these metrics after each run, as shown in the following example:
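What Average and 90% Line mean can be sketched over a hypothetical list of sample times in milliseconds (simple nearest-rank percentile; JMeter's exact computation may differ slightly):

```python
def average(samples):
    """Mean elapsed time: the Aggregate Report's Average column."""
    return sum(samples) / len(samples)

def pct_line(samples, pct):
    """N% Line: the elapsed time that pct% of samples fall at or below
    (nearest-rank rule)."""
    ordered = sorted(samples)
    idx = max(0, int(len(ordered) * pct / 100) - 1)
    return ordered[idx]

elapsed_ms = [120, 80, 200, 150, 90, 110, 130, 170, 100, 95]  # hypothetical
avg = average(elapsed_ms)       # 124.5
p90 = pct_line(elapsed_ms, 90)  # 170
```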

 

6. Local Execution

1. In the current test plan, add the automation scripts to be run this time one by one, following the example above. The result looks like this:

 

2. Run all the test cases in the current test plan, then view all the results in the table view:

 

3. Likewise, in the results tree you can view the detailed messages of all failed cases:
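Beyond the GUI steps above, the same plan can also be run headless with JMeter's non-GUI mode (jmeter -n), which is what a server-side run ultimately does. A sketch that only assembles the command line (file paths hypothetical):

```python
def jmeter_cmd(jmx_path, result_log, report_dir):
    """Build a non-GUI JMeter invocation: -n non-GUI mode, -t test plan,
    -l results log, -e/-o generate an HTML report into report_dir."""
    return ["jmeter", "-n", "-t", jmx_path,
            "-l", result_log, "-e", "-o", report_dir]

cmd = jmeter_cmd("login.jmx", "result.jtl", "html-report")
# subprocess.run(cmd) would execute it when JMeter is on the PATH.
```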

 

7. Server configuration and operation (Jenkins/TFS)

Since the lowest-level job scheduling is implemented by Jenkins, this section focuses on how to configure the job parameters and related settings; TFS only needs to trigger Jenkins directly through its API or command line.
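For reference, TFS (or any external scheduler) can trigger a Jenkins job through Jenkins's remote-build HTTP endpoint; a sketch that only builds the request (host, job name, and credentials are hypothetical, nothing is sent):

```python
import base64
import urllib.request

jenkins_url = "http://jenkins.example.com"  # hypothetical host
job = "JMETER-sso-support"                  # per the job naming convention
auth = base64.b64encode(b"user:api_token").decode()  # hypothetical credentials

req = urllib.request.Request(
    url=f"{jenkins_url}/job/{job}/build",  # Jenkins remote-build endpoint
    method="POST",
    headers={"Authorization": f"Basic {auth}"},
)
# urllib.request.urlopen(req) would queue the build.
```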

1. Job naming convention: jobs in Jenkins are named starting with "JMETER" followed by the service name, as shown in the figure below:

 

2. Add the build configuration. As shown in the figure, devops is a packaging script written in Python, /sso-support-test is the path to the .jmx script files, report_server is the mail service, and mail is the email address of the report recipient:

 

3. Build trigger configuration: as shown in the figure, the job is configured to poll the repository for updates every two minutes:

 

4. Source code management: select Git, configure the repository address, and select the branch (dev here; not mandatory)

 

8. Server-Generated Automation Reports

Automated reports are currently available in two forms: email and an HTML report.

1. Viewing the report by email: once the corresponding recipient's email address is configured on the server side, an email report is automatically sent to the designated recipients after each automation run, as shown in the following figure:

 

The mail report mainly includes the following:

Summary: a summary of all requests, including the total count, failure count, success rate, and the average, minimum, and maximum times.

Pages: details of each individual request:

Failure Detail: details of the errors:

 

2. Viewing the HTML report: the HTML content is currently essentially the same as the email report, so it is not repeated here; see the following figure:

 

Origin blog.csdn.net/dq565/article/details/132627462