The standard process of JMeter performance testing

1. Evaluation of the necessity of performance testing

Common Key Evaluation Items

Regulators require performance reports

Systems involving property and life safety

Large-scale systems going into production for the first time

Core database, software and hardware upgrades

User count or business volume has grown by more than 30%

Evaluation weight for a single business in a single release

Whether the business occupies a core position on the platform

Whether the deployment method has been adjusted or optimized

Whether tuning with higher performance risk has been introduced

Whether there are business processes the customer requires to be tested

Whether multiple functional defects have been fixed and the process has undergone major changes

2. Performance test requirements analysis

Business level

Features heavily used by users

Businesses accounting for more than 80% of daily transaction volume

Businesses accounting for more than 80% of volume on special trading days or at peak times

Businesses whose core processes have undergone major adjustments

Project level

Businesses running on a performance-tuned architecture

Logically complex and critical businesses

Potentially resource-intensive businesses

Businesses with interface calls and heavy interaction with external systems

Businesses that call third-party components and have complex logic

Performance testing requirements review

Testability

A reasonably realistic test environment can be built

Consistency

User requirements, production requirements (realism), and operations requirements (planned future growth)

Correctness

3. Performance test case design

Test modeling

Example : Login business operation process (mind map)

Open the home page

Enter username and password to log in

Log out of the system

Scenario use case design

Classification

Single-business benchmark test: verifies that the system meets its design targets and user expectations

Single-business stress test: how long the system can keep serving under maximum load

Single-business load test: the maximum load the system can withstand

Comprehensive business stress test

Comprehensive business load test

Comprehensive business stability test: the system's ability to run stably for a long time under the core-business benchmark load

Thread count
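A common sizing heuristic for the thread count is Little's Law: concurrent users ≈ target throughput × (response time + think time). A minimal sketch, with illustrative numbers:

```python
import math

def required_threads(target_tps: float, avg_response_s: float,
                     think_time_s: float = 0.0) -> int:
    """Little's Law: concurrency = throughput x time per iteration."""
    return math.ceil(target_tps * (avg_response_s + think_time_s))

# e.g. 50 transactions/s, 0.8 s responses, 3 s think time
print(required_threads(50, 0.8, 3.0))  # -> 190 threads
```

The result is only a starting point; the actual thread count should be confirmed by ramping up load in the scenario itself.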

Scenario use case

Script use case design

4. Test data construction

Script development: create a user-registration script

Export the recorded script as a .jmx file

JMeter generates accounts iteratively

Import the CSV data into the ${username} variable

 

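One way to prepare the account data is a small script that writes a username/password CSV for JMeter's CSV Data Set Config to feed into ${username}. A sketch; the file name, username prefix, and password are placeholders:

```python
import csv

def write_user_csv(path: str, count: int, prefix: str = "perfuser") -> None:
    # Each row supplies one iteration's ${username}/${password} values
    # when read by a CSV Data Set Config with variable names "username,password".
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for i in range(1, count + 1):
            writer.writerow([f"{prefix}{i:04d}", "Test@1234"])  # placeholder password

write_user_csv("users.csv", 1000)
```

Point the CSV Data Set Config at the generated file and reference the columns in the registration request as ${username} and ${password}.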

5. Test script development

Script development: record login and purchase scripts

JMeter configuration

Add -> Timer -> Constant Timer: set the interval time

Add -> Assertion -> Response Assertion: Check for successful login


Add -> Listener -> View Results Tree/Aggregate Report

Use of Fiddler

If the recording did not capture the add-to-cart request, use Fiddler to capture the traffic and add the request manually


 

Add -> Sampler -> HTTP Request

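For reference, the request copied out of Fiddler into the HTTP Request sampler has roughly this shape. The endpoint, parameters, and cookie below are hypothetical; take the real values from your own capture:

```python
from urllib.request import Request
from urllib.parse import urlencode

# Hypothetical add-to-cart request; substitute the URL, form fields,
# and session cookie recorded in the Fiddler capture.
body = urlencode({"productId": "1001", "quantity": "1"}).encode()
req = Request(
    "http://shop.example.com/cart/add",
    data=body,
    headers={
        "Content-Type": "application/x-www-form-urlencoded",
        "Cookie": "JSESSIONID=<from-capture>",
    },
)
print(req.get_method(), req.full_url)  # POST http://shop.example.com/cart/add
```

In the sampler, the URL path goes in the Path field, the form fields in the Parameters table, and the cookie in an HTTP Header Manager (or an HTTP Cookie Manager).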

6. Scene design and implementation

Number of concurrent threads and scheduler configuration


For a recorded script, set the loop count in the Step 1 thread group to Forever

Monitoring results


Resource listener: jp@gc - PerfMon Metrics Collector

 

Download:

From https://jmeter-plugins.org/downloads/all/ , download plugins-manager.jar

Put the downloaded file in the apache-jmeter/lib/ext directory

Add plugin:

Select the PerfMon plugin in the Plugins Manager and restart JMeter


Add listener:

After the restart, add it via Add -> Listener -> jp@gc - PerfMon Metrics Collector

Add metrics such as CPU and memory, then save


 

7. Use case execution

Environment

Pay attention to client performance

Ideally the server should be dedicated exclusively to the test

Choose the timing carefully; run when the test/production environment has few users

Record server configuration

Test server configuration:

Application server-model-number-CPU-memory-IP

Database server-model-number-CPU-memory-IP

Test client configuration:

Client-model-number-CPU-memory-IP

Run the task

8. Result analysis

Response time

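The aggregate report's headline numbers (average, median, 90% line, max) can be reproduced from the raw elapsed times. A simple sketch; the nearest-rank percentile used here may differ slightly from JMeter's own calculation, and the sample values are made up:

```python
import statistics

def summarize(elapsed_ms):
    s = sorted(elapsed_ms)
    p90 = s[max(0, int(len(s) * 0.9) - 1)]  # nearest-rank 90th percentile
    return {"avg": statistics.mean(s), "median": statistics.median(s),
            "p90": p90, "max": s[-1]}

samples = [120, 135, 150, 180, 210, 250, 300, 420, 600, 950]  # made-up elapsed times (ms)
print(summarize(samples))  # avg 331.5, median 230.0, p90 600, max 950
```

Percentiles matter more than the average here: a long tail (like the 950 ms sample) barely moves the mean but shows up clearly in the 90% line.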

Apdex
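Apdex scores responses against a target threshold T: samples at or below T count as satisfied, samples at or below 4T as tolerating (weighted 1/2), the rest as frustrated. A minimal sketch; the 0.5 s threshold is illustrative:

```python
def apdex(elapsed_s, t=0.5):
    """Apdex = (satisfied + tolerating/2) / total, with threshold t seconds."""
    satisfied = sum(1 for x in elapsed_s if x <= t)
    tolerating = sum(1 for x in elapsed_s if t < x <= 4 * t)
    return (satisfied + tolerating / 2) / len(elapsed_s)

# 2 satisfied (<= 0.5 s), 2 tolerating (<= 2.0 s), 1 frustrated
print(apdex([0.2, 0.4, 0.6, 1.5, 3.0]))  # -> 0.6
```

A score near 1.0 means users are satisfied; scores below roughly 0.7 usually warrant investigation.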


Business success rate (see assertions)

In the test script, an assertion checks whether the words "login successful" appear after the user logs in, and an "Assertion Results" listener is added. If all assertions pass, the business success rate is 100%
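The same success rate can be computed offline from a CSV-format JMeter result file (.jtl), which records a "success" column of "true"/"false" per sample. A sketch over an inline three-sample excerpt (the sample rows are made up):

```python
import csv, io

# Rows in the shape of a CSV-format .jtl result file.
SAMPLE_JTL = """timeStamp,elapsed,label,success
1681000000000,120,Login,true
1681000001000,130,Login,true
1681000002000,900,Login,false
"""

def success_rate(jtl_text: str) -> float:
    rows = list(csv.DictReader(io.StringIO(jtl_text)))
    passed = sum(1 for r in rows if r["success"] == "true")
    return passed / len(rows) if rows else 0.0

print(f"{success_rate(SAMPLE_JTL):.1%}")  # -> 66.7%
```

For a real run, read the .jtl file from disk instead of the inline string; the column layout depends on the jmeter.save.saveservice settings in jmeter.properties.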


 

Concurrency

CPU and memory


Database


Statistics


 

9. Performance tuning

Symptoms of performance problems

Response time is smooth but long

Response times get progressively longer

Response time varies with load changes

Data accumulation leads to locking

Poor stability

When response time is long, the system grows slower and slower, and business errors occur, the usual causes are:

Insufficient physical memory resources

Memory leaks

Resource contention

Interaction with external systems

Failed business operations that restart frequently and never reach a terminal state

Unreasonable middleware configuration or database connection settings

Flawed process/thread design


Origin blog.csdn.net/kk_lzvvkpj/article/details/130138956