[Basics] Performance testing, from zero to real-world practice (hands-on tutorial, very practical)

1. Performance testing basics

What is performance testing, and what is its essence?

Simulate user requests at the protocol level (business simulation) to place a certain load on the server. Focus: time performance and space performance; it has nothing to do with the user interface.

Classification of Performance Tests

  • Performance testing (narrow sense)

The performance-testing method verifies whether the system meets production performance requirements by simulating combinations of business pressure and usage scenarios from the production environment. In layman's terms, it verifies the system's capability under specific operating conditions.
  • Load testing

Continuously apply pressure to the system under test until a performance indicator reaches its limit, for example the response time exceeds the predetermined target or some resource reaches saturation.
  • Stress testing (strength testing)

Tests the system's ability to handle sessions when resources such as CPU and memory are in a certain state of saturation, and whether the system produces errors.
  • Concurrency testing

Simulates concurrent user access to check for deadlocks or other performance problems when multiple users access the same application, the same module, or the same data records at the same time.
  • Configuration testing

Adjusts the software and hardware environment of the system under test to understand how the various differences affect system performance and to find the optimal allocation of system resources.
  • Reliability testing

Applies a certain business load to the system and lets it run for a period of time to check whether the system remains stable.

Common Performance Test Indicators

  • Number of users / concurrent users

The number of users sending requests to the server at the same time. 
This is different from the number of concurrent requests per second; always confirm whether the requirement means the number of concurrent users or the number of concurrent requests.
  • Throughput

Description: the number of client requests processed per unit of time; it directly reflects the processing capacity of the software system. 
Typically, throughput is measured in requests/sec or pages/sec.

Tips: 1. From a business point of view, throughput can also be measured in transactions/hour, transactions/day, or visits/day. 2. From a network point of view, it can also be measured in bytes/hour or bytes/day. 3. Transactions per second (TPS) and queries per second (QPS) are both forms of throughput; they describe the server's concrete processing capacity.

  • Concurrency

Description: the number of users used for concurrent testing.
Extension: 
Number of concurrent users: the number of users sending requests to the system at the same physical moment. 
Number of online users: the number of users accessing the system within a period of time; they do not necessarily submit requests at the same moment. 
Number of system users: the total number of users registered in the system.
  • Response time

Description: The time spent in the whole process from when the user initiates a request to when the client receives the result returned from the server.
  • Hits

Description: an important indicator for measuring the processing capacity of a web server.
Tips: 
1. Hits are not page views: a hit is a request sent to the web server for an element contained in the page (such as an image or a link). 
2. The hits-per-second indicator is usually used to measure the processing capacity of a web server. 
Note: 
Only web projects have this metric.
  • Resource utilization

Description: the usage of each resource in the system; utilization = resources used / total resources x 100%

Common resource-utilization guidelines: CPU no more than 80%, memory no more than 80%, disk no more than 90%, network no more than 80%. If resource utilization is too low, resources are being wasted.

  • Error rate

Description: the proportion of failed requests or transactions under load; error rate = number of failed requests / total number of requests x 100%

Tips: unless there are special requirements, 1. Different systems have different tolerances for the error rate, but it generally should not exceed 5 per 1,000 (some projects require 5 per 10,000, depending on the actual project). 2. For a system with good stability, errors should mainly be caused by timeouts, i.e. the error rate is essentially a timeout rate.

  • TPS (Transactions Per Second)

Description: the number of transactions processed per second (the number of client request transactions the system handles per unit of time). 
Formula: TPS = concurrency / average response time (see the sketch after this list of indicators)

Transaction: from the code point of view of the business, a general term for a business operation; it can be understood as one or more pieces of code. Tip: TPS is a form of throughput.

  • QPS (Queries Per Second)

Description: the number of queries processed per second (an important indicator of a web server's processing capacity).

Application: drive the server to process a specified number of requests per second (for example, push the server to 60 QPS and check whether its performance indicators remain normal).
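To make the relationship between these indicators concrete, here is a minimal Python sketch (the numbers are made-up examples, not from the article) that applies the TPS formula above and converts a business-level daily volume into a per-second rate.

```python
# Minimal sketch of the throughput formulas above (example numbers are hypothetical).

def tps(concurrency: int, avg_response_time_s: float) -> float:
    """TPS = concurrency / average response time."""
    return concurrency / avg_response_time_s

# 100 concurrent users with an average response time of 0.5 s -> 200 transactions/s
print(tps(100, 0.5))                       # 200.0

# Business-level throughput: 1,000,000 transactions/day expressed per second
transactions_per_day = 1_000_000
print(transactions_per_day / (24 * 3600))  # ~11.57 transactions/s
```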

2. Performance testing process

flow chart

Requirements analysis

  • Test object

  • Commonly used

    • Core, important

    • Large data volume, high concurrency

    • example:

      Register, log in, search, add to cart, place an order, pay

  • Determine performance indicators

    Example 1: The requirement is a transaction volume of 200 million per day. What is the maximum number of transactions per second? 
    Average order value: 200-500, taken as 300. 
    Apply the 80/20 rule over a 24-hour day. 
    80/20 rule: 80% of user requests are concentrated in 20% of the time (or on 20% of the hot data). 
    Calculation: (200,000,000 / 300 x 0.8) / (24 x 0.2 x 3600 s) ≈ 30.86 transactions/s
    
    Example 2: The system has 5 million users visiting per day, within an 8-hour business day. 
    1. Simple average: 5,000,000 / (8 x 3600) ≈ 174 requests/s; generally not used, unless the system load is fairly stable/even. 
    2. Otherwise, first analyze the traffic distribution, then estimate with the 80/20 rule: 
    80% of the users: 5,000,000 x 0.8 = 4,000,000 
    20% of the time: 8 x 0.2 = 1.6 h 
    So the server needs to support about 694 requests/s ---> 5,000,000 x 0.8 / (8 x 0.2 x 3600 s) 
    Some teams also estimate peak load as the average hourly load x 4 (a rough estimate; this calculation is not recommended). 
    (A small Python sketch reproducing these calculations follows this list.)
    
    
    • Throughput / TPS: the number of requests the server processes per second

    • Response time

      The time taken from when the browser sends the request, through server processing, until the response is received

    • Number of users

    • Resource utilization

    • example:

  • Testing scenarios

    • Single scenarios

      Log in

      Register

      Search

      Add to cart

      Place an order and pay

    • Mixed scenarios

      User usage scenarios

      System usage scenarios
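To make the two estimates in the requirements-analysis examples easy to re-check, here is a small Python sketch that reproduces the 80/20-rule calculations; the business figures are the ones assumed in the examples above.

```python
# Peak-load estimates using the 80/20 rule, reproducing the two examples above.

def peak_rate(total_volume: float, hours: float,
              hot_share: float = 0.8, hot_time: float = 0.2) -> float:
    """80% of the volume is assumed to arrive in 20% of the time window."""
    return (total_volume * hot_share) / (hours * hot_time * 3600)

# Example 1: 200,000,000 per day in transaction volume, average order value 300, 24-hour day
transactions_per_day = 200_000_000 / 300
print(round(peak_rate(transactions_per_day, 24), 2))  # ~30.86 transactions/s

# Example 2: 5,000,000 user visits within an 8-hour day
print(round(peak_rate(5_000_000, 8), 1))              # ~694.4 requests/s
```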

Test Plan

  • Test objectives

  • Tester organization

  • Load-test schedule

  • Load generators (pressure machines)

    • Configuration

    • Requirements

    • Quantity

  • Risks

Test scheme

  • test tools

    LoadRunner

    JMeter

  • test environment

    database

    server

    architecture design

    Try to be consistent with the production environment if possible

  • Test strategy: single scenarios and mixed scenarios

  • Monitoring tools

    • Linux: nmon, rpc, jvisualvm, Spotlight

    • Windows: Spotlight, perfmon.exe

Test case design

  • Test scripts: script-based test cases

  • Scenario design: scenario-based test cases

Test execution

  • Script writing

  • Scenario design, monitoring design, business design

    • Scenario construction

      Explanation: an important principle of test-scenario design is to build the scenarios from the test cases.

      Tips: 
      1. The number of virtual users and the way the virtual users are started 
      2. Scenario-related settings (e.g. rendezvous points) 
      3. Whether the scripts depend on each other (e.g. login depends on registration)
      
  • Run the scenario

    Description: running the script means running the scenario

    Common pitfalls: 
    1. The load-generator machine cannot run the configured number of virtual users 
    2. There is no "warm-up" process 
    3. The real user environment is not simulated 
    4. The number of performance test cases is too small
    
  • Monitor the scenario

  • Test report

Problem localization and analysis

  • Back end

    • Code

    • Software (services): database, application server

    • Hardware

  • Front end

  • Network

Order for locating problems during testing: hardware problems ---> network problems ---> application server and database server configuration problems ---> source code and database scripts ---> system architecture problems

Performance tuning

After comparing the test results, performance testers find the bottleneck of system performance.

Tips: 
1. Tuning personnel: developers, database administrators, system administrators, and network administrators work together with performance test analysts to tune the performance problems. 
2. Verification: performance-tuning verification usually takes many rounds; each round of regression requires a comprehensive comparison of all the test indicators.

Order of system tuning, from easy to difficult:

  • Hardware problems

  • Network problems

  • Application server and database server configuration problems

  • Source code and database scripts

  • System architecture problems

Test report

  1. Review of the overall performance testing phase (covering requirements, the progress and deliverables of the different phases, and the analysis of the performance test results) ---> technical perspective

  2. Risk management in the overall performance testing phase ---> management perspective

  3. Summary of project performance test results (whether passed or not, experience, lessons learned)

3. Tool introduction and selection

LoadRunner

  1. An industrial-grade performance testing tool that can support very large numbers of users and provides detailed reports as the basis for test analysis

  2. Supports many protocols

  3. Written in C

Advantages

1. Supports very large numbers of users (tens of thousands) 
2. Provides accurate reports 
3. Supports IP spoofing

Disadvantages

1. Commercial and paid 
2. Heavyweight 
3. Cannot be customized

JMeter

JMeter is a performance testing tool developed by the Apache Software Foundation, based on Java. It supports multiple protocols (HTTP/HTTPS, JDBC, Java, etc.).

Advantages

1. Open source and free 
2. Lightweight 
3. Rich ecosystem of third-party plugins

Disadvantages

1. Does not support IP spoofing 
2. Its reports are less precise than LoadRunner's

How to choose between LoadRunner and Jmeter?

  1. Prefer JMeter

  2. If JMeter can solve the problem, use JMeter; use LoadRunner only when JMeter cannot

4. Using the JMeter tool

File directory introduction

1.1 bin directory

Store executable and configuration files

jmeter.bat: Windows startup script 
jmeter.sh: Linux startup script 
jmeter.log: log file 
jmeter.properties: system configuration file 
jmeter-server.bat: server configuration for distributed testing on Windows 
jmeter-server: server configuration for distributed testing on Linux

1.2 docs directory

docs: the JMeter API documentation, which can be viewed by opening the api/index.html page

1.3 printable_docs directory

The content under the usermanual subdirectory of printable_docs is the user manual document of Jmeter. 
component_reference.html under usermanual is the most commonly used core component help document.

Tip: There are some commonly used Jmeter script cases in the demos subdirectory of printable_docs, you can refer to them.

1.4 lib directory

This directory is used to store the jar packages that Jmeter depends on and the jar packages that user extensions depend on.

basic configuration

Chinese settings

  • Temporary modification:

    options--->language--->choose language--->Chinese

  • Permanent modification:

    1. Open jmeter.properties

    2. Modify language=zh_CN

    3. restart jmeter

theme modification

Options ---> Themes ---> Select the corresponding theme, restart jmeter

basic operation

  1. Start JMeter

  2. Add a thread group

  3. Add an HTTP Request sampler and configure it

  4. Add a View Results Tree listener

  5. Click "Start" to run JMeter and view the results

basic components

Thread group: simulates users. 
Configuration element: initializes the test environment and test data ---> like the setup step in an automation script. 
Preprocessor: pre-processes the request to be sent ---> like parameterization in an automation script. 
Sampler: sends requests to the server ---> like the code that sends the request in an automation script. 
Post-processor: extracts data from the response returned by the server ---> like the statement that fetches a specific field from the response in an automation script. 
Assertion: compares the received response with the expected result ---> like assertions in an automation script. 
Listener: views the results and logs after the test script runs ---> like the test report in an automation script. 
Timer: waits for a period of time ---> like sleep in an automation script. 
Test fragment: encapsulates basic functionality; it is not executed on its own and must be called from a script ---> like an encapsulated function in an automation script.

scope

Core: Determined according to the parent-child nodes of the tree structure in the test plan

in principle:

  • Samplers are not scoped.

  • Logic controller: only valid for all components under its child nodes.

  • other components.

    • If its parent node is a sampler, it is only valid for the parent node sampler.

    • If its parent node is not a sampler, it is valid for all child nodes under the parent node and the child nodes of the node.

Execution order of components

Order: configuration element ---> pre-processor ---> timer ---> sampler ---> post-processor ---> assertion ---> listener

Notice:

  • Configuration components, pre-processors, and post-processors all need to rely on samplers to run

  • Within the same scope, elements of the same type execute in order from top to bottom

Three important components of JMeter

thread group

Function: Simulate users by configuring the number of threads in the thread group. The number of threads is the number of users, and the thread group is the user group

Features:

  • simulate multiple users

  • Samplers and logic controllers must be used under a thread group

  • Multiple thread groups can be added under a test plan, which can be executed in parallel or serially

    • Parallel: Thread groups are executed in parallel by default

    • Serial: check "Run Thread Groups consecutively (i.e. one at a time)" on the test plan

Classification of thread groups:

  • setUp thread group: performs pre-test setup operations and is executed first among all thread groups

  • Ordinary thread group: executes the business test scripts

  • tearDown thread group: performs post-test cleanup operations (data cleanup, environment restoration) and is executed last among all thread groups

Properties of the thread group

Number of threads: number of simulated virtual users

Ramp-up time: the time over which all the virtual users are started

Loop count:

  • Configure a specific number: controls how many times the script is executed

  • Configure "forever"

    • Must be used together with the scheduler

    • Duration: how long the script runs

    • Startup delay: the script waits this long before it starts running
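The ramp-up idea can be illustrated outside JMeter with a small Python sketch: the configured number of worker threads is started spread evenly over the ramp-up period instead of all at once. The virtual_user function here is just a placeholder, not JMeter's own behavior.

```python
# Conceptual sketch of "number of threads + ramp-up time" (not JMeter itself).
import threading
import time

def virtual_user(user_id: int) -> None:
    # Placeholder for the real sampler logic (e.g. sending an HTTP request).
    print(f"user {user_id} started at {time.strftime('%X')}")

def run_thread_group(num_threads: int, ramp_up_s: float) -> None:
    interval = ramp_up_s / num_threads            # spacing between thread starts
    threads = []
    for i in range(num_threads):
        t = threading.Thread(target=virtual_user, args=(i,))
        t.start()
        threads.append(t)
        time.sleep(interval)                      # spread the starts over the ramp-up window
    for t in threads:
        t.join()

# 10 virtual users started evenly over 5 seconds
run_thread_group(num_threads=10, ramp_up_s=5.0)
```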

http request

Protocol: HTTP or HTTPS; defaults to HTTP if left blank

Server name or IP: e.g. baidu.com (without the protocol prefix)

Port: defaults to 80 if left blank

Method: any request method supported by the HTTP protocol

Path: the resource path plus parameters

Content encoding: the default is the ISO standard (ISO-8859-1); UTF-8 is recommended
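For readers who want to see the same request outside the JMeter GUI, here is a minimal Python sketch using the third-party requests library; the host, path, and query parameter are hypothetical examples that simply map onto the sampler fields above (protocol, server name, port, method, path, content encoding).

```python
# Rough equivalent of an HTTP Request sampler (URL and parameters are hypothetical).
import requests

response = requests.get(
    "http://baidu.com:80/s",          # protocol + server name + port + path
    params={"wd": "jmeter"},          # query parameters
    timeout=10,
)
response.encoding = "utf-8"           # content encoding, as recommended above
print(response.status_code)
print(response.elapsed.total_seconds(), "s")   # response time of this request
```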

view result tree

Sampler result: statistical information about the request

Request: details of the request header and request body of the HTTP request

Response: details of the response header and response body of the HTTP response

When garbled characters appear in the jmeter response:

  1. Modify the jmeter.properties file, sampleresult.default.encoding=utf-8

  2. restart jmeter

Common JMeter parameterization methods

User-defined variables

  • Method 1:

Add: Thread Group ---> Config Element ---> User Defined Variables 
Configure: parameter name + parameter value 
Use: reference the defined variable in the HTTP Request sampler with ${parameter name}
  • Method 2:

Configure: define the user-defined variables on the Test Plan 
Use: reference the defined variable in the HTTP Request sampler with ${parameter name} 
Application scenario: when the parameter values used in many scripts need to be modified, it is more convenient to change the value once in the user-defined variables

user parameters

Add: Thread Group ---> Preprocessor ---> User Parameters. Configuration:

  • Parameters: add variables

  • Parameter values: add users ---> configure different parameter values for each user

Use: reference the defined variable in the HTTP Request sampler with ${parameter name}

Application scenario: different users can be given different parameter values

CSV Data Set Config

Add: thread group ---> configuration element ---> CSV data file settings

Write CSV data file (.csv as suffix):

  • Multiple parameters are written as multiple columns, separated by commas

  • For multiple sets of parameter values, use multiple lines to set

Configuration:

  • path

  • File encoding: UTF-8

  • Variable Name: The data read from the CSV data file needs to save the variable name. Separate multiple variables with commas

  • Ignore first line: whether to skip the first line of the CSV file (set to true when the file has a header row)

  • Delimiter: must match the delimiter used between columns in the CSV data file

  • Recycle on EOF: whether to loop back to the start when the end of the file is reached; default TRUE

  • Stop thread on EOF: only takes effect when "Recycle on EOF" is FALSE; generally set to TRUE
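Conceptually, the CSV Data Set Config behaves like the following Python sketch: each iteration takes the next row from the file and exposes its columns under the configured variable names. The users.csv file and its username/password columns are hypothetical.

```python
# Conceptual sketch of CSV-driven parameterization (file name and columns are hypothetical).
import csv
import itertools

with open("users.csv", encoding="utf-8", newline="") as f:
    rows = list(csv.reader(f, delimiter=","))   # delimiter must match the file

rows = rows[1:]                                  # "ignore first line" (header row)
recycled = itertools.cycle(rows)                 # "recycle on EOF" = TRUE

for _ in range(5):                               # five iterations / virtual users
    username, password = next(recycled)          # variable names: username, password
    print(f"would log in as {username!r}")
```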

Functions

counter:

  • TRUE: each user uses a separate counter

  • FALSE: all users use global counters

Reference: use ${__counter(FALSE,)} in the sampler to reference the corresponding value

It is recommended that you use the function method

Jmeter assertion

Function: when the script runs automatically and the execution result can be judged automatically, add an assertion to check whether the result meets expectations

response assertion

Add: Thread Group--->HTTP Request--->Assertion--->Response Assertion

Configuration:

  • Field to test: the field that needs to be checked

  • Pattern matching rules: the rule used for the check

  • Patterns to test: the value that needs to be verified

Json assertion

Applies to returned HTTP responses in JSON format

Add: Thread Group--->HTTP Request--->Assertion--->JSON Assertion

Configuration:

  • JSON PATH:$.weatherinfo.city

  • Check "Addltonal assert value"

  • Fill in the expected value in expected value

Duration assertion:

Used in performance testing to check whether the response time of an HTTP request exceeds the expected value

Add: Thread Group ---> HTTP Request ---> Assertions ---> Duration Assertion

Configuration: expected duration (in milliseconds)
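As a rough illustration of what the JSON assertion and duration assertion check, here is a Python sketch against a hypothetical weather endpoint; the URL, the expected city, and the 2-second limit are all assumptions for the example, with the JSON path $.weatherinfo.city taken from the configuration above.

```python
# Conceptual sketch of a JSON assertion plus a duration assertion (endpoint is hypothetical).
import requests

response = requests.get("http://example.com/weather_mini", timeout=10)
data = response.json()

# JSON assertion: the value at $.weatherinfo.city should equal the expected value
assert data["weatherinfo"]["city"] == "Beijing", "unexpected city in response"

# Duration assertion: the response time must not exceed the expected duration (2 s here)
assert response.elapsed.total_seconds() <= 2.0, "response took too long"
print("both assertions passed")
```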

Jmeter associations (extractors, databases, logic controllers, etc.)

When there is a dependency between multiple requests, and the parameters of the latter request need to use the response data of the previous request, association is required.

Classification:

  • Regular Expression Extractor

  • xpath extractor

  • Json extractor

extractor

regular extractor

Add: thread group--->HTTP request--->post processor--->regular expression extractor

Configuration:

  • Response field to check: defaults to the body

  • Reference name: the variable name in which to store the matched data

  • Regular expression: <p>(.*?)</p>; the data inside "()" is what will be saved

  • Template: $1$

    • The 1 refers to the first "()" group in the regular expression above

  • Matching numbers: 0 for a random value, 1 for the first result, -1 for all results

  • Default value: store the value in a variable when there is no match
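The extractor's behavior corresponds roughly to this Python sketch: apply the pattern to the response body, pick a match according to the match number, and fall back to the default value when nothing matches. The sample body is made up.

```python
# Conceptual sketch of the regular expression extractor (sample body is made up).
import random
import re

body = "<html><p>first</p><p>second</p></html>"   # pretend this is the response body
matches = re.findall(r"<p>(.*?)</p>", body)        # "()" marks the data to save

def extract(matches, match_no, default="NOT_FOUND"):
    if not matches:
        return default                # default value when there is no match
    if match_no == -1:
        return matches                # -1: all results
    if match_no == 0:
        return random.choice(matches) # 0: a random result
    return matches[match_no - 1]      # n: the n-th result

print(extract(matches, 1))    # 'first'
print(extract(matches, -1))   # ['first', 'second']
```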

xpath extractor

Add: thread group--->HTTP request--->post processor--->xpath extractor

Configuration:

  • Reference name: the variable name to store the matched data

  • xpath path: xpath matching rules

  • Matching numbers: 0 for a random value, 1 for the first result, -1 for all results

  • Default value: store the value in a variable when there is no match

json extractor

Add: thread group--->HTTP request--->post processor--->json extractor

Configuration:

  • Reference name: the variable name to store the matched data

  • json path: json path. $.weatherinfo.city

Reference: Just refer to the variable name directly

database

Connection preparation:

  • Open the database and determine the tables and corresponding fields of the database

  • Load the jdbc driver of mysql

    • Method 1: Add the jdbc driver through the test plan and browse

    • Method 2: Put the jdbc driver jar package into the lib\ext directory, and restart jmeter

  • Add a JDBC Connection Configuration

    • Variable name of created pool: names the connection pool so it can be referenced later

    • Database URL: jdbc:mysql://127.0.0.1:3306/test

    • Username

    • Password

Using a JDBC Request to query the database directly:

  • Add a JDBC Request (under the sampler menu)

  • Configuration:

    • If the SQL statement returns multiple columns, enter the same number of variable names to store them

    • Configure the connection pool name

    • Configure the SQL statement

    • The names of the variables to save

  • In an HTTP assertion, these variables can then be referenced for verification
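Outside JMeter, the same kind of check can be sketched in Python. This assumes the third-party pymysql package, the local test database from the JDBC URL above, and a hypothetical users table with made-up credentials.

```python
# Conceptual sketch of a JDBC Request (assumes pymysql; table and credentials are hypothetical).
import pymysql

conn = pymysql.connect(
    host="127.0.0.1", port=3306,       # same server as jdbc:mysql://127.0.0.1:3306/test
    user="root", password="secret",
    database="test", charset="utf8mb4",
)
try:
    with conn.cursor() as cursor:
        cursor.execute("SELECT username, status FROM users WHERE id = %s", (1,))
        row = cursor.fetchone()         # the columns map to the saved variable names
        print(row)
        # the extracted values could then back an assertion, as in the HTTP case
finally:
    conn.close()
```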

logic controller

Logic controllers control the execution order of elements

if controller

Add: thread group--->logic controller--->if controller

Configuration:

  • Using a JavaScript expression: "${name}"=="baidu"

  • Using a JMeter function: ${__jexl3("${name}"=="baidu",)}

  • The function approach is recommended

Loop controller

Makes the HTTP requests under it execute a specified number of times

Add: Thread Group ---> Logic Controller ---> Loop Controller

Configuration: times

Comparing the loop count m configured in the loop controller with the loop count n configured in the thread group:

  • Relationship: if both are configured, the HTTP requests under the loop controller actually execute n * m times

  • Difference: the two loop counts have different scopes

ForEachController

Used together with user-defined variables or a regular expression extractor to loop over the returned variable values one or more times.

  1. Work with user-defined variables

    Add: Thread Group--->Logic Controller--->ForEach Controller

    Configuration:

    • The output variable name is what the HTTP request references

    • Input variable prefix: the fixed prefix configured in the user-defined variables

    • Start index: the minimum of the consecutive numbers minus 1

    • End index: the maximum of the consecutive numbers

    • Output variable name: each variable value is read in turn into this parameter and referenced by the HTTP request

    • User-defined variable names: fixed prefix + consecutive number

    • user defined variable

    • ForEachController

    • HTTP request:

  2. Works with regular expressions

    • First use the regular expression extractor to extract all the data in the response that meets the conditions

    • Add a ForEach controller and configure it to iterate over all the extracted data, saving each value into a variable

    • Under its child node, add an HTTP request that references the variable, so that all the regex-matched data is read in a loop

timer

Synchronizing timer

When a concurrency test with many users is required, add a "Synchronizing Timer" so that the users execute at the same time: it blocks threads until the number of waiting threads reaches the preset value, and only then starts the sampler.

Configuration:

  • Number of users to group by: how many users must have arrived before the requests start being sent simultaneously

  • Timeout:

    • Must be configured: otherwise, when the number of virtual users is not divisible by the grouping number, the leftover users will hang and never execute

    • Must not be too short: it has to be longer than the time needed to load the grouped threads; otherwise the required concurrency is not reached and the waiting threads are released early
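The rendezvous behavior is essentially a thread barrier. The Python sketch below (group size, timeout, and arrival delays are made-up numbers) shows the idea: each user blocks at the barrier until the group is full, or the timeout fires, and then all of them send at once.

```python
# Conceptual sketch of a synchronizing timer using a thread barrier (numbers are made up).
import threading
import time

GROUP_SIZE = 5
barrier = threading.Barrier(GROUP_SIZE, timeout=10)    # timeout avoids hanging forever

def virtual_user(user_id: int) -> None:
    try:
        barrier.wait()                                  # block until GROUP_SIZE users arrive
    except threading.BrokenBarrierError:
        pass                                            # timeout hit: released early
    print(f"user {user_id} fires its request at {time.time():.3f}")

threads = [threading.Thread(target=virtual_user, args=(i,)) for i in range(GROUP_SIZE)]
for t in threads:
    t.start()
    time.sleep(0.5)                                     # users arrive at different times
for t in threads:
    t.join()
```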

Constant Throughput Timer

Used in performance testing to simulate the business pressure generated by users: given a target QPS, it sends requests to the server at a fixed rate.

Add: Thread Group ---> HTTP Sampler ---> Constant Throughput Timer

Configuration: target throughput = QPS * 60 (the value is specified per minute)
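The pacing behind a constant throughput timer can be sketched as follows in Python: given a target QPS, each iteration sleeps just long enough to keep the overall rate constant. The target of 2 requests/s and the dummy request are assumptions for the example.

```python
# Conceptual sketch of constant-throughput pacing (the target rate is a made-up example).
import time

TARGET_QPS = 2.0                        # JMeter would be configured with 2 * 60 = 120 per minute
interval = 1.0 / TARGET_QPS

start = time.monotonic()
for i in range(10):
    # placeholder for the real sampler (e.g. an HTTP request)
    print(f"request {i} at {time.monotonic() - start:.2f}s")
    next_due = start + (i + 1) * interval
    time.sleep(max(0.0, next_due - time.monotonic()))   # wait until the next send slot
```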

distributed

principle:

  • Distributed testing consists of one controller machine and multiple agent machines

  • The controller machine is responsible for distributing test tasks to the agents

  • The agent machines receive the tasks, send requests to the server, receive the responses returned by the server, and then send the test results back to the controller

  • The controller machine aggregates and summarizes the test result data

Distributed related notes:

  • The firewalls on all test machines are turned off

  • All test machines and the server are on the same network

  • All test machines use exactly the same JMeter version and JDK version

  • Turn off the RMI SSL switch in JMeter

distributed configuration

configuration

  • Agent machines

    • server_port: must not conflict; if each agent runs on a separate machine, it does not need to be configured

    • Disable RMI SSL

  • Controller machine

    • remote_hosts: the IP and port of every agent, separated by commas when there are multiple agents

    • Disable RMI SSL

run

  • Agent machines

    • Run jmeter-server.bat

  • Controller machine:

    • Run jmeter.bat

    • To have the agents execute the script: Run ---> Remote Start All

Explanation of common terms in performance testing

Performance testing uses a number of technical terms. To make them easier to understand, here are plain-language explanations; if anything is inaccurate, corrections are welcome.

Concurrency (number of threads): the number of people racing on the track. 
Iteration: how many laps each person runs. 
Loop: within one iteration, a certain part of the script runs repeatedly, i.e. one section of the track is run over and over. 
Parameter value: the data used when sending a request. 
Parameterization: a strategy for supplying parameter values; its specific usage is introduced above. 
Think time: simulates the time a user spends waiting. 
Association (correlation): an input parameter of the next request depends on some return value of the previous request. 
Checkpoint: judges whether a request succeeded; checkpoints (i.e. assertions) are generally only added to query requests. 
Rendezvous point: waits so that all users initiate requests at the same time; the main application scenario is flash sales (seckill). 
Transaction: one or several requests under test defined together as a unit; it is an artificial test definition and can be an entire order flow or a single request within it. 
Load: how busy the server is. If a server can process 8 requests at a time and more requests arrive, the later requests queue up; the longer the queue, the higher the load. 
Average response time (ART): the processing time of each transaction, from sending the request to receiving the response. 
TPS: the number of transactions processed per second. 
Hits per second: the number of requests processed per second, not the number of requests sent by users per second.

Performance learning route: JMeter → Java basics → BeanShell → architecture knowledge → Linux analysis and tuning → locating and tuning various kinds of middleware
