Performance test report templates

Performance Test Report

Project: XXX project two
Version: V1.00
Author: dayu
Date: 2019.9.31

1.  Testing Overview

1.1  Test Objectives

Describe the meaning and purpose of this test.

The purpose of this test is to measure the business processing performance of the rebuilt XXX project two system, and to evaluate system performance under high-load conditions.

1.2  Indicators and Terminology

Performance terms involved in this test:

Term | Definition
Concurrency | The number of transaction requests issued to the system at the same moment during the test; simulated by the number of concurrent threads connecting to the system.
TPS (transactions per second) | The number of transactions the system can process per second. TPS largely reflects the system's processing capability.
Error rate | The probability that the system fails to process a transaction, corresponding to a real user experiencing a functional failure. Ideally the error rate should stay very low.
Resource utilization | The proportion of critical server resources in use; used to gauge the capability of the system hardware.
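To make the TPS and error-rate definitions concrete, here is a minimal sketch (the numbers are hypothetical, not taken from this test):

```python
def tps(transactions: int, seconds: float) -> float:
    # Transactions per second: completed transactions over elapsed time.
    return transactions / seconds

def error_rate(errors: int, total: int) -> float:
    # Fraction of transactions the system failed to process.
    return errors / total

# Hypothetical run: 1800 transactions in 120 s, 9 of them failed.
print(tps(1800, 120))       # 15.0
print(error_rate(9, 1800))  # 0.005, i.e. 0.5%
```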

 

 

2.  Environment and Tools

This section lists the servers, clients, and testing tools involved in the test.

2.1  Test Environment

Servers:

Application | Machine | CPU / memory configuration
API | ip address | 16-core CPU, 16 GB memory
MYSQL | ip address | 16-core CPU, 16 GB memory

Client machine:

Operating system | CPU | RAM
Windows 10 Pro | i3-4170 3.70 GHz | 8 GB

 

2.2  Test Tools

Core tool | Version | Remark
JMeter | 3.3 | Generates the concurrent requests
PerfMon Metrics Collector | 2.1 | JMeter plug-in that collects server resource usage information
ServerAgent | 2.2.1 | Runs on the server as a service and sends resource usage information back
nMon | 16h v2 | Collects server resource information in real time

 

3.  Test Plan

3.1  Test Types

Different performance test scenarios may require different types of tests, which need to be stated clearly.

This performance test adopts the following types:

Benchmark test:

Under low concurrency, measure each performance indicator of the system as a baseline for subsequent comparison.

Stress test:

The volume of user access cannot be estimated accurately, so stress testing is used. The stress test increases the number of concurrently processed transactions, raising the load on the system until it reaches a performance bottleneck. On this basis, the number of transactions and user requests the system can carry is determined.

Stability test:

Keep the system under a high-load scenario for an extended period to detect stability defects.

3.2  Business Model

Exactly which system interfaces should be included in the scope of the load test, and in what proportions the different transactions should be called, is one of the difficulties of performance test design.

Based on analysis of the project's architecture and business scenarios, the following business models were designed for simulation and testing:

 

Scenario 1: simple business scenario

Business name | Interface address | Request type | Concurrency proportion
Login | /login | POST | 1
Query user information | /queryMemberInfo | GET | 1

 

 

 

Scenario 2: mixed business scenario

Business name | Interface address | Request type | Concurrency proportion
Login | /login | POST | 1
Query user information | /queryMemberInfo | GET | 1
Transaction query | /listOrders | GET | 1
Create order | /createOrder | POST | 1
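The "concurrency proportion" column can be read as weights for mixing the transactions. A small sketch of how a load driver might pick endpoints according to such a mix (endpoint paths are illustrative; the weights are equal here, but the logic generalizes):

```python
import random

# Hypothetical Scenario 2 mix: endpoint -> concurrency proportion.
MIX = {
    "/login": 1,
    "/queryMemberInfo": 1,
    "/listOrders": 1,
    "/createOrder": 1,
}

def next_endpoint(rng: random.Random) -> str:
    # Weighted random choice according to the concurrency proportions.
    endpoints = list(MIX)
    weights = [MIX[e] for e in endpoints]
    return rng.choices(endpoints, weights=weights, k=1)[0]

rng = random.Random(0)  # fixed seed for a reproducible demo
picks = [next_endpoint(rng) for _ in range(4000)]
# With equal weights each endpoint is hit roughly 25% of the time.
print({e: round(picks.count(e) / 4000, 2) for e in MIX})
```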

 

3.3  Encryption and Signature Verification

Since the system encrypts and signature-verifies all transaction requests, this performance test must encrypt and sign request packets in the same way. The processing logic is as follows:

- Use the same encryption and signing code as the APP, exported as a jar package, as the encryption tool.
- In a JMeter BeanShell preprocessor, invoke the encryption method in that jar to process the request parameters.
- Store the encrypted and signed request parameters in variables for use by subsequent interface calls.
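The real test invokes the APP's own jar from a BeanShell preprocessor; as a stand-in, this sketch uses a plain HMAC-SHA256 signature (the secret, body, and variable names are all hypothetical) just to show the "sign the body, store it in a variable, reuse it" shape:

```python
import hashlib
import hmac

SECRET = b"demo-secret"  # placeholder key, not the project's real one

def sign(body: str) -> str:
    # Stand-in for the jar's encrypt-and-sign method: HMAC-SHA256 hex digest.
    return hmac.new(SECRET, body.encode("utf-8"), hashlib.sha256).hexdigest()

variables = {}  # plays the role of JMeter's vars object
variables["signedBody"] = sign('{"user":"alice"}')
print(len(variables["signedBody"]))  # 64 hex characters for SHA-256
```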

 

3.4  Pressure Gradient

For the scenarios in 3.2, apply load in gradients: start at 100 concurrent users and increase in increments of 100 until the system reaches a bottleneck.
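The gradient can be sketched as a simple generator; the 550-user cutoff used below is only a hypothetical stand-in for whatever check detects the bottleneck during a real run:

```python
def gradient(start=100, step=100, bottleneck_at=550):
    # Yield concurrency levels until the (externally detected) bottleneck.
    load = start
    while load < bottleneck_at:
        yield load
        load += step

print(list(gradient()))  # [100, 200, 300, 400, 500]
```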

 

4.  Test Results

4.1  Aggregate Report

Label | Samples | Average (ms) | Min | Max | Error rate | Throughput (/s)
Login | 50 | 28 | 20 | 38 | 0.00% | 4.5977
Query member information | 50 | 1602 | 1292 | 2042 | 0.00% | 4.07133
View transaction | 50 | 705 | 512 | 920 | 0.00% | 4.37828
Create order | 50 | 86 | 60 | 119 | 0.00% | 4.55083
Overall | 200 | 605 | 20 | 2042 | 0.00% | 15.11716

Scenario 2 - 10 concurrent - 5 cycles
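As a sanity check, the "Overall" row of the table above can be reproduced from the per-label rows: sample counts add up, the average is the sample-weighted mean, and min/max span all labels. (Overall throughput is not the sum of per-label throughputs; JMeter derives it from total samples over the whole elapsed time.)

```python
# Per-label rows from the 10-concurrency aggregate report above.
rows = {
    "login":       dict(n=50, avg=28,   lo=20,   hi=38),
    "queryMember": dict(n=50, avg=1602, lo=1292, hi=2042),
    "viewTrans":   dict(n=50, avg=705,  lo=512,  hi=920),
    "createOrder": dict(n=50, avg=86,   lo=60,   hi=119),
}

total = sum(r["n"] for r in rows.values())
# Overall average = sample-weighted mean of the per-label averages.
overall_avg = sum(r["n"] * r["avg"] for r in rows.values()) / total
print(total, round(overall_avg))            # 200 605
print(min(r["lo"] for r in rows.values()),
      max(r["hi"] for r in rows.values()))  # 20 2042
```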

 

Label | Samples | Average (ms) | Min | Max | Error rate | Throughput (/s)
Login | 500 | 7612 | 40 | 26725 | 0.00% | 15.84987
Query user information | 500 | 30871 | 2369 | 49719 | 0.00% | 6.96233
Overall | 1000 | 19241 | 40 | 49719 | 0.00% | 13.91517

Scenario 1 - 500 concurrent - 1 cycle

 

Label | Samples | Average (ms) | Min | Max | Error rate | Throughput (/s)
Login | 550 | 8326 | 33 | 22360 | 0.00% | 20.34851
Query user information | 550 | 36071 | 4362 | 58485 | 0.36% | 6.7585
Overall | 1100 | 22199 | 33 | 58485 | 0.18% | 13.51069

Scenario 1 - 550 concurrent - 1 cycle

 

Label | Samples | Average (ms) | Min | Max | Error rate | Throughput (/s)
Login | 4500 | 12408 | 87 | 46269 | 0.00% | 4.68807
Query user information | 4500 | 35383 | 3792 | 65036 | 0.00% | 4.63027
View transaction | 4500 | 22832 | 711 | 46812 | 0.02% | 4.64518
Create order | 4500 | 24973 | 81 | 58698 | 0.13% | 4.67591
Overall | 18000 | 23899 | 81 | 65036 | 0.04% | 18.50308

Scenario 2 - 450 concurrent - 10 cycles

 

4.2  System Throughput

[Chart: Scenario 1 - 550 concurrent - 1 cycle]

[Chart: Scenario 2 - 450 concurrent - 10 cycles]

 

4.3  Resource Utilization

Under optimal load conditions:

[Chart: CPU usage]

[Chart: Memory usage]

[Chart: Disk usage]

5.  Analysis and Recommendations

Combining the collected data, this section gives an analysis of the key points of system performance.

5.1  Test Conclusions

After multiple test runs and analysis of the data reports, the following conclusions can be drawn:

1) The system performs best when the overall number of concurrent users is 450-500. Once transaction concurrency exceeds 500, the transaction failure rate rises overall and the system reaches its performance inflection point.

2) Under the mixed-transaction workload, peak system TPS is around 90, with an average throughput of 13-18/s.

3) Under light load (10 concurrent users), the longest transaction response time is 2042 ms, for the query-user-information transaction, and the overall average is around 600 ms. Individual transaction response times are good.

4) Under full load, login performs best, with an average response time of 7000-12000 ms; the query-user-information transaction performs worst, with average response times in the 30000-40000 ms range. Overall, individual transaction response times under full load are poor. Since the query-user interface is used extremely frequently, SQL tuning for it is recommended.

5) On the resource side, memory utilization stays at a high level (above 90%), and disk space is continuously consumed by log writes.

5.2  Issues

The following significant problems were found during testing:

1) The encryption and signature verification function is not in effect: at present any signature passes verification. This is a functional issue and does not affect performance.

2) Continuous log writes fill the disk; it is recommended to lower the system log level and back up logs regularly.

3) Memory utilization stays high; the cause needs further investigation.

 
