A Better Way to Report Performance Test Results (translation)

 

Summary:

Reporting the results of functional tests is relatively simple, because they end with a clear pass or fail. Reporting performance test results is more nuanced: there are many ways to present these values, but Michael Stahl felt those methods are not particularly effective. He proposes a way to report performance test results that is easy to read.

Effective reporting of test results is one of the holy grails of our profession. Done right, it can improve the quality of a project and help us focus on the things that matter. Done wrong, it adds misunderstanding and reduces the value that testing brings.

Reporting functional test results is relatively simple, because these tests end with a clear pass or fail. Reporting performance test results is more nuanced.

Let's start with a definition: for the purposes of this article, I use the term "performance test" to mean any test that performs a measurement whose result is a numerical value compared against a range considered acceptable. It may measure power consumption, the number of concurrent users a site can serve, the speed at which data can be read from a hard disk, and so on: any measurement of a non-functional requirement.

The first challenge of a performance test is deciding what counts as a "pass." This is often overlooked in the requirements definition phase. I have seen many requirements written like this: "the time to extract data from the database must be less than 10 milliseconds," or "the processing speed of a video file must be at least 100 frames per second." These requirements are incomplete because they do not contain the actual goal we want to achieve. We only know the worst result we are still willing to live with in the product. There are two problems here.

First, suppose I run a test and find the video file is processed at 101 frames per second (recall the requirement is "at least 100 frames per second"). Looks good, right? But does it mean we are close to the edge (the product barely meets the requirement), or that everything is fine? A well-defined requirement would contain both a minimum and a target value, for example, target: 120 frames per second; minimum: 100 frames per second. With such a requirement, a result of 101 frames per second clearly signals that the product is struggling to meet the goal.

Second, when a test fails by a small margin (say, 99 frames per second), the product manager will be under pressure to be "flexible" and accept the product. How often do we hear, "True, we are below the minimum, but we pass most of the time, so we decided it's good enough"? If the full requirement is available (target: 120 frames per second), we can see much more clearly how far the result is from the target and whether the product has a real problem.
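As a rough illustration of how a target-plus-minimum requirement changes the verdict, here is a minimal Python sketch (the three-way classification and the helper name are my own assumptions, not part of the original article; the numbers reuse the frames-per-second example above):

def classify(result, minimum, target, higher_is_better=True):
    """Classify a measured result against a minimum (worst acceptable)
    and a target (what we actually aim for)."""
    if not higher_is_better:
        # For metrics where lower is better (CPU %, power), flip the comparison.
        result, minimum, target = -result, -minimum, -target
    if result < minimum:
        return "FAIL"        # below the worst acceptable value
    if result >= target:
        return "PASS"        # meets the goal comfortably
    return "MARGINAL"        # acceptable, but far from the goal

# The 101 fps example from the text: passes the minimum, but only barely.
print(classify(101, minimum=100, target=120))   # -> MARGINAL
print(classify(99,  minimum=100, target=120))   # -> FAIL
print(classify(125, minimum=100, target=120))   # -> PASS

With only a minimum in the requirement, the first two results look like "pass" and "fail" with nothing in between; the target is what exposes the marginal case.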

For completeness, I will mention that a non-functional requirement needs not only a target and a minimum but also a test method, because the test method affects the results. For example, when measuring CPU usage, the results vary greatly depending on how we measure. Are we recording the maximum observed? Over how long a period? Do we average the measurements? How many samples per second? What else is running on the CPU in parallel with our test?
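To see why the test method matters, here is a small Python sketch (the sample values are made up purely for illustration) showing how the same raw CPU-usage samples yield quite different reported "results" depending on how they are aggregated:

# Hypothetical CPU-usage samples (%), taken once per second over 10 seconds.
samples = [6, 7, 9, 23, 8, 7, 6, 7, 8, 7]

maximum  = max(samples)                  # 23   -> fails a 10% limit
average  = sum(samples) / len(samples)   # 8.8  -> passes a 10% limit
windowed = sum(samples[:3]) / 3          # 7.3  -> depends on when we sampled

print(f"max={maximum}, avg={average:.1f}, first-3s-avg={windowed:.1f}")

Without the measurement method spelled out in the requirement, all three numbers are defensible answers to "what was the CPU usage?"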

In theory, reporting performance results should not be a problem: just show the result and state pass or fail. But as noted above, we don't only want to know the result; we want a sense of how the result relates to the goal. Keeping the report from becoming overly complex while still conveying a complete picture of the state is a balancing act.

We can use a table:

Requirement                                 | Target | Minimum | Result
Video processing speed (frames per second)  | 120    | 100     | 101
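A sketch of how such a table could be produced automatically (Python assumed; the requirement records reuse numbers that appear later in this article, and the formatting choices are my own):

# Each requirement carries its target, its worst acceptable value, and the measured result.
requirements = [
    ("Video processing speed (fps)", 120, 100, 101),
    ("CPU utilization (%)",            7,  10, 8.55),
    ("Power consumption",            1.5, 1.9, 1.34),
]

print(f"{'Requirement':<32}{'Target':>8}{'Minimum':>9}{'Result':>8}")
for name, target, minimum, result in requirements:
    print(f"{name:<32}{target:>8}{minimum:>9}{result:>8}")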

In any case, because most products have many performance requirements, we end up with a large table filled with numbers. It is hard to see quickly where the problems are. We can use color to improve readability:

Requirement                                 | Target | Minimum | Result
Frame processing speed (frames per second)  | 120    | 100     | 101
CPU utilization (%)                         | 7      | 10      | 8.55
Power consumption                           | 1.5    | 1.9     | 1.34

But this raises more questions. Should the CPU utilization and the frame processing speed really get the same color code? One nearly fails, while the other sits comfortably within the acceptable range. Maybe mark the frame processing speed in red? But then what color do we use for an actual failure? And how far from the limit should a result be before it changes from green to yellow? Not to mention the difficulties this creates for people who are color-blind.

I thought about this when I got the results of my blood test, which I meticulously do every three years. The results the laboratory sends me include dozens of measurements, presented in a table like this:

[Image: the laboratory's blood test report, where each result is shown against its normal range]

Even though I am not a physician, I can tell which results are good, which are borderline, and which are something I should discuss with my doctor.

A light bulb went on in my head: why not use this method to report performance test results? I took a few data points and presented them on slider-style scales:

[Image: the proposed report, where each characteristic (energy consumption, transmission rate, CPU usage, memory usage) is plotted as a point on its own scale]

Note that I still use color, but the axis itself explains the color choice and shows where a lighter shade is better and where a darker shade is better, conveying the same information in a second way. Readers can see clearly where each measurement falls within the allowable range; color mainly serves to draw attention to the problem areas. Producing such a report may take some effort, but it can be automated.
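As a minimal sketch of how such a report could be automated (Python with matplotlib assumed; the metric names, ranges, and styling are illustrative assumptions, not the author's actual implementation):

import matplotlib.pyplot as plt

# (name, worst acceptable value, target value, measured result)
metrics = [
    ("Frame rate (fps)",     100, 120, 101),
    ("CPU utilization (%)",   10,   7, 8.55),
    ("Power consumption",    1.9, 1.5, 1.34),
]

fig, axes = plt.subplots(len(metrics), 1, figsize=(6, 2 * len(metrics)))
for ax, (name, worst, target, result) in zip(axes, metrics):
    lo, hi = min(worst, target), max(worst, target)
    span = hi - lo
    # Extend the axis a little beyond the [worst, target] range.
    ax.set_xlim(lo - 0.25 * span, hi + 0.25 * span)
    ax.set_ylim(0, 1)
    ax.get_yaxis().set_visible(False)
    # Shade the acceptable region between "worst acceptable" and "target".
    ax.axvspan(lo, hi, color="lightgray")
    ax.axvline(worst, color="red", linewidth=2)     # worst acceptable value
    ax.axvline(target, color="green", linewidth=2)  # target value
    ax.plot(result, 0.5, "ko", markersize=10)       # the measured result
    ax.set_title(name, loc="left")

fig.tight_layout()
fig.savefig("performance_report.png")

Each metric gets its own axis, so a reader can see at a glance whether the result sits near the red (worst acceptable) line or the green (target) line, regardless of which direction is "better" for that metric.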

I have yet to see this idea implemented in a real project (I am still exploring it), but if you do try it, I would be glad to hear about your experience and your organization's reaction.


Origin www.cnblogs.com/fengye151/p/11519067.html