Interface Testing Summary and Use Case Design Methods

 Interface Test Summary Document

  Part One: Starting from common questions, it introduces interface testing and briefly compares it with front-end testing, summarizing the differences and connections between the two. However, this part only explains what we test and how we test it, not why.

  Part Two: Explains why interface testing is needed, and briefly summarizes interface continuous integration and interface quality assessment.

Part One:

First of all, during the process of interface testing, back-end developers often ask:

What do you test the back-end interfaces for? How do you test them?

The back-end interfaces are tested, and the front end is tested again. Isn't that duplicate testing?

Therefore, to answer these questions for developers and share some basic testing knowledge, we have organized the main content of interface testing and how it differs from front-end testing, so that the development and testing teams can reach a shared understanding of testing, improve collaboration efficiency, and better ensure product quality.

Then, we try to answer the above questions:

Question 1.1. What do we test the back-end interfaces for?

--To answer this, we can look at what our interface testing activities actually cover. The following figure reflects the main content of the current back-end interface testing in our project:

Question 1.2. How do we do interface testing?

--Since the front end and back end of our project communicate mainly through HTTP-based interfaces, interface testing mostly means simulating the sending and receiving of HTTP requests with tools or code. Common tools include Postman, JMeter, SoapUI, Java + HttpClient, Robot Framework + HttpLibrary, and so on.
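As a minimal illustration of the code-based approach, the sketch below sends an HTTP request with Python's requests library; the base URL, endpoint path, and response fields are hypothetical placeholders, not taken from any real project:

```python
import requests

BASE_URL = "http://test-env.example.com"  # hypothetical test environment

def test_login_success():
    """Send an HTTP POST directly to the back end, bypassing any UI."""
    resp = requests.post(
        f"{BASE_URL}/api/login",
        json={"username": "tester", "password": "secret"},
        timeout=5,
    )
    assert resp.status_code == 200
    body = resp.json()
    assert body.get("code") == 0          # business-level success code (assumed convention)
    assert "token" in body.get("data", {})
```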

Question 2. The back-end interfaces are tested, and the front end is tested again. Is that duplicate testing?

--To answer this, we can directly compare interface testing with app-side testing. The following figure shows what needs to be covered or considered during app testing:

Comparing the two figures above, we can see that the two testing activities overlap in functional testing, boundary analysis testing, and performance testing; the remaining parts need dedicated testing because of their different characteristics or concerns and are not discussed here. Next, we analyze the three overlapping parts:

1. Basic function testing:

Since both are testing basic business functions, this is where the two kinds of testing overlap most. This is usually the part developers have in mind when they ask about duplicate testing.

2. Boundary analysis testing:

On top of basic functional testing, the boundary conditions of inputs and outputs are considered. This part also has some overlap (for example, the boundaries of business rules). However, the front end often restricts input and output to fixed values for the user to choose from (such as drop-down boxes), so the boundary range that can be tested there is very limited. Interface testing has no such limitation: it can cover a much wider range, and accordingly the probability of finding problems at the interface level is also higher.

3. Performance testing:

This one is easier to distinguish. Although both require performance testing, the focus is quite different. App-side performance testing mainly looks at phone-related metrics such as CPU, memory, network traffic, and FPS, while interface performance testing focuses on response time, concurrency, server resource usage, and so on. The strategies and methods of the two are very different, so this part must be tested separately; strictly speaking it is not duplicated at all.

Summary:

1. Interface testing and app testing do overlap somewhat, mainly in business function testing. Beyond that, the characteristics each one covers are different and require separate, targeted testing to ensure the quality of the whole product.

2. Interface testing can focus on verifying server-side logic, while UI testing can focus on page display logic and on verifying the integration between the front end and the server.

Part Two:

1. What is interface testing?

Interface testing is a type of testing that targets the interfaces between system components. It mainly covers the interaction points between the system and external systems and between internal subsystems. The focus is on checking data exchange, transmission and control/management processes, and the logical dependencies between systems.

2. Why do we need interface testing?

a) System complexity keeps rising, so the cost of traditional testing methods increases sharply while their efficiency drops significantly. Interface testing offers a way out in this situation.

b) Interface testing is relatively easy to automate and integrate continuously, and it is more stable than UI automation. It reduces the labor and time spent on manual regression, shortens the test cycle, and supports the back end's need for rapid releases. Continuous integration of interface tests is the key to low cost and high return.

c) Many systems now use a separated front-end/back-end architecture. From a security perspective:

  1. Relying only on front-end restrictions cannot meet the system's security requirements (the front end is far too easy to bypass); controls are also needed on the back end, which means verification must be done at the interface level (see the sketch after this list).

2. Whether data transmitted between the front end and back end, and information written to logs, is properly encrypted also needs to be verified, especially where users' private information such as ID numbers and bank card numbers is involved.
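To illustrate point 1, here is a hedged sketch of an interface-level check: it sends a value the front end would never allow (a negative transfer amount) directly to a hypothetical endpoint and expects the back end itself to reject it. The endpoint, fields, and code/message convention are all assumptions:

```python
import requests

BASE_URL = "http://test-env.example.com"  # hypothetical test environment

def test_backend_rejects_value_frontend_never_sends():
    """The UI only offers valid amounts; the API must still reject anything else."""
    resp = requests.post(
        f"{BASE_URL}/api/transfer",
        json={"to_account": "6222020200112233", "amount": -100},  # negative amount, impossible via the UI
        timeout=5,
    )
    # The back end itself must refuse the request rather than rely on the front end.
    assert resp.status_code != 200 or resp.json().get("code") != 0
```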

3. Continuous integration of interface tests:

For interface testing, automated continuous integration is the core. Only by keeping the tests automated can we achieve low cost and high return. At present we have implemented interface automation, used mainly in the regression phase. Going forward we need to raise the level of automation, including but not limited to the following:

a) In terms of process: Strengthen the coverage of interface abnormal scenarios in the regression phase, and gradually extend to the system testing and smoke testing phases, and finally achieve full-process automation.

b) Result display: richer result display, trend analysis, quality statistics and analysis, etc.

c) Problem location: Error messages and logs are more accurate, making it easier to reproduce and locate problems.

d) Result verification: Strengthen automated verification capabilities, such as database information verification.

e) Code coverage: Constantly try to move from the current black box to the white box to improve code coverage.

f) Performance requirements: Improve the performance testing system and monitor whether the interface performance indicators are normal through automated means.

4. Interface test quality evaluation standards:

  a) Whether business function coverage is complete

  b) Whether business rule coverage is complete

  c) Whether parameter validation meets the requirements (boundaries, business rules)

  d) Whether coverage of interface exception scenarios is complete

  e) Whether interface coverage meets the requirements

  f) Whether code coverage meets the requirements

  g) Whether performance indicators meet the requirements

  h) Whether security indicators meet the requirements

 

Interface Test Case Design

1. Use case design process:

 

Rome was not built in a day, and use cases are not written in one go; writing test cases is itself an iterative process, just like refining code.

First of all, you must read the requirements specification and interface design documents thoroughly, understand the specific usage scenarios of each interface, and understand the performance indicators of the software.

Secondly, design interface test cases: At the beginning of the coding phase, testers design interface test cases based on the requirements specification and interface design document.

Then, code review: after development coding is complete, a code review should be done if there is enough time. On one hand it checks whether the functional logic of the code is correct; on the other hand, reviewing the code lets us supplement the interface test cases.

      Finally, after the use cases are written, their accuracy keeps improving as our understanding of the system deepens. Test cases need to be reviewed regularly, and whenever the test requirements change, the cases must be maintained again.

2. Interface test case design structure:

Phase 1: While development is still coding, testers work from the requirements document and the interface design document:

1. Basic functional testing (business testing):

Based on interpreting the requirements document and interface design document, understand the business rules and the usage scenarios of each interface, and design use cases that match the business logic and those scenarios.

2. Boundary analysis testing:

Building on the basic functions, start considering the impact of the interface's input and output parameters. Equivalence class partitioning and boundary value analysis are the main methods used; a sketch of how this can be parameterized follows the checklist below.

- Cover all required parameters

- Combinations of optional parameters

- Parameter present vs. null

- Order, number, and type of parameters

- Parameter value size and input value range

- Parameter string length: null, max, max + 1

- Parameters containing special characters
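One possible way to parameterize such boundary and equivalence cases is sketched below with pytest; the endpoint, the nickname parameter, and its 1–20 character limit are invented purely for illustration:

```python
import pytest
import requests

BASE_URL = "http://test-env.example.com"  # hypothetical test environment

# Equivalence classes and boundary values for a hypothetical "nickname" parameter
# whose documented length limit is 1-20 characters.
CASES = [
    ({"nickname": "a"},         200),  # lower boundary: length 1
    ({"nickname": "a" * 20},    200),  # upper boundary: max length
    ({"nickname": "a" * 21},    400),  # max + 1: should be rejected
    ({"nickname": ""},          400),  # empty string
    ({"nickname": None},        400),  # null value
    ({},                        400),  # required parameter missing
    ({"nickname": "a<script>"}, 400),  # special characters (assumed to be disallowed)
]

@pytest.mark.parametrize("payload, expected_status", CASES)
def test_nickname_boundaries(payload, expected_status):
    resp = requests.post(f"{BASE_URL}/api/profile/nickname", json=payload, timeout=5)
    assert resp.status_code == expected_status
```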

3. Parameter combination testing:

      Building on boundary analysis, consider the various combinations of input conditions and the mutual constraints between them. The cause-effect diagram method is mainly used to design these cases, as illustrated by the sketch below.
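Cause-effect graphing is used to prune the combinations; as a simplified, hedged illustration, the sketch below simply enumerates every combination of two small optional filters with itertools.product and checks that the returned records satisfy whichever filters were actually sent. The endpoint and field names are assumptions:

```python
from itertools import product

import pytest
import requests

BASE_URL = "http://test-env.example.com"  # hypothetical test environment

# Two optional filters of a hypothetical query interface; test all combinations.
STATUSES = [None, "PAID", "UNPAID"]
CHANNELS = [None, "APP", "WEB"]

@pytest.mark.parametrize("status, channel", list(product(STATUSES, CHANNELS)))
def test_order_query_filter_combinations(status, channel):
    # Only send the filters that are not None, covering "omitted" as one equivalence class.
    params = {k: v for k, v in {"status": status, "channel": channel}.items() if v is not None}
    resp = requests.get(f"{BASE_URL}/api/orders", params=params, timeout=5)
    assert resp.status_code == 200
    for order in resp.json().get("data", []):
        # Every returned record must satisfy all filters that were actually sent.
        if status is not None:
            assert order["status"] == status
        if channel is not None:
            assert order["channel"] == channel
```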

4. Abnormal situation testing:

Check whether the interface implementation handles exceptions. Even when the input parameters are legal, exceptions can still occur inside the implementation, because internal errors are not necessarily caused by the input data; they may come from other logic. The program must handle any such exception gracefully. For example, if an interface requires logging in to obtain a session first, calling it directly without logging in should return a clear prompt.
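The login example above can be expressed as a test roughly like the following sketch; the endpoint and the business code/message convention are assumed:

```python
import requests

BASE_URL = "http://test-env.example.com"  # hypothetical test environment

def test_call_without_login_returns_clear_prompt():
    """Call a session-protected interface directly, without logging in first."""
    resp = requests.get(f"{BASE_URL}/api/account/balance", timeout=5)
    # The service should fail gracefully with an explicit prompt,
    # not a stack trace or a bare 500 error.
    assert resp.status_code in (200, 401)
    body = resp.json()
    assert body.get("code") != 0                    # assumed "not logged in" business code
    assert "login" in body.get("message", "").lower()
```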

5. Idempotency testing:

     Simply put, test what happens when the same request is submitted repeatedly, especially in scenarios involving transaction amounts, and verify how the software handles the duplicates.
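A hedged sketch of such an idempotency check might look like this; the payment endpoint, the order_no idempotency key, and the response fields are all hypothetical:

```python
import uuid

import requests

BASE_URL = "http://test-env.example.com"  # hypothetical test environment

def test_duplicate_submission_is_not_charged_twice():
    """Submit the identical payment request twice and verify only one transaction results."""
    payload = {
        "order_no": str(uuid.uuid4()),  # hypothetical client-side idempotency key
        "amount": 99.00,
    }
    first = requests.post(f"{BASE_URL}/api/pay", json=payload, timeout=5).json()
    second = requests.post(f"{BASE_URL}/api/pay", json=payload, timeout=5).json()

    assert first.get("code") == 0
    # The repeat must either be rejected or return the original result,
    # but it must never deduct the amount a second time.
    assert second.get("code") != 0 or second["data"]["transaction_id"] == first["data"]["transaction_id"]
```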

6. Concurrency testing:

   When two or more users operate on the same resource at the same time, problems such as resource contention and deadlock can occur.
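One way to provoke such contention is to fire the same operation from several users at once, for example with a thread pool as in the sketch below; the coupon endpoint, tokens, and success convention are invented for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

import requests

BASE_URL = "http://test-env.example.com"  # hypothetical test environment

def redeem_coupon(user_token: str) -> dict:
    resp = requests.post(
        f"{BASE_URL}/api/coupon/redeem",
        json={"coupon_id": "LAST_ONE"},           # a coupon with only one unit left (assumed fixture)
        headers={"Authorization": user_token},
        timeout=5,
    )
    return resp.json()

def test_only_one_user_gets_the_last_coupon():
    tokens = ["token-user-a", "token-user-b", "token-user-c"]  # assumed pre-issued tokens
    with ThreadPoolExecutor(max_workers=len(tokens)) as pool:
        results = list(pool.map(redeem_coupon, tokens))
    successes = [r for r in results if r.get("code") == 0]
    # Exactly one request may succeed; the rest must fail cleanly, with no deadlock or oversell.
    assert len(successes) == 1
```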

 

7. Transactional testing:

 A business process consists of multiple operation steps. If one step fails, the whole operation needs to be rolled back, or the reverse interface of the previous step must be called to cancel it.

 

8. Testing with large amounts of data

 When the database holds a large amount of data (millions of rows), test the efficiency of insert, delete, update, and query operations on the DB.

9. Environmental abnormality testing

  When a dependent system is down, times out, or is unresponsive, the interface should return correct prompts, the business logic should still be correct, and there must be no transactional inconsistencies.

Phase 2: After coding is complete, and if there is enough testing time, the developed code should be reviewed:

1. Review whether the actual business logic of the developed code is correct.

2. Implicit condition testing:

 Review the code to check whether it contains any implicit default conditions. For example, for the getRecommendArticleList interface in Project F, the query in the code returns 4 records by default (as shown below), but this is not mentioned in the interface document. If we do not review the code and the developer does not tell us, this behavior will certainly be missed in testing.
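Once the review reveals such a default, it is worth pinning it down with a test. The sketch below is a hypothetical reconstruction around the getRecommendArticleList example; the base URL, response shape, and the default of 4 records come from the description above, not from real code:

```python
import requests

BASE_URL = "http://test-env.example.com"  # hypothetical test environment

def test_recommend_list_default_size_found_in_code_review():
    """Code review revealed an undocumented default of 4 records; pin it down with a test."""
    resp = requests.get(f"{BASE_URL}/getRecommendArticleList", timeout=5)  # no size parameter sent
    assert resp.status_code == 200
    articles = resp.json().get("data", [])
    # Without the review, a tester would not know that omitting the size
    # parameter silently limits the result to 4 records.
    assert len(articles) == 4
```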

3. SQL review:

For interfaces that perform database operations, check the relevant SQL and verify its correctness. As shown below, the SQL filtering conditions are usually more numerous than what development tells us, so checking the SQL directly is the safest way to verify. In particular, design scenarios with combined conditions for verification:

3. Verification points during test execution:

1. Returned data level:

a) Whether the hierarchy of the returned JSON data is consistent with the document

b) Numeric data: especially amounts, negative numbers, and decimals; whether they are serialized to JSON and output correctly

c) The data returned by the interface is consistent with the interface document

d) The data returned by the interface is consistent with the database

e) The data returned by the interface conforms to the business logic (such as the transfer function, deducting money from one account and adding the corresponding amount to another)

f) For lists, you should verify whether the length of the list is consistent with the expected value based on the request parameters.

g) Negative test cases should verify that the returned error info matches what is expected (the sketch below combines several of these checks)
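Several of these checks can be combined in one test, as in the hedged sketch below; the order endpoints, field names, and paging parameter are assumptions:

```python
import requests

BASE_URL = "http://test-env.example.com"  # hypothetical test environment

def test_order_responses_match_interface_document():
    resp = requests.get(f"{BASE_URL}/api/orders/1001", timeout=5)
    assert resp.status_code == 200
    data = resp.json()["data"]

    # a) / c) hierarchy and fields match the interface document (field names assumed)
    assert set(data) >= {"order_no", "amount", "items"}

    # b) numeric fields keep the correct type after JSON serialization
    assert isinstance(data["amount"], (int, float))

    # f) list length matches what the request parameters imply
    page = requests.get(f"{BASE_URL}/api/orders", params={"page_size": 5}, timeout=5).json()
    assert len(page["data"]) <= 5

    # g) a negative case returns the documented error info
    bad = requests.get(f"{BASE_URL}/api/orders/does-not-exist", timeout=5).json()
    assert bad["code"] != 0 and bad["message"]
```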

2. Database level:

a) Consistency between the data passed in through the interface and the data inserted into the DB.

b) When a front-end operation touches multiple tables in the back-end DB, the data correctness of every table must be checked (a sketch of this kind of check follows).
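A sketch of such an interface-vs-DB consistency check is shown below, assuming a MySQL back end accessed with pymysql; the endpoint, table, columns, and connection details are placeholders:

```python
import pymysql   # assuming a MySQL back end, purely for illustration
import requests

BASE_URL = "http://test-env.example.com"  # hypothetical test environment

def test_created_order_is_written_to_db_consistently():
    # 1. Create an order through the interface.
    resp = requests.post(f"{BASE_URL}/api/orders", json={"sku": "A001", "quantity": 2}, timeout=5)
    order_no = resp.json()["data"]["order_no"]

    # 2. Read the same order straight from the database (connection details are placeholders).
    conn = pymysql.connect(host="test-db", user="qa", password="***", database="shop")
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT sku, quantity FROM orders WHERE order_no = %s", (order_no,))
            row = cur.fetchone()
    finally:
        conn.close()

    # 3. What the interface accepted must be exactly what landed in the table.
    assert row == ("A001", 2)
```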

3. Security level:

a) When the data returned by the back-end interface to the front end contains sensitive information (such as name, ID number, card number, mobile phone number, encrypted password, etc.), it must not be transmitted in clear text and needs to be encrypted (a minimal check is sketched after this list).

b) Back-end logs must not print sensitive information, or it must be masked with asterisks before being printed. Specifically:

1) ID number, user password (including encrypted), user mobile phone number, user name, bank card number

2) When the masked part of the ID number is the birth date, the birth date itself must not be printed in the log either (otherwise the masking is defeated).
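A minimal clear-text check for point a) might look like the sketch below: it scans a hypothetical profile response for an unmasked 18-digit ID number. The endpoint and the masking rule are assumptions:

```python
import re

import requests

BASE_URL = "http://test-env.example.com"  # hypothetical test environment

ID_NUMBER_PATTERN = re.compile(r"\d{17}[\dXx]")  # an 18-digit mainland ID number

def test_response_does_not_leak_full_id_number():
    resp = requests.get(f"{BASE_URL}/api/user/profile", timeout=5)
    # The full ID number must never appear in clear text; a masked form
    # such as "3301**********1234" is expected instead (masking rule assumed).
    assert not ID_NUMBER_PATTERN.search(resp.text)
```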

4. Performance level:

a) Interface response time: the time the interface takes to process data also needs attention during testing; internally this comes down to optimizing algorithms and code.

b) Interface payload size: the size of the data transmitted by the interface also needs attention, especially for data returned to the front end; the payload size of different interfaces needs to be limited (see the sketch after this list).

c) Concurrency capacity: under concurrent access by multiple users, the interface should sustain the agreed level of concurrency.
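Response time and payload size (points a and b) can get a coarse guard even inside functional interface tests, as in the sketch below; the endpoint and thresholds are assumptions, and real concurrency testing still belongs in a dedicated tool such as JMeter:

```python
import requests

BASE_URL = "http://test-env.example.com"   # hypothetical test environment
MAX_SECONDS = 0.5                          # assumed response-time requirement
MAX_BODY_BYTES = 100 * 1024                # assumed payload-size limit for this interface

def test_home_feed_response_time_and_size():
    resp = requests.get(f"{BASE_URL}/api/home/feed", timeout=5)
    assert resp.status_code == 200
    # a) response time: requests records the round-trip duration in resp.elapsed
    assert resp.elapsed.total_seconds() <= MAX_SECONDS
    # b) payload size: keep the data returned to the front end within the agreed limit
    assert len(resp.content) <= MAX_BODY_BYTES
```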

 


 
