The easiest way to design test cases

NO.1 Test case design methods

  The following are eight common test case design methods: equivalence partitioning, boundary value analysis, decision tables, cause-effect graphing, orthogonal testing, state transition testing, scenario-based testing, and error guessing. Each is detailed below.

  Equivalence Partitioning:

  Strategy: Divide the input domain into equivalence classes whose members the system is expected to treat the same way, and choose one test case to represent each class.

  Description: Cover each equivalence class with a representative test case to reduce redundant testing.

  Example: Suppose a login page has two input fields, username and password. The username input can be divided into the equivalence classes legal username, empty username, and illegal username; the password input into legal password, empty password, and illegal password. Select one test case from each equivalence class, for example the combination of a legal username and a legal password.
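  As a concrete illustration, here is a minimal pytest sketch of this example. The validate_login function and its 3-20 character alphanumeric rule are assumptions made purely for illustration; in practice you would call the real login logic.

```python
import pytest

def validate_login(username: str, password: str) -> bool:
    """Toy stand-in for the real login check: 3-20 alphanumeric characters."""
    ok = lambda s: s.isalnum() and 3 <= len(s) <= 20
    return ok(username) and ok(password)

# One representative value per equivalence class.
USERNAMES = [("legal", "alice01"), ("empty", ""), ("illegal", "a!@#")]
PASSWORDS = [("legal", "Secret123"), ("empty", ""), ("illegal", "p w d!")]

@pytest.mark.parametrize("user_class,username", USERNAMES)
@pytest.mark.parametrize("pwd_class,password", PASSWORDS)
def test_login_equivalence_classes(user_class, username, pwd_class, password):
    # Only the legal/legal combination is expected to succeed.
    expected = user_class == "legal" and pwd_class == "legal"
    assert validate_login(username, password) == expected
```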

  Boundary Value Analysis:

  Strategy: Focus on the boundary cases of the input values: the minimum, the maximum, and values just inside or outside those boundaries.

  Description: Test boundary values and the special cases close to them, where errors are often found.

  Example: Suppose a function accepts an integer parameter in the range 1 to 100. Select the boundary values and their immediate neighbours as test cases, for example 0, 1, 2, 99, 100, and 101, to cover the boundaries and the values just beyond them.
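  A minimal pytest sketch of this example follows; the accepts function is a hypothetical stand-in for the function under test, assumed to treat 1 to 100 as the valid range.

```python
import pytest

def accepts(value: int) -> bool:
    """Toy stand-in for the function under test: valid range is 1..100."""
    return 1 <= value <= 100

@pytest.mark.parametrize("value,expected", [
    (0, False),    # just below the lower boundary
    (1, True),     # lower boundary
    (2, True),     # just above the lower boundary
    (99, True),    # just below the upper boundary
    (100, True),   # upper boundary
    (101, False),  # just above the upper boundary
])
def test_boundary_values(value, expected):
    assert accepts(value) == expected
```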

  Decision Table:

  Strategy: Build a decision table from the system's decision logic and combinations of conditions, and generate test cases from it.

  Description: Organize the conditions and corresponding actions of the system into a table form, and then select test cases according to the combination of conditions.

  Example: Suppose you have a decision table that judges the weather based on temperature and humidity. Conditions in the table include temperature and humidity ranges, and actions are weather judgments. Design test cases to cover different combinations of conditions and corresponding actions.
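  A minimal sketch, assuming a hypothetical decide_weather function with invented temperature and humidity thresholds; each row of DECISION_TABLE below is one column of the decision table (conditions plus the expected action).

```python
def decide_weather(temp_c: float, humidity_pct: float) -> str:
    """Toy stand-in for the logic under test (thresholds are made up)."""
    if temp_c >= 30 and humidity_pct >= 70:
        return "muggy"
    if temp_c >= 30:
        return "hot and dry"
    if humidity_pct >= 70:
        return "damp"
    return "mild"

# Decision table rows: (temperature, humidity) conditions -> expected action.
DECISION_TABLE = [
    (35, 80, "muggy"),
    (35, 40, "hot and dry"),
    (20, 80, "damp"),
    (20, 40, "mild"),
]

def test_decision_table():
    for temp, humidity, expected in DECISION_TABLE:
        assert decide_weather(temp, humidity) == expected
```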

  Cause-Effect Graphing:

  Strategy: Draw a cause-effect graph to visualize the causal relationships between the system's inputs (causes) and outputs (effects), and derive test cases from it.

  Description: Design test cases by considering which combinations of input conditions produce which outputs.

  Example: Suppose there is an e-commerce platform where users filter products based on price and rating. Draw a cause-and-effect diagram showing price and rating as input criteria, and filtered items as output. Design test cases to cover different price and rating combinations.
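  A minimal sketch of the cause-effect idea, built around a hypothetical filter_products function: the causes are "price matches the filter" and "rating matches the filter", and the effect "product is shown" holds only when both causes are true.

```python
def filter_products(products, max_price, min_rating):
    """Toy stand-in: effect (shown) = cause 1 (price ok) AND cause 2 (rating ok)."""
    return [p for p in products if p["price"] <= max_price and p["rating"] >= min_rating]

PRODUCTS = [
    {"name": "A", "price": 10, "rating": 4.5},  # both causes true   -> shown
    {"name": "B", "price": 50, "rating": 4.5},  # price cause false  -> hidden
    {"name": "C", "price": 10, "rating": 2.0},  # rating cause false -> hidden
    {"name": "D", "price": 50, "rating": 2.0},  # both causes false  -> hidden
]

def test_cause_effect_combinations():
    shown = {p["name"] for p in filter_products(PRODUCTS, max_price=20, min_rating=4.0)}
    assert shown == {"A"}
```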

  Orthogonal Testing:

  Strategy: Design test cases using orthogonal tables to cover different combinations of inputs to the system.

  Description: Cover as many input combinations as possible with a minimum number of test cases by selecting a suitable orthogonal array.

  Example: Suppose there is a registration page where the user must enter a username, password, and email address. Using an orthogonal array, select different combinations of usernames, passwords, and email addresses to test.
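  The sketch below uses the standard L4(2^3) orthogonal array: four runs cover every pairwise combination of two levels (valid/invalid) of the three factors. The register function and its rules are assumptions for illustration.

```python
# L4(2^3) orthogonal array: rows are (username level, password level, email level),
# where level 0 = valid input and level 1 = invalid input.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

USERNAMES = ["alice", ""]                          # valid, empty
PASSWORDS = ["Secret123", "123"]                   # valid, too short
EMAILS = ["alice@example.com", "not-an-email"]     # valid, malformed

def register(username: str, password: str, email: str) -> bool:
    """Toy stand-in for the registration check on the page under test."""
    return bool(username) and len(password) >= 8 and "@" in email

def test_registration_with_orthogonal_array():
    for u, p, e in L4:
        expected = (u, p, e) == (0, 0, 0)  # only the all-valid run should succeed
        assert register(USERNAMES[u], PASSWORDS[p], EMAILS[e]) == expected
```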

  State Transition Testing:

  Strategy: Design test cases from the system's states and the transitions between them.

  Description: Draw a state transition diagram to show system states and state transitions, then design test cases to cover different states and transition paths.

  Example: Consider an elevator system with three states: open, closed, and running. Design test cases to cover various transition paths from one state to another by drawing a state transition diagram.
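  A minimal sketch of such a state machine and its tests. The event names and the allowed transitions (open <-> closed, closed <-> running) are assumptions made for illustration.

```python
import pytest

# Allowed transitions: (current state, event) -> next state.
TRANSITIONS = {
    ("closed", "open"): "open",
    ("open", "close"): "closed",
    ("closed", "run"): "running",
    ("running", "stop"): "closed",
}

class Elevator:
    def __init__(self):
        self.state = "closed"

    def handle(self, event: str) -> None:
        key = (self.state, event)
        if key not in TRANSITIONS:
            raise ValueError(f"illegal transition: {key}")
        self.state = TRANSITIONS[key]

def test_valid_transition_path():
    lift = Elevator()
    for event, expected in [("open", "open"), ("close", "closed"),
                            ("run", "running"), ("stop", "closed")]:
        lift.handle(event)
        assert lift.state == expected

def test_cannot_run_with_doors_open():
    lift = Elevator()
    lift.handle("open")
    with pytest.raises(ValueError):
        lift.handle("run")
```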

  Scenario-based Testing:

  Strategy: Design test cases based on real usage scenarios, reflecting how users actually operate and use the system.

  Description: Design representative scenarios and corresponding test cases to simulate real-world usage.

  Example: Suppose there is an online shopping website. A typical scenario is: a user logs in, browses products, adds items to the shopping cart, and settles the order. Design corresponding test cases to simulate this scenario.
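  A minimal sketch of a scenario test. The ShoppingSite class is an in-memory stand-in for the real application, which would normally be driven through its API or UI.

```python
class ShoppingSite:
    """Toy stand-in for the online shop used to walk through the scenario."""

    def __init__(self):
        self.user = None
        self.cart = []

    def login(self, username):
        self.user = username

    def browse(self):
        return ["book", "pen"]

    def add_to_cart(self, item):
        self.cart.append(item)

    def checkout(self):
        assert self.user is not None, "must be logged in to check out"
        order = {"user": self.user, "items": list(self.cart)}
        self.cart.clear()
        return order

def test_scenario_login_browse_add_checkout():
    site = ShoppingSite()
    site.login("alice")            # step 1: log in
    products = site.browse()       # step 2: browse products
    site.add_to_cart(products[0])  # step 3: add to shopping cart
    order = site.checkout()        # step 4: settle the order
    assert order == {"user": "alice", "items": ["book"]}
    assert site.cart == []         # cart is emptied after settlement
```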

  Error Guessing:

  Strategy: Based on the tester's experience and intuition, speculate on possible errors and design test cases to verify these guesses.

  Description: Based on past experience and knowledge, testers speculate about possible errors and design test cases to verify as many of these guesses as possible.

  Example: Suppose there is an email sending function. Errors the tester might guess at include: sending an empty email, sending an email containing special characters, and sending an email that exceeds the size limit. Design test cases to verify these error conditions.
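  A minimal sketch of error guessing against a hypothetical send_email function; the size limit and validation rules are assumptions, and each case encodes one "likely to break" input guessed from experience.

```python
import pytest

MAX_BODY_BYTES = 1024  # assumed size limit, for illustration only

def send_email(to: str, body: str) -> bool:
    """Toy stand-in for the email sending function under test."""
    if "@" not in to:
        raise ValueError("invalid recipient")
    if not body.strip():
        raise ValueError("empty body")
    if len(body.encode("utf-8")) > MAX_BODY_BYTES:
        raise ValueError("body too large")
    return True

@pytest.mark.parametrize("to,body", [
    ("user@example.com", ""),                          # guessed error: empty email
    ("user@example.com", "x" * (MAX_BODY_BYTES + 1)),  # guessed error: oversized body
    ("no-at-sign", "hello"),                           # guessed error: malformed recipient
])
def test_guessed_error_cases_are_rejected(to, body):
    with pytest.raises(ValueError):
        send_email(to, body)

def test_special_characters_are_accepted():
    assert send_email("user@example.com", "emoji 🙂 and 特殊字符")
```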

  Below is an overview table summarizing these eight test case design approaches and their examples:

| Method | Core idea | Example |
|---|---|---|
| Equivalence Partitioning | One representative case per equivalence class | Legal / empty / illegal usernames and passwords on a login page |
| Boundary Value Analysis | Test boundaries and their immediate neighbours | 0, 1, 2, 99, 100, 101 for a 1-100 range |
| Decision Table | Enumerate condition combinations and the resulting actions | Weather judgment from temperature and humidity |
| Cause-Effect Graphing | Map input causes to output effects | Price and rating filters on an e-commerce platform |
| Orthogonal Testing | Cover input combinations with few cases via an orthogonal array | Username, password, and email combinations on a registration page |
| State Transition Testing | Cover states and transition paths | Elevator transitions between open, closed, and running |
| Scenario-based Testing | Simulate real end-to-end usage | Log in, browse, add to cart, settle the order |
| Error Guessing | Cases guessed from experience | Empty, special-character, and oversized emails |



Origin blog.csdn.net/okcross0/article/details/131961938