2022 National Vocational College Skills Competition (Higher Vocational Group) "Software Testing" Competition Task Book


July 2022

1. Competition time, content and results

(1) Competition time

The total competition time for this stage is 8 hours. Contestants manage their own task schedule. No separate time is allotted for rest, drinking water, toilet breaks, etc.; these are included in the competition time.

(2) Competition content

The skills assessed in this competition include: test environment setup and configuration, and application system installation and deployment; unit testing, including designing the program code, designing test data and unit test methods, completing compilation and program execution, and taking screenshots of the results; test document design and writing; functional test case design, functional test execution for Web and App applications, and bug submission; automated testing, including requirement analysis, test tool use, code writing and test execution; performance testing, including requirement analysis, test tool use and test execution; interface testing, including requirement analysis, test tool use and test execution; and professional qualities such as teamwork and applied innovation ability.

(3) Composition of competition results

The total score for the "Software Testing" competition is 100 points, of which professionalism accounts for 5 points and the competition tasks account for 95 points. The weight of each competition task in the total score is as follows:

Task 1: Environment construction and system deployment, weight 5%
Task 2: Unit testing, weight 10%
Task 3: Design test documents, weight 15%
Task 4: Design test cases, weight 10%; Execute test cases, weight 15%
Task 5: Automated testing, weight 20%
Task 6: Performance testing, weight 15%
Task 7: Interface testing, weight 5%
Professionalism, weight 5%

2. Competition Notice

1. The addresses, user names and passwords of the competition platform and of the systems under test for functional testing, automated testing and performance testing are subject to what is distributed on site;

2. Neither the results submitted for this competition nor the U disk may contain team or contestant information. Where a competition document requires team or contestant information, use the station number instead;

3. The U disk submitted for this competition may contain only the competition result files; no files unrelated to the competition results may appear on it;

4. Do not change the competition environment without authorization (including forcibly shutting down the competition server). Consequences caused by unauthorized changes to the competition environment shall be borne by the contestants themselves. Contestants who maliciously damage the competition environment shall be dealt with in accordance with the competition rules;

5. Contestants must save the documents to be submitted (Word and Excel documents) promptly during the competition. If content is missing or cannot be saved because a document was not saved in time, the contestants shall bear the consequences themselves;

6. Environment construction and system deployment shall be carried out in strict accordance with "A1-Environment Construction and System Deployment Requirements". If problems are caused by modifying parameters or destroying the installation environment without authorization, the contestants shall bear the consequences themselves;

7. During unit testing, do not delete or modify the JDK or its path settings in the system, do not delete the projects already built in Eclipse or the related jar packages already installed, and do not modify the default Workspace directory. If any of the above is done without authorization and the program consequently cannot be compiled or run, the contestants shall bear the consequences themselves;

8. Automated testing scripts must be written in strict accordance with "A10-BS Asset Management System Automated Testing Requirements". If a contestant writes infinite loops or environment-damaging scripts without authorization, causing system crashes or software problems, the contestant shall bear the consequences;

9. For performance testing, set the number of concurrent users and the execution time in strict accordance with "A12-BS Asset Management System Performance Test Requirements"; contestants who increase the concurrency or prolong the execution time without authorization shall bear the consequences themselves;

10. During the performance test, contestants may reset the database or restart the Tomcat service as needed. Resetting the database restores the data to its initial state at the start of the competition, and the contestants bear the consequences themselves (JMeter and LoadRunner are strictly prohibited from accessing the performance-test addresses for resetting the database and restarting the Tomcat service). Database resets and Tomcat restarts are determined from the server log records. No extra time (including time spent with on-site technical support) is allotted for resetting the database or restarting the Tomcat service; it is included in the competition time;

11. After the competition, do not turn off the competition equipment. Data loss or other consequences caused by contestants turning off the competition equipment shall be borne by the contestants themselves;

12. If any problem arises during the competition, raise your hand promptly to notify the on-site referee, so as not to affect other contestants.

3. Task Instructions

(1) Competition environment

The competition environment is composed of server A, server B, client 1, client 2, client 3, client 4 and mobile phones.

Server deployment instructions: the competition platform (for downloading and uploading documents), the system under test for functional testing and the system under test for automated testing are deployed on server A; the system under test for performance testing is deployed on server B.

Client deployment instructions: client 1 has VirtualBox, Postman and other environments installed; client 2 has Eclipse and other environments installed; client 3 has PyCharm and other environments installed; client 4 has LoadRunner, JMeter and other environments installed. All clients have WPS, an input method and a browser installed. The Assets App is installed on the mobile phone, which is connected to client 1 via a USB data cable and connected to the wireless router.

Description of client access restrictions: clients 1 and 2 can access only the competition platform and the system under test for functional testing; client 3 can access only the system under test for automated testing; client 4 can access only the system under test for performance testing; the mobile phone can access only the system under test for functional testing (App side).

Instructions for using the client computers: competition-related documents can be downloaded and uploaded via the competition platform on clients 1 and 2; task 1 is performed on client 1; task 2 is performed on client 2; task 3 may be performed on any of clients 1, 2, 3 and 4; task 4 is performed on clients 1, 2 and the mobile phone; task 5 is performed on client 3; task 6 is performed on client 4; task 7 is performed on client 1.

Description of the systems under test corresponding to each task: tasks 1, 2 and 3 do not require the support of a system under test; task 4 is completed using the system under test for functional testing; task 5 is completed using the system under test for automated testing; task 6 is completed using the system under test for performance testing; task 7 is completed using the system under test for functional testing.

(2) Competition task documents

All of the following documents are downloaded from the competition platform:

1. A1-Environment Construction and System Deployment Requirements.doc
2. A2-Environment Construction and System Deployment Report Template.doc
3. A3-Unit Test Requirements.doc
4. A4-Unit Test Report Template.doc
5. A5-Test Plan Template.doc
6. A6-Test Summary Report Template.doc
7. A7-BS Asset Management System Requirements Manual.doc
8. A8-Functional Test Case Template.xls
9. A9-Functional Test Bug Defect Report Checklist Template.xls
10. A10-BS Asset Management System Automated Testing Requirements.doc
11. A11-Automated Test Report Template.doc
12. A12-BS Asset Management System Performance Test Requirements.doc
13. A13-Performance Test Report Template.doc
14. A14-BS Asset Management System Interface Test Requirements.doc
15. A15-Interface Test Report Template.doc

(3) Task composition

Task 1: Environment construction and system deployment (5 points)

1. Task description

According to the "A1-Environment Construction and System Deployment Requirements" document, complete the construction and configuration of test environments such as JDK, MySQL, Tomcat, and install and deploy application systems, and finally access the system successfully through a browser. Screenshots of the process and results are required. According to "A2-Environment Construction and System Deployment Report Template", complete the environment construction and system deployment report document.

2. Task requirements

(1) Environment construction and system deployment report documents should include the following:

1) JDK-related screenshots;

2) MySQL-related screenshots;

3) Tomcat-related screenshots;

4) Screenshots related to the application system.

(2) Environment construction and system deployment requirements:

Use the VirtualBox provided on client 1 to complete the environment setup and system deployment.

3. Task results

XX-A2-Environment Construction and System Deployment Report.doc (XX represents the station number)

Task 2: Unit testing (10 points)

1. Task description

According to the "A3-Unit Test Requirements" document, write Java applications, design test data, write unit test scripts, use Eclipse to complete compilation and program operation, and take screenshots of the operation results. Follow the "A4-Unit Test Report Template" to complete the unit test report document.

2. Task requirements

(1) The unit test report document should include the following:

1) Program source code;

2) Test data and test method code;

3) Screenshot of unit test results.

Note: 1. The number of test data sets must be the minimum that meets the test requirements. 2. During unit testing, after starting Eclipse, use the default Workspace (it cannot be changed); new classes must be created under GsTest-src-GsCode and the relevant code design completed there (JUnit, hamcrest-core, hamcrest-library and other jar packages have already been imported into Referenced Libraries; if a contestant deletes these jar packages, the contestant bears the consequences).

(2) Unit test requirements:

Use the Eclipse environment provided on client 2 to complete the unit test.

3. Task results

XX-A4-unit test report.doc (XX represents the station number)

Task 3: Design test documentation (15 points)

1. Test plan (7.5 points)

(1) Task description

Analyze according to the overall test requirements, divide and define the test scope, decompose the test tasks, and estimate the test risk, test workload and test progress for functional test, automated test, performance test and interface test tasks. Follow the "A5-Test Plan Template" to complete the test plan document.

(2) Task requirements

Test plan documentation should include but not be limited to the following:

1) Test overview: project background, writing purpose;

2) Test task: test purpose, test reference document, test scope;

3) Test resources: software configuration, hardware configuration, human resource allocation;

4) Test plan: overall test schedule plan, functional test plan, automated test plan, performance test plan, interface test plan;

5) Release standards;

6) Related risks.

(3) Task results

XX-A5-Test Plan.doc (XX stands for station number)

2. Test summary report (7.5 points)

(1) Task description

According to the overall test situation, aiming at functional testing, automated testing, performance testing, and interface testing tasks, analyze the overall testing process and obtain the final overall testing results. Complete the test summary report document according to "A6-Test Summary Report Template".

(2) Task requirements

The test summary report document should include, but not be limited to, the following:

1) Test overview: project background, writing purpose;

2) Documentation of test results;

3) Test design: introduction to the design of functional test methods, introduction to the design of automated test methods, introduction to the design of performance test methods, and introduction to the design of interface test methods;

4) Test review: review of functional test process, review of automated test process, review of performance test process, review of interface test process;

5) Summary of use cases;

6) Bug summary;

7) Test conclusion.

(3) Task results

XX-A6-Test summary report.doc (XX stands for station number)

Task 4: Functional testing (25 points)

1. Design functional test cases (10 points)

(1) Task description

Conduct requirement analysis according to "A7-BS Asset Management System Requirements Specification", understand business functions, and design functional test cases. Complete the functional test case document according to "A8-Functional Test Case Template".

(2) Task requirements

Functional test case documentation should include the following:

1) Summarize the number of functional test cases by module;

2) Each functional test case should include the following items: test case number, function point, case description, precondition, input, execution steps, expected output, importance, and test case execution result.

(3) Task results

XX-A8-Functional test case.xls (XX stands for station number)

2. Execute functional test cases (15 points)

(1) Task description

According to the "A7-BS Asset Management System Requirements Specification" and functional test cases, perform functional testing, find bugs, record bugs and take screenshots of bugs. According to "A9-Functional Test Bug Defect Report Checklist Template" to complete the functional test bug defect report checklist document.

(2) Task requirements

1) The bug report checklist document should include the following:

① Summarize the number of bugs by module and bug severity;

② The bug defect report list should include the following items: defect number, system under test, role, module name, summary description, operation steps, expected result, actual result, defect severity, submitter (station number), and attachment description (screenshot).

2) Browser requirements for Web-side testing and App requirements for mobile-side testing:

① Use Google Chrome on clients 1 and 2 to perform the Web-side functional test (including interface testing);

② Use the pre-installed "Asset Management" App on the mobile phone provided for the competition to perform the mobile-side test (including interface testing).

(3) Task results

XX-A9-Functional Test Bug Defect Report List.xls (XX represents the station number)

Task 5: Automated testing (20 points)

1. Task description

According to the "A10-BS Asset Management System Automated Test Requirements" document, identify and locate page elements, write and execute automated test scripts, and paste the scripts in the automated test report. Complete the automated test report document according to "A11-Automated Test Report Template".

2. Task requirements

(1) The automated test report document should include the following:

①Introduction: purpose, term definition;

②Automated test script writing: scripts for the first question, scripts for the second question, scripts for the third question, and scripts for the fourth question.

(2) Requirements for automated testing tools:

Use PyCharm installed on client No. 3 as a tool for writing automated test scripts.

Note: If an error occurs while running an automated test script (wrong URL, element not found when locating, etc.), it is a script-writing error; please correct it yourself. When writing automated test scripts in PyCharm, single quotes, double quotes, brackets and dots must be entered as English (half-width) characters. When pasting an automated test script into the automated test report, it must match the script formatting in PyCharm; do not paste all the code on one line and do not introduce blank lines when pasting.
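The scripts themselves must follow "A10-BS Asset Management System Automated Testing Requirements", which is not reproduced here. Purely as a minimal sketch of the kind of element-location script this task asks for, assuming Selenium WebDriver is the library in use (the task book does not name the library) and using a hypothetical URL, element IDs and test data:

# Minimal illustrative sketch only. The URL, element IDs and test data below are
# hypothetical placeholders, not values taken from the competition documents.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()                       # start a Chrome browser session
driver.get("http://192.168.1.100:8080/login")     # hypothetical address of the system under test

# Locate page elements and enter test data (the element IDs are assumptions).
driver.find_element(By.ID, "username").send_keys("admin")
driver.find_element(By.ID, "password").send_keys("123456")
driver.find_element(By.ID, "loginBtn").click()

# Simple check that the expected page was reached, then close the browser.
assert "asset" in driver.title.lower()
driver.quit()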

3. Task results

XX-A11-Automated Test Report.doc (XX represents the station number)

Task 6: Performance testing (15 points)

1. Task description

According to the "A12-BS Asset Management System Performance Test Requirements" document, use performance test tools to add scripts, playback scripts, configure parameters, set scenarios, perform performance tests, and take screenshots of the test process and results. Complete the performance test report document according to "A13-Performance Test Report Template".

2. Task requirements

(1) The performance test report document shall include the following contents:

①Introduction: purpose, term definition;

②Test strategy: test method, use case design, test scenario;

③Performance test implementation process: performance test script design, performance test scenario design and scenario execution, performance test results;

④ Execution results.

(2) Requirements for performance testing tools.

Use LoadRunner and JMeter installed on client No. 4 as performance testing tools.

Note: 1. If recording failure, playback failure, script execution failure, a white screen, a 500 error, etc. occur during the performance test, these are usage or configuration errors of the performance testing tool; please debug them yourself. 2. When using LoadRunner for the performance test, if a "Security Warning" pops up after clicking Start Recording, click "Yes"; if a "Root Certificate Storage" prompt pops up after finishing recording, click "Yes". For specific illustrations, refer to "Summary of LoadRunner Known Common Problems", item VII.
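LoadRunner and JMeter are the mandated tools for this task, and the actual concurrency and execution time are fixed by "A12-BS Asset Management System Performance Test Requirements". Only to make the underlying idea concrete (a fixed number of concurrent virtual users issuing requests for a fixed duration while response times are collected), here is a rough Python sketch; the URL, concurrency and duration are invented placeholders and the sketch is not a substitute for the required tools:

# Conceptual sketch of a performance-test scenario, not a replacement for LoadRunner/JMeter.
# The URL, concurrency and duration below are hypothetical placeholders.
import time
import statistics
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://192.168.1.200:8080/asset/list"   # hypothetical address of the system under test
CONCURRENCY = 10                               # hypothetical number of virtual users
DURATION_S = 30                                # hypothetical execution time in seconds

def virtual_user():
    # One virtual user: send requests in a loop for DURATION_S seconds, recording response times.
    samples = []
    end = time.time() + DURATION_S
    while time.time() < end:
        start = time.time()
        try:
            urllib.request.urlopen(URL, timeout=10).read()
            samples.append(time.time() - start)
        except Exception:
            pass                               # failed requests are simply skipped in this sketch
    return samples

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    per_user = list(pool.map(lambda _: virtual_user(), range(CONCURRENCY)))

all_samples = [s for user in per_user for s in user]
if all_samples:
    print("requests completed:", len(all_samples))
    print("average response time (s):", round(statistics.mean(all_samples), 3))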

3. Task results

XX-A13-Performance Test Report.doc (XX represents the station number)

Task 7: Interface testing (5 points)

1. Task description

According to the "A14-BS Asset Management System Interface Test Requirements", use the interface test tool to send requests, variable settings, etc., and take screenshots of the test process and results. Complete the interface test report document according to "A15-Interface Test Report Template".

2. Task requirements

(1) The interface test report document shall include the following contents:

①Introduction: purpose, term definition;

② Interface test implementation process;

③Execution result.

(2) Interface test tool requirements:

Use Postman installed on client 1 as the interface testing tool.
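Postman is the mandated tool here, and the actual endpoints, parameters and expected results come from "A14-BS Asset Management System Interface Test Requirements". As a rough, language-level sketch of what an interface test does (send a request, capture a response value into a variable, reuse it, and check the result), the following Python equivalent uses only hypothetical URLs, fields and values:

# Illustrative only: in the competition the interface test is performed in Postman.
# The URLs, JSON fields and expected values below are hypothetical placeholders.
import json
import urllib.request

BASE = "http://192.168.1.100:8080/api"           # hypothetical interface base address

# 1. Send a login request and capture a value from the response (like a Postman variable).
login_req = urllib.request.Request(
    BASE + "/login",
    data=json.dumps({"username": "admin", "password": "123456"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(login_req, timeout=10) as resp:
    assert resp.status == 200                    # expected HTTP status code
    token = json.loads(resp.read())["token"]     # assumed field in the response body

# 2. Reuse the captured value in a follow-up request and check the response.
list_req = urllib.request.Request(BASE + "/assets", headers={"Authorization": token})
with urllib.request.urlopen(list_req, timeout=10) as resp:
    assert resp.status == 200
    assets = json.loads(resp.read())
    print("number of assets returned:", len(assets))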

3. Task results

XX-A15-Interface Test Report.doc (XX stands for station number)

4. Submission of competition results

1. Submission method

The task result documents must be submitted both on the competition platform and on the U disk (no document may be submitted in the form of a compressed package, either on the competition platform or on the U disk). Check against the contest submission document checklist before submitting. On the U disk, create a folder named with the station number XX (for example, 01), and save all competition result files into this folder; contestants who do not name the folder as required bear the consequences themselves.

Note: Chrome is required to access the competition platform.

2. Documentation requirements

Team information and contestant information must not appear in any document submitted for the competition. Where a competition document requires team or contestant information, use the station number instead (for example, in 21_02, 21 represents the station number and 02 represents contestant No. 2).

3. Contest submission document checklist

All of the following files are submitted both on the competition platform and on the U disk (XX stands for the station number):

1. XX-A2-Environment Construction and System Deployment Report.doc
2. XX-A4-Unit Test Report.doc
3. XX-A5-Test Plan.doc
4. XX-A6-Test Summary Report.doc
5. XX-A8-Functional Test Case.xls
6. XX-A9-Functional Test Bug Defect Report List.xls
7. XX-A11-Automated Test Report.doc
8. XX-A13-Performance Test Report.doc
9. XX-A15-Interface Test Report.doc
