How often do developers complain about bugs that arrive without sufficient technical information? Bugs that describe only the end result of an issue, with no deeper analysis of the behavior? How often does the dev team spend hours troubleshooting, digging through logs, and trying to reproduce a defect? How often do Quality Assurance engineers spend hours setting up test data just to validate a simple scenario? What if we could detect bugs during the development phase and save the company the time lost moving tasks back and forth between developers and QA?
As part of the development team, EUT takes on the role of a buffer zone between developers and QA, so that each team can deliver its best results. In parallel with development, EUT automates tests, executes them within minutes, and detects bugs at an early phase. Later, EUT communicates and collaborates with QA to troubleshoot any bugs detected by the QA team: it performs a deep analysis of each reported bug, attaches the relevant requests and logs, and then forwards the bug to the appropriate developer (backend, UI, or database). The benefits include:
- Around 60% of bugs detected at an early phase, on the dev environment, resulting in less development time per task.
- Less DEV time spent on bug analysis and log troubleshooting.
- Automated test data creation.
- Automated smoke and regression testing with the use of scheduled Jenkins jobs.
- Reports generated after each test execution.
How does the EUT process work?
1. General steps
- At the start of the project, EUT analyzes the business requirements, flow diagrams, and legacy UI applications.
- EUT maps the system layers and their relationships in order to do integration testing properly.
- On the day of sprint grooming, EUT analyzes the requirements in the user story and raises any open questions with the BA and DEV. All open questions are resolved before the day of sprint planning.
- EUT creates sub-tasks containing time estimates for task analysis, test case design, and test automation, so that the task timeframe can be predicted more easily.
- On the day of sprint planning, EUT gives an estimate in the form of story points.
2. Functional testing
- While development is in progress, EUT works with the developer to obtain the API specification, including:
– Request payload
– Required parameters
– DEV env. endpoint
– API authorization info
- Based on the API specification and business requirements, EUT prepares the test cases in either:
– a Jira Test-type task, or
– a document spreadsheet (if previously agreed)
- Based on the prepared test cases, EUT develops automated test scripts. When doing automation, EUT's focus is primarily on API/component/unit testing.
- Based on business analysis, EUT can generate valid test data and automate the test data creation process. Furthermore, EUT can execute DB queries to collect any other information needed to build valid test data.
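As a rough sketch of what automated test data creation can look like, the snippet below builds a valid request payload and pulls one reference value from a database (here an in-memory SQLite stand-in for the real DB query step); the table, fields, and values are hypothetical, not part of any specific EUT project:

```python
import random
import sqlite3
import string

def random_email() -> str:
    """Build a random but syntactically valid email address."""
    name = "".join(random.choices(string.ascii_lowercase, k=8))
    return f"{name}@example.com"

def build_user_payload(db_path: str = ":memory:") -> dict:
    """Assemble a valid 'create user' request body.

    The country code is read from a reference table in the database,
    mirroring how a DB query can complete otherwise random test data.
    """
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS countries (code TEXT)")
    conn.execute("INSERT INTO countries (code) VALUES ('MK')")
    code = conn.execute("SELECT code FROM countries LIMIT 1").fetchone()[0]
    conn.close()
    return {
        "username": "".join(random.choices(string.ascii_lowercase, k=10)),
        "email": random_email(),
        "country": code,
    }
```

Generators like this can be called from every test that needs a fresh, valid entity, instead of maintaining hand-crafted fixtures.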
- Once the user story is completed by DEV, EUT executes the automated test scripts for the API and validates the implementation on the dev environment.
- In case of any issue, a Jira Bug is created and assigned to the appropriate developer. To solve one of the common problems (bugs lacking technical detail), EUT provides the following details inside each bug:
– EUT label
– Steps to reproduce / the scenario being tested
– Endpoint of the API that is tested
– Example request
– Example response for the detected bug
– Log details
- Once EUT has finished testing and all detected bugs are closed, the user story can be assigned to QA.
- In case of any QA bugs, EUT works with QA to triage the bug and provide as many details as possible to the developer, solving another common problem. EUT reproduces the bug on the DEV environment and collects the API details DEV needs in order to debug.
- In case a QA bug is not reproducible, EUT attaches the supporting evidence and re-assigns the bug back to QA.
- Upon request, EUT can perform automated end-to-end validation by re-using the already developed tests for each unit.
- All automated tests are committed to a git repository, shared with the rest of the team members.
- All automated tests are run by scheduled Jenkins jobs.
- Automated reports for smoke/regression testing are sent as a result of the scheduled Jenkins jobs.
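To illustrate the kind of automated API script described above, here is a minimal sketch in Python using only the standard library. The `/users` endpoint, the `username` required parameter, and the stub server standing in for the DEV-environment API are all hypothetical:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubApi(BaseHTTPRequestHandler):
    """A local stand-in for the DEV-environment API under test."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        if "username" not in body:          # a required parameter is missing
            self.send_response(400)
            self.end_headers()
            return
        payload = json.dumps({"id": 1, "username": body["username"]}).encode()
        self.send_response(201)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):           # keep test output clean
        pass

def post_json(url: str, body: dict) -> tuple[int, dict]:
    """Send a JSON POST request and return (status code, parsed body)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status, json.loads(resp.read())

def test_create_user():
    """Validate the happy path of the hypothetical 'create user' API."""
    server = HTTPServer(("127.0.0.1", 0), StubApi)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/users"
    status, body = post_json(url, {"username": "eut-demo"})
    assert status == 201
    assert body["username"] == "eut-demo"
    server.shutdown()
```

In a real setup the stub server would be replaced by the DEV endpoint from the API specification, and tests like `test_create_user` would be the ones committed to git and run by the scheduled Jenkins jobs.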
3. Performance testing
- Upon request, or based on the Jira task, EUT performs API performance testing by modifying the automated functional scripts.
- Performance tasks should include non-functional requirements so that EUT can prepare the scripts appropriately.
- Ideally, performance testing should be executed on a PERF environment (if available).
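One way to turn a functional script into a performance check, sketched here under the assumption of a Python test suite, is to wrap the existing API call in a timing loop. The helper below and the 200 ms threshold in the comment are illustrative, not part of any specific toolchain:

```python
import statistics
import time

def measure_latency(call, iterations: int = 50) -> dict:
    """Invoke an API call repeatedly and summarise its latency in milliseconds."""
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        call()
        samples.append((time.perf_counter() - start) * 1000)
    samples.sort()
    return {
        "min_ms": samples[0],
        "median_ms": statistics.median(samples),
        "p95_ms": samples[int(len(samples) * 0.95) - 1],
        "max_ms": samples[-1],
    }

# Reuse a functional script's API call as the workload, for example:
#   stats = measure_latency(lambda: post_json(url, {"username": "perf"}))
#   assert stats["p95_ms"] < 200   # non-functional requirement from the task
```

The assertion threshold is exactly where the non-functional requirements from the Jira task would be encoded, which is why those requirements need to be stated up front.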
Having outlined the entire EUT process and the steps it includes, we can say that EUT solves many of the problems that appear with other models of testing. In other words, it covers many aspects that other testing models fail to include, and as such we strongly recommend it.