Rahul Nema

Rahul Nema is an Architect within the IBM Cloud organization, working on solutions and products that enhance the IBM Hybrid Cloud Platform. He is currently engaged in improving Cloud Pak and OpenShift Container Platform (OCP) deployments on Cloud Pak System, a cloud-in-a-box offering that delivers OCP ready to consume. He has close to 15 years of experience in product development, DevOps, and architecture, and has implemented cloud solutions for multiple customers.

Topic: Adaptive Test Automation in the Agile Test Cycle

In an Agile development team, the code churn rate is very high, and product-owner feedback changes the design and functionality iteratively. Maintaining functional and integration tests is therefore fragile. Even with CI/CD pipelines, adapting the tests to a new function request is often harder than adding the function itself, because the test system must verify that the function is robust and does not break the existing automation. The test automation team is thus always playing catch-up with development. Development teams often work around this by keeping the automation team in sync with their work, but that slows the development cycle and makes it less productive.

In this talk, we will cover how test automation can keep pace with an Agile development team that is churning out constantly evolving code. Our automation system takes advantage of the latest tools to interpret each change, adapt to it, update the test report, and analyse the extent of the change before CI/CD picks it up for propagation through the various levels. It provides proactive insights by examining tags and keywords inserted by the development team, helping the test engineer understand the degree and areas of change in depth. Once the code is in staging, the test automation runs, flags breaking points, and either recommends changes so the automation engineer can adapt the flow or alerts the development team to fixes required in their code. It also performs static analysis of the automation code and visually highlights, for the entire team, how the development changes will impact the existing automation bucket.
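As one illustration of the tag-and-keyword insight described above, the following sketch scans the added lines of a unified diff for developer-inserted impact annotations and summarizes the change hot spots for the test engineer. The `@impact` tag convention and the diff content are hypothetical, used only to show the idea; they are not the actual system's format.

```python
import re
from collections import Counter

# Hypothetical convention: developers annotate changed code with
# comments like "# @impact: login-flow, severity=high".
IMPACT_TAG = re.compile(
    r"#\s*@impact:\s*(?P<area>[\w-]+)(?:,\s*severity=(?P<sev>\w+))?"
)

def scan_diff_for_impact(diff_text: str) -> list[dict]:
    """Collect impact annotations from the added lines of a unified diff."""
    hits = []
    for line in diff_text.splitlines():
        if not line.startswith("+"):  # only inspect newly added lines
            continue
        m = IMPACT_TAG.search(line)
        if m:
            hits.append({"area": m.group("area"),
                         "severity": m.group("sev") or "unknown"})
    return hits

def summarize(hits: list[dict]) -> Counter:
    """Count touched areas so the test engineer sees change hot spots."""
    return Counter(h["area"] for h in hits)

# Illustrative diff fragment with three annotations.
diff = """\
+    # @impact: login-flow, severity=high
+    def login(user): ...
+    # @impact: login-flow
+    # @impact: billing, severity=low
"""
print(summarize(scan_diff_for_impact(diff)))
# → Counter({'login-flow': 2, 'billing': 1})
```

A real system would feed this summary into the test report so the degree and areas of change are visible before CI/CD propagates the build.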

We use the following principles in our automation development:
1. With every check-in, identify the code updates and suggest their impact on existing automation scripts.
2. If any changes to the test environment are needed, identify and apply them.
3. Identify the changes required in the test data and provide the details needed to update it.
4. Generate a traceability matrix showing whether test cases/test suites exist for the updated code.
5. If a defect is fixed in a code check-in, generate a test script for defect verification.
6. Identify the areas where newly introduced code is likely to break test automation, and raise an alert for the affected feature(s).
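A minimal sketch of principle 1, assuming a hand-maintained mapping from automation scripts to the source files they exercise. All file names and the mapping itself are illustrative, not taken from the actual system:

```python
# On each check-in, map the changed source files to the automation
# scripts that exercise them, so the team sees which tests need review
# before CI/CD promotes the build.

# Assumed dependency map, maintained alongside the code (illustrative).
SCRIPT_DEPENDENCIES = {
    "tests/test_login.py":   {"src/auth.py", "src/session.py"},
    "tests/test_billing.py": {"src/billing.py"},
    "tests/test_profile.py": {"src/auth.py", "src/profile.py"},
}

def impacted_scripts(changed_files: set[str]) -> dict[str, set[str]]:
    """Return each automation script whose dependencies overlap the change set."""
    report = {}
    for script, deps in SCRIPT_DEPENDENCIES.items():
        overlap = deps & changed_files
        if overlap:
            report[script] = overlap
    return report

checkin = {"src/auth.py"}
print(impacted_scripts(checkin))
# flags tests/test_login.py and tests/test_profile.py
```

In practice the dependency map would be derived automatically (e.g. from coverage data or static analysis) rather than maintained by hand, but the check-in-time lookup works the same way.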

The full paper will cover the general direction for building such an end-to-end system, snapshots of our design principles, and a quick overview of the functional use cases. We also plan to show a full flow chart of how the system was built, the pitfalls we encountered, and how we successfully stood up a system using AI/ML workflows that adapts and works to enhance automated validations.
