Building a successful regression checking strategy (or an automated regression testing strategy, as it's more commonly known) is hard, but it doesn't have to be. To avoid the common mistakes of brittle checks, poor tool choices and wasted time fixing broken checks, we need to take a step back and ask ourselves why we are running these checks and what risks we are mitigating.
In this masterclass, Mark Winteringham shares his experience of writing an effective checking strategy that can answer these questions and, in turn, help us deliver a series of useful checking frameworks that assist our teams in a meaningful way.
Come to see what has been achieved after 18 months of work, and discuss the results:
- Why it's important to build your own automation model rather than rely on others.
- How to let project and product risk, rather than tools, guide what to automate.
- How to implement your strategy as a team rather than as individuals.
Mark is a freelance technical tester, testing coach and international speaker, presenting workshops and talks on technical testing techniques. He has worked on award-winning projects across a wide variety of technology sectors, from broadcast and digital to financial and public sector, working with various web, mobile and desktop technologies.
Mark is an expert in technical testing and test automation and is a passionate advocate of risk-based automation and automation-in-testing practices, which he regularly blogs about at mwtestconsultancy.co.uk. He is also the co-founder of the Software Testing Clinic in London, a regular workshop offering new and junior testers free mentoring and lessons in software testing.
Mark also has a keen interest in various technologies, regularly developing new apps and Internet of Things devices. You can get in touch with Mark on Twitter: @2bittester