
Saying Goodbye To Those Flaky UI Tests – The 'Head & Shoulders' Way

Steve Watson shares his experience of dealing with flaky UI tests using his 'Head & Shoulders' approach.

One of the struggles with trying to automate UI tests is identifying the field elements. Names are inconsistent: sometimes using a CSS selector works, other times it's XPath, and it creates a real muddle. It makes automated tests flaky, and you can find yourself spending as much time fixing or updating tests as writing new ones.

We faced this problem with one of our customer-facing websites and decided to do something about it. This article will take you through the problem, the approach we took to solve it, and how it has benefited our whole team.

The problem - flaky Selenium tests which fail when they are unable to find elements

Selenium is a great open-source tool for testing websites, and we have been using it with C# for some time as it matches our testers' skill sets. However, one of the problems we face (as I'm sure you do too) is flakiness in finding web elements. Something we very rarely consider when writing a customer-facing website is how we are going to add automated tests and how they will identify the individual elements.
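To illustrate the kind of brittleness involved, here is a minimal C# sketch; the page URL, selectors and ID are invented for this example rather than taken from our codebase:

```csharp
// Hypothetical example: locating the same "Add client" button two ways.
// The XPath depends on the exact DOM structure and breaks as soon as a new
// field or container appears; the ID-based locator does not.
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class LocatorExample
{
    static void Main()
    {
        IWebDriver driver = new ChromeDriver();
        driver.Navigate().GoToUrl("https://example.com/clients");

        // Brittle: tied to the page layout at the time the test was written.
        IWebElement byXPath = driver.FindElement(
            By.XPath("//div[2]/form/div[3]/button[contains(text(),'Add client')]"));

        // Stable: survives layout changes as long as the id is kept.
        IWebElement byId = driver.FindElement(By.Id("clientsPageAddClientButton"));

        driver.Quit();
    }
}
```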

Tests end up brittle, and it doesn't take a lot to make them break - the addition of a new field the test wasn't expecting to find, the renaming or removal of a field, a change in the process flow and so on. It's difficult to move forward on new features or tests when so much time is spent going back and fixing broken ones, and in one particular team we found the testers spending a lot of their time on this kind of maintenance.

My initial thought - ask the audience!

My role is to look after our test approach and make sure we follow good testing practices. This includes how we approach automated testing, so the fact that testers were spending so much time fixing flaky and failing tests instead of adding new tests for new functionality was something I wanted to find a solution to.

If you have a flaky scalp you need the right shampoo, such as 'Head & Shoulders', to fix it - and you need the right solution to fix flaky tests!

Initially I thought that the problem was that we hadn't explained to the developers how the automation worked, so they were unaware of what could break in the automation framework when making changes. The testers were also unaware of upcoming front-end changes and their potential impact until the tests failed.

So, I had the bright idea of asking the audience on MoT, and posted a question asking how we could create a mapping between a web element and the automated test relating to it, as I thought this was possibly the solution we needed - after all, wasn't the problem one of mapping? If we solved that, the testers would be made aware of upcoming changes. Simple!

I was aware that I had written a rather clumsy question, offering a solution to a problem rather than asking what options were available, so I was really pleased that people still responded to my post! Even better, they saw past my 'solution' and made some really great suggestions of their own - have developers write the tests themselves, push the tests to a lower level, add automation that 'spits out the url/selector pairs', and use QA Selectors. Reading the blog on QA Selectors (see the link below) made me think that this wasn't just a mapping issue; it was more fundamental than that and needed to be tackled in a different way.

Great minds think alike - let's add unique IDs to each web element

While I was thinking about this and reading through the MoT replies, our front-end developer and a newly appointed tech lead were also thinking about ways to tackle the problem. Drawing on their previous experience, they came up with a draft approach and invited me and the testers within that team to discuss the plan.

The goal was to provide unique IDs for automated tests without oversaturating the codebase with IDs, and these were the overarching rules:

All tables to have unique IDs

All interactive elements such as links, buttons, checkboxes, input fields etc. to have unique IDs (or their parent container to have one if the element itself cannot)

All major containers (divs, sections etc.) that contain significant portions/features of the site to have unique IDs

All IDs added for automated tests to have unit tests that protect them from future refactoring

Every page in the front end should have these IDs added, including error pages, where appropriate.

Each element was identified with a unique ID, as per this example:

[Image: a form from Octopus Investments showing a table of elements and their IDs]

Each element type is given a unique ID following a defined naming convention, and they added some helpful examples for cases where there would be multiple data rows - e.g. where there is a list of clients, use clientsPageClient(n). If there were 4 clients, you would have clientsPageClient0, clientsPageClient1, clientsPageClient2 and clientsPageClient3, allowing the automated tests to identify each client individually.
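As a rough illustration of how the convention plays out in a test, here is a minimal C# sketch; the page URL and the loop are invented for this example and are not taken from our framework:

```csharp
// Hypothetical sketch: walking the client rows via the indexed IDs.
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class ClientsPageExample
{
    static void Main()
    {
        IWebDriver driver = new ChromeDriver();
        driver.Navigate().GoToUrl("https://example.com/clients");

        // clientsPageClient0, clientsPageClient1, ... follow the naming convention.
        int index = 0;
        while (true)
        {
            var clients = driver.FindElements(By.Id($"clientsPageClient{index}"));
            if (clients.Count == 0) break;              // no more rows on the page
            System.Console.WriteLine(clients[0].Text);  // e.g. assert on each client here
            index++;
        }

        driver.Quit();
    }
}
```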

Interestingly, it matched Viv Richards' ideas about using IDs - it is logical, easy to follow, and means that each front-end developer knows to a) follow the format and b) inform the testers if anything is to be added, changed or removed, so that they can make the necessary changes to the Selenium tests.

The addition of unit tests to protect against refactoring gave us an extra layer of safety. We were less likely to have IDs changed unexpectedly, and we could have confidence in the IDs used within the Selenium framework.

The implementation - developers made the code changes, testers updated the mappings in Selenium

We had a conversation about the benefits of this approach with the front-end developer, tech lead and automation tester on the product team, and it didn't take us long to approve it as it made good sense.

Almost immediately, the front-end developers added tasks to the backlog to implement the changes, and the work was completed within a couple of sprints. It was a fairly quick turnaround as the planning had already been done up front. All that was needed was to follow the guidelines on each of the pages.

The next step was for the tester to update the mappings within Selenium to the new IDs (as we were using the Page Object Model, this was straightforward, since each element only needed to be updated in one place), build the solution and run the tests.
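For readers unfamiliar with the pattern, here is a minimal Page Object sketch in C# showing why the update only has to happen in one place; the class, method and ID names are hypothetical:

```csharp
// Hypothetical Page Object: all locators for the clients page live here,
// so switching to the new IDs means editing this class only.
using OpenQA.Selenium;

public class ClientsPage
{
    private readonly IWebDriver _driver;

    public ClientsPage(IWebDriver driver) => _driver = driver;

    // Locators defined once, using the agreed ID naming convention.
    private IWebElement AddClientButton => _driver.FindElement(By.Id("clientsPageAddClientButton"));
    private IWebElement Client(int n)   => _driver.FindElement(By.Id($"clientsPageClient{n}"));

    // Tests call these members rather than locating elements themselves.
    public void AddClient() => AddClientButton.Click();
    public string ClientName(int n) => Client(n).Text;
}
```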

Summary:

The automated tests are now a lot more stable, and the issues we had with tests failing because they couldn't find elements have all but ceased. That's not to say we don't have other challenges, but this has certainly saved a lot of time fixing these sorts of issues. Our plan now is to use this model in the rewrite of another front-end application and bake in the IDs from the start, which will give us an immediate advantage when we start to write our automated tests.

Resources 

https://vivrichards.co.uk/automation/qa-selectors-what-are-they-why-should-I-care

https://club.ministryoftesting.com/t/re-re-does-anyone-have-a-way-to-link-ui-code-changes-to-the-tests-in-the-automation-pack/58328

 

Steve Watson

Senior Quality Engineering Manager

I am currently Senior Quality Engineering Manager at easyJet, responsible for the testing approach across our Operations products. I have been in testing for a number of years, working in a diverse range of industries, and have written articles and blogs, spoken at many conferences, and hosted a discussion on robotics and testing during lockdown. Outside of work, I present a weekly Saturday morning show on a local radio station in Sussex.


