Is Acceptance Test Driven Development (ATDD) Worth the Effort?

Got a few spare minutes? Come and read the best of The Testing Planet archive from Ministry of Testing

By Peter Karas

I’ve been implementing ATDD frameworks for a few years now for a number of different clients. The results of the work have varied, and before starting on any new implementation I thought it was about time I did a personal retrospective and asked myself: has all the work been worth it?

Sorry to ask, but what exactly is ATDD?

Good question! There is a lot of theory written about ATDD, justifying its use with reference to agile principles. However, sometimes it’s just difficult to ‘get’ it. I found it easier to understand ATDD by doing it, or, the next best thing, by reading about the details of how someone else does it.

What I mean by ATDD in this context is fully automated user stories, defined up front by the business and the development team (testers and developers) working together. These acceptance tests then exercise functions of the product, either through Selenium WebDriver or through lower-level HTTP/TCP calls to services. A framework like Cucumber or Concordion ‘glues’ the words of the acceptance test to the messages being sent to the service or UI (mouse clicks, HTTP calls, etc.). Because the business has collaborated with the development team to define the acceptance criteria for the user story (the requirement), when the test passes the feature is deemed functionally acceptable to the business.
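To make that ‘gluing’ concrete, here is a minimal sketch in plain Python, rather than a real framework, of how a tool like Cucumber binds acceptance-test wording to executable code: step definitions are registered against regular expressions, and a runner matches each line of a scenario to its handler. The basket domain, step wording, and function names are all invented for illustration.

```python
import re

STEPS = []  # registry of (compiled pattern, handler) pairs

def step(pattern):
    """Register a handler for a line of Gherkin-style text."""
    def decorator(fn):
        STEPS.append((re.compile(pattern), fn))
        return fn
    return decorator

class Basket:
    def __init__(self):
        self.items = []

@step(r'^Given an empty basket$')
def given_empty_basket(ctx):
    ctx['basket'] = Basket()

@step(r'^When I add (\d+) "(.+)" to the basket$')
def when_add_items(ctx, qty, name):
    ctx['basket'].items += [name] * int(qty)

@step(r'^Then the basket contains (\d+) items?$')
def then_basket_count(ctx, expected):
    assert len(ctx['basket'].items) == int(expected)

def run_scenario(lines):
    """Match each scenario line to a step definition and execute it."""
    ctx = {}
    for line in lines:
        for pattern, fn in STEPS:
            match = pattern.match(line)
            if match:
                fn(ctx, *match.groups())
                break
        else:
            raise RuntimeError('No step definition for: ' + line)
    return ctx

run_scenario([
    'Given an empty basket',
    'When I add 2 "apples" to the basket',
    'Then the basket contains 2 items',
])
```

A real framework adds far more (reporting, hooks, data tables), but the core idea is exactly this mapping from business-readable sentences to code that drives the product.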

Sounds like hard work, why do it?

Yes, implementing automated ATDD is a lot of work, some of it quite technical, and it requires effort from everyone on the development team. So that I don’t feel like giving up before I start, I’m going to remind myself why I’m doing it. Here’s what I believe are the main benefits of adopting ATDD on a project:

  • Better communication between the business and development teams.
  • Improved quality. Fewer defects on new features and fewer regression issues.
  • Increased developer productivity as a result of a more focused approach.

If we get it right, the payback is huge. Well worth the initial effort.

The results of my retrospection

Thinking about my work over the last few years in the context of these objectives allowed me to identify what I had been doing well, and also some areas where I could improve.

Payback is quick

I used to worry a lot, and still do, about the business not being fully engaged with the writing of the tests. That’s the point of acceptance tests, right? However, I often find myself using the specifications to document agreements from workshops and meetings, and as part of continuous feedback from the business. On one large CRM system I worked on, business users rarely gave me feedback directly on user stories.

After some reflection, I realised that the ATDD approach is still worthwhile, even if it is only used within the development team. As a developer, I find specifying the requirement and then having that as my goal an extremely efficient way to work. I remain focused during the implementation of features and can clearly state what is within the boundaries of the specification and what aspects will require the most effort. In order to write the user story and acceptance test, I need to communicate well with the business. In time I hope they will engage more in the specification process, but right now talking and email are just as good.

Start like you mean to finish

Working on a data visualisation project, I developed a set of WebDriver tests to crawl the user interface and click on the many possible combinations of links that made it up. This was great, as it exposed defects in the application and data that might not have been identified without tooled assistance. However, because the tests were not part of an automated release process, they were never executed beyond their system-test lifespan, and so their value was limited.
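The crawling approach can be sketched as a breadth-first traversal of the links reachable from a starting page. In a real implementation, `get_links` would be backed by Selenium WebDriver (loading the page and collecting its anchor elements); here it is a stand-in function, and all names are assumptions for the example.

```python
from collections import deque

def crawl(start_url, get_links, max_pages=1000):
    """Breadth-first crawl of the UI's link graph.

    Returns every URL reached plus any URL whose links could not be
    fetched -- a candidate defect worth investigating.
    """
    seen, broken = {start_url}, []
    queue = deque([start_url])
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            links = get_links(url)  # would be WebDriver calls in practice
        except Exception:
            broken.append(url)
            continue
        for link in links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen, broken

# Usage with a stubbed site instead of a live browser:
site = {'/': ['/a', '/b'], '/a': ['/'], '/b': ['/c'], '/c': []}
visited, broken = crawl('/', lambda url: site[url])
```

Stubbing the link source like this also keeps the traversal logic itself testable without a browser, which is exactly the kind of test-for-test-code discipline that pays off later.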

So now, if I’m going to use an ATDD approach, I aim to use it from the outset (requirements definition) and make sure I automate my tests so they can be reused as regression checks. It often feels like a lot of effort up front, but I’m much more comfortable doing this now because I know it will lead to increased productivity in the short term (the payback is quick!).

Specification by example is best

I think specification by example is the simplest and most useful tool. Simply asking for example files for your tests to work with or mock-ups of data screens pushes the most pertinent questions back to the business. Often that one piece of data on the UI that can’t be specified exposes a whole requirement as problematic.

On a large billing system I worked on, we managed to avoid expensive development effort by not starting work on a payment channel until we had agreed example messages for each use case. I persisted in asking for actual files that could be read by a test harness.

Sounds sensible, but it was very tempting to start writing code. The agile approach encourages us to write adaptable software, and it would have been easy to write the bulk of the code and develop a stub interface made up of assumed data. However, it turned out that no payment gateway was able to supply us with complete example files, because there were fundamental legal and compliance issues with our approach. By insisting on specific examples, we helped the business to pivot on this point and saved the project a whole lot of money in the process.
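As a sketch of what specification by example looks like once you do have real files, the agreed examples become a table of concrete inputs and expected results that the test harness checks directly. The payment-message field format and the `parse_amount` helper below are hypothetical, invented purely for illustration.

```python
# Concrete examples agreed with the business: each row pairs a raw
# message field with the amount (in pence) it is expected to encode.
EXAMPLES = [
    ('AMT:0000012999', 12999),
    ('AMT:0000000050', 50),
    ('AMT:0000000000', 0),
]

def parse_amount(field):
    """Parse the (hypothetical) amount field of a payment message."""
    if not field.startswith('AMT:'):
        raise ValueError('unexpected field: %r' % field)
    return int(field[4:])

def check_examples(examples):
    """Return the rows where parsing disagrees with the agreed example."""
    return [(raw, expected, parse_amount(raw))
            for raw, expected in examples
            if parse_amount(raw) != expected]

assert check_examples(EXAMPLES) == []
```

The value is less in the code than in the table: every row is a question the business has already answered, and any row they cannot fill in is a gap in the requirement.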

Be nice to your test harness

This is a strange truth: acceptance tests break if they sit on the shelf for too long. If you’ve got a whole load of broken tests that no one wants to run, they soon gain the reputation of being a pig to work with, and therefore too expensive to use.

The solution is to implement a continuous delivery approach right from your first feature (start like you mean to finish). Your first test should be repurposed as a regression check as soon as its job as an acceptance test is done. When tests fail, fix the issue immediately. Tinkering with and maintaining a harness over a long period is a much more cost-effective way to work than attempting to resurrect a set of stale acceptance tests at some point in the future. Ultimately you will start to treat your test code like production code. And yes, I have written tests for test code!
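As a small illustration of treating test code like production code, even a helper used only by the acceptance suite deserves its own checks. The `normalise_id` helper below is hypothetical, invented for this example: because the regression suite depends on it to compare identifiers reliably, a bug in it would silently weaken every test that uses it.

```python
def normalise_id(raw):
    """Strip whitespace and leading zeros so IDs compare reliably.

    A (hypothetical) helper shared across the acceptance suite.
    """
    stripped = raw.strip().lstrip('0')
    return stripped or '0'  # an all-zero ID normalises to '0'

# Checks on the helper itself, run alongside the acceptance suite --
# test code earning the same care as production code.
assert normalise_id(' 00042 ') == '42'
assert normalise_id('0') == '0'
assert normalise_id('000') == '0'
```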


I’ll always write acceptance tests now. Even on the smallest projects they pay back early, and having automation from the start just feels like a solid foundation for any project. Designing and developing a maintainable test framework is fun as well as challenging. Even if the business doesn’t engage fully at the start, using techniques like specification by example is an extremely powerful way to get to the heart of a requirement, and it could end up saving your organisation a whole load of money.

My learning matrix

Author Bio

Peter Karas is a software developer and has worked for consultancies servicing a variety of clients across public sector, telecoms, non-profit and finance sectors. He is interested in process improvement on software projects and how testing and automation can help improve team dynamics and software quality.
