
The Awesome Power Of The Debrief: Why Debriefing Is The Key To Successful Exploratory Testing

Callum Akehurst-Ryan explores the value of debriefs at the end of exploratory testing sessions

By Callum Akehurst-Ryan

Debriefing is a great skill to have as a tester. It allows you to share information with your teammates and unlocks the power of exploratory testing. It helps demystify the testing process by demonstrating the breadth and depth of the testing you do. I’ve had a number of developers surprised by the range of knowledge a tester draws on – psychology, marketing, coding, domain, design, business and more!

You can use debriefing to showcase your skills and let your teammates know that you can read and write code, pull apart an API, audit something for accessibility or debug an issue. In a world where testing shifts towards SDITs (Software Developer In Test), being able to showcase your technical skills in exploratory testing is vital. It demonstrates your value far beyond automating tests.

Debriefing also allows you to collaborate more with your colleagues. Sharing useful discoveries on a regular basis helps build better products, rather than testing being removed from the process and left until the end. By using debriefs to engage with your colleagues and teammates you’ll find they’ll come to you for opinions, help with debugging, finding system information and to learn more about testing.

What is a debrief?

A debrief is a discussion between the tester and, most likely, someone who worked on the item under test. This might be a developer, a designer, another tester, a product person or even the CEO – anyone who has a vested interest in your testing discoveries.

Following an exploratory testing activity you’ll have a wealth of information about the product or system under test. This might include things you found that were good, issues, questions and more test ideas.

Note: I’ll use the term “developer” as the majority target of the debrief throughout this post but, as mentioned above, this can be anyone who needs the information.

There are a number of different models that have been proposed for what a debrief discussion should cover, but generally a successful debrief shares the following information:

  • What have I covered in my testing? To discover what might’ve been missed.

  • What was good? To identify where quality is good and to promote this.

  • What issues have I found? To start a conversation about how to improve quality.

  • What questions do I have? To learn more about the system which may lead to more test ideas.

The debrief discussion is about encouraging conversation: it’s not about what is right and wrong, but about sharing what you’ve seen. By highlighting product behaviour discovered during testing – and subjecting the observations to critical analysis – you can help a developer make product decisions, such as fixing bugs or improving code design.

Ideally, hold a debrief as soon as you’ve completed a testing session to help shorten feedback loops. However, the developer might not be ready to talk at that moment, so you might have to wait. When it comes time to debrief, it’s good to have sufficient test notes to jog your memory and talk through.

Roadblocks to debriefing

Like other testing activities, debriefing comes with roadblocks to overcome. Be mindful of the following obstacles:

1. Developers don’t have time to engage

You might be asked to “just raise the bugs and I’ll pick them up”. This might be because the developer hasn’t seen the value of the debrief. They might have too much work in progress at the moment or struggle to switch tasks.

2. Developers have been burnt in the past

Your team may have had testers who acted as gatekeepers, delighting in telling developers what they’ve done wrong, or who weren’t able to separate their opinions from their observations. With that history comes a perception that testers “are not technical” and aren’t helping to improve the product.

3. The whole team doesn’t own quality

There’s a lack of engagement with issues and improvements. It’s common to hear:

“If it’s not exactly in the wording of the ticket then it’s out of scope”.

Frequently in these situations the developers have to check with the product owner or a senior developer whether a change is “okay to do”.

4. Not being co-located

Organising a phone call to talk through testing can seem like a big deal or a waste of your colleagues' time, so it sometimes appears easier and speedier to just raise the bug or send over only the issues.

5. Disheartened testers “don’t have a voice”

A tester might just pass acceptance criteria instead of trying to find more information. If you feel like you cannot raise suggestions or improvements, or that they just keep being pushed back, it’s much easier to focus on checking rather than exploring. In such a case it’s highly unlikely there’s anything to debrief.

6. The organisation doesn’t welcome an alternative approach to testing

The current approach to quality and managing changes might be about metrics or passed scripts. Or it could be that there’s a clear test/dev split and a waterfall process, so you can’t talk to the developer who worked on the product. In these circumstances there might be a reluctance to move towards what seems like a less planned and less scientific approach to reporting quality.

7. Too much information and misalignment

During a debrief a tester may attempt to provide too much information and overwhelm their audience. They may not get to the point or provide a useful summary of their discoveries. A tester might also not understand their audience and what information matters to them. If this happens they might ruin their chances of running another debrief.

How to remove roadblocks to debriefing

The roadblocks to debriefing might seem daunting, yet it doesn’t have to be that way. There are techniques you can use – in a new or established team – to help unblock your path.

Know when to back out and try again

Now might not be a good time to introduce debriefing, or it just might not be a good time of day to have a chat with a busy developer. It’s useful to raise in stand-ups that you’d like to debrief someone, or to send a message asking when would be a good time. Testing is a support activity and you need to be a trusted advisor. Respecting people’s busy times will help reiterate that you’re here to help.

Show the value by providing useful information

Testing can often be seen as a non-technical or unskilled job and as such people may not know what you have to offer or why a debrief is useful. Tailor the information you share during each debrief by focusing on what the team will be interested in, whether that’s a deep dive into the front end or into the technicalities of the code. Bring enough information to assist with a debugging activity. Ensure any found issues have been investigated to understand how to reproduce them and/or what the logs reveal.

Prove you’re on the same side by talking through the good

You can become a trusted advisor by showing developers you appreciate their work and pointing out good things.

"I’ve run debriefs where there were zero issues and I was just able to congratulate my colleague on some excellent work."

This helped build trust. It demonstrated I was trying to help and not “just here to find bugs”.

Having an opinion on everything is awesome, but you might have to tease them first

Similar to tailoring your information to be useful, you might want to tease at elements like accessibility or usability in a debrief and gradually build them into your debrief discussions. If you raise everything all at once then information might get lost in the mix and people may become overwhelmed and disengage from the debrief process. This doesn’t mean you shouldn’t try to find information out about these areas though, as they may be useful for future exploratory testing efforts.

Work at it

Teams need time to get used to new ideas and see what works for them. Don’t be discouraged by roadblocks or knock-backs. Persevere! Introduce the idea of debriefing and invite someone to a debrief in a stand-up, make test notes and tell people that you’ve added test notes to cards or have them available to talk about. Try different styles and techniques and find out what works for your team. Keep exploring and making test notes even if you’re not currently debriefing, both to keep your skills sharp and because you never know when someone will say “yes!”

Testers have pride, but can’t be too precious

We want to feel like we’re making a difference and sometimes want an output to measure “what we’ve achieved” against – which is usually to find and raise bugs. It’s important to remember that as an exploratory tester your output is information, not bugs. When our testing doesn’t result in loads of bugs and fixes, we haven’t failed. Take your ego out of the situation. We’re here to support decision making and sometimes those decisions are “this is fine” and “we now know much more about what we’re building and can feed this into what we work on next.”

How to run a successful debrief

“Okay, you’ve convinced me; debriefing sounds like a really awesome idea and I want to do it. But what do I do?” Fear not, for we have the answers here.

Start with good intentions

Start by meeting with the developer at your computer or theirs. If you can’t meet in person, organise a convenient time to meet on a video call. Thank the person for their time. Arrive organised and prepared with no distractions. Be sure to turn off notifications. 

Have your test notes ready

It’s important to have notes ready to structure a good conversation about what you’ve discovered. Your notes can be in any format that works for you, whether that’s a mind map or a Word document. Edit and present them in a way that makes it easy to show information.
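For illustration, here is a minimal sketch of what debrief-ready notes might contain. The charter and details are hypothetical – this isn’t a prescribed format, just one way of grouping the coverage, the good, the issues and the questions:

    Charter: explore the checkout flow for payment error handling
    Coverage: happy-path card payment; declined card; expired card; browser back button mid-payment
    The good: the declined-card message is clear and the basket is preserved
    Issues: #1 – order submitted twice when “Pay now” is double-clicked (details in the issues section)
    Questions: should the “Pay now” button be disabled after the first click?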

Talk through what you’ve covered

Use your test notes to walk through the sequence of what you’ve tested. Explain your thought process as to why you covered things, what test ideas you had and anything you thought you’d come back to that wasn’t part of your testing goal (also referred to as a “charter”). The goal is to give the debriefing recipient an idea of the scope of your testing. This will help you both identify areas you might have missed. During this walkthrough of coverage, point out issues and mention that you’ll come back to them.

Share the good

I colour code my test notes to make it easier to show where things are great. Usually when things are looking good I’ll use shorthand to save time, grouping behaviour together and saying something like “which was all looking good”. However, do make sure to call out specifically good or clever things, as this shows the developer that you value their work and helps build team rapport. You can share the good as part of the coverage walkthrough as you go along, and stop to admire points of supreme goodness along the way.

Describe the issues

Make reference to issues during the walkthrough of testing coverage and say that you’ll come back to discuss them in detail afterwards. To help with this, pull issues into their own section of the test notes so that you don’t have to keep hunting through the notes to find them, providing a smoother debriefing experience. When talking through the issues, share the details you would when raising a bug: what the issue was, how you found it, whether you could reproduce it, and any screenshots or logs. It can be a good idea to demo the issue as part of the debrief and potentially pair with the developer to start to debug it or find more information. You want to feed into decision making and not just prove you can find bugs, so raise issues in an engaging way.
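As a sketch of the level of detail that helps here – the product, numbers and log detail below are invented purely for illustration – an issue entry in the notes might look like:

    Issue 1: order submitted twice when “Pay now” is double-clicked
    Found while: exploring step 4 of the checkout charter
    Reproduced: yes, 3 out of 3 attempts, in two different browsers
    Evidence: screenshot of the duplicate confirmation; server log shows two order requests 400 ms apart
    Open question: is the customer charged twice, or does the back end de-duplicate?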

Ask “I’ve seen this… is it something we’re worried about or want to fix?” to provoke a conversation, rather than saying “this is a bug and has to be fixed”.

Ask questions

Testing will lead to questions about the behaviour you’ve seen. These might be things you didn’t understand, new test ideas or issues you weren’t sure were a problem. Use this time to ask the developer these questions. Answers might lead to new test ideas, and it’s worth finding out whether those ideas warrant another round of testing.

Ask for feedback

Once you’ve discussed the scope, the good, the issues and the questions, you should check in and ask for feedback on what you’ve tested. I use questions like “Is there anything that you think I’ve missed or should cover?”, “Is this enough information for you, too little or too much?” and “Would you be confident to release?”. The aim here is to find out whether this area needs more testing, and also to gauge the level of detail the developer likes to have, so you can tailor future debriefs for them.

Wrap up

Confirm whether further testing is required, what you agreed to fix and – if a release decision is imminent – whether you are happy to accept and release the product. Be sure to thank the developer for the debrief and congratulate them on their good work!

Tailor your debriefs to the right audience

For your debriefs to land, tailor your information-sharing for different types of audience and context. Here are three examples:

The developer with no time

Keep it short and stick to the headlines. Ensure you give them the information they need the most. Ask what information would be useful instead of diving in and describing everything. The goal is not a status report of “oh look how much I’ve done” but instead to support them to improve quality.

Focus on high-level headlines without going into detail; if areas appear problem-free, summarise the section and move on, sharing a high-level overview of coverage. Pull out all the issues and go through them at the end, and ensure you have technical logs or error messages to help diagnose things immediately. Hold back areas that you know might be pushed back as out of scope (like regression issues or usability improvements) and raise them separately instead.

The eager business stakeholder

Focus on customer and business risks. Use this as a chance to offer a product demo. Business stakeholders will be interested in the business elements of the product rather than in-depth technical explanations, and will appreciate issues described in terms of business risk or benefit. Use this time to go into design, usability or accessibility issues in order to educate the stakeholders on how the customer will likely perceive and interact with the product.

Explain the exploration process by working through the steps you took, ensuring you don’t skip over anything. Offer likely examples of how the user will experience the product. Explore ideas together and suggest ways to mitigate things that threaten the value of the product. If you don’t have ideas to mitigate risk, introduce the conversation to a wider audience. 

The junior tester who wants to learn

Take time to show how and why you did things. You have an excellent opportunity to communicate your knowledge of the product and your knowledge of testing. Describe your thought process: why you looked in a certain direction, the opportunities you took and when you kept close to the exploratory testing goal. Be patient and allow time for questions and clarifications; this will help with understanding and learning. Also, use this debrief to help share how to run debriefs and how to make and use test notes effectively.

Talk about heuristics – the mental shortcuts you took to come up with test ideas – and any testing methods that you used. Describe how you used oracles to recognise what might be a problem. Share details of logs and API calls to show what to typically look for in these. Ask what the junior tester would’ve done differently, and learn from how they would have approached the exploration. Offer to pair test on future exploration sessions to enhance learning.
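For example, you might walk through an annotated excerpt like the one below. The endpoint, status code and log detail are hypothetical, chosen only to show the kinds of things worth pointing out:

    Request:    submit an order with a valid basket but an expired card
    Response:   500 Internal Server Error        <- we’d expect a 4xx for bad input, so worth raising
    Server log: stack trace from the payment service at the same timestamp   <- attach this to the issue
    Point out:  status codes versus expectations, response times, and error messages that leak internals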

An example debrief

If you need some additional guidance, take a look at this example of a debrief.

About the Author

Callum Akehurst-Ryan is a software tester at Improbable with over 13 years of experience across multiple domains, from finance to public safety. His technical skills and keen interest in exploratory testing techniques are backed up by a passion for team engagement and advocacy for integrating testers into agile teams.



