Why software testers should write documentation even if no one else reads it

Cultivate professional trust and accountability by using detailed documentation to demonstrate due diligence and quantify risk for stakeholders.


When you're a tester, a large part of your work is exploring and reporting back on your findings. Research, analysis, and ultimately the documentation that comes from them are part of the craft. Test plans, requirements documents, risk analysis reports, and bug reports are often mandatory. Crafting a message that effectively persuades stakeholders and changes the product for the better is a key part of what defines a great tester.

And yet: are these not common experiences?

  • A reply to an email that addresses only the first item (if any)
  • “Fixed” bugs that fix only one part of the issue, ignoring the other four
  • Being asked repeatedly "what was tested?" and having to recite the written test plan
  • Stakeholders surprised by issues in production, even though the bugs were reported before deployment

“As per my last message…” is a common joke punchline for a reason. I've had a longstanding theory that people simply do not read.

As a tester, this is difficult to rectify. We write lots of documents. We research test tools and test methodologies. We write test strategies for features, highlighting the risks and mitigation strategies. We write step-by-step test cases, documenting our tests in painful detail. We write bug reports for every occasion. We write reports on test coverage and quality metrics.

If we understand intuitively that people will not read documentation, why do we write so much of it? Over time, I've come up with three reasons to continue putting words to the page.

Reason 1: documentation is helpful for my own future reference

A case of future reference when serving popcorn

I bring a popcorn machine to many events. When the popcorn is ready to eat, you must open the door of the machine to serve it. There's a catch tray that folds out under the door, but it's far too small to serve from. People still try to serve from it, and they make a mess every time. I could see how the design might be confusing. So I put a sign on the machine:  “Open Door to Serve.”

People pressed their face against the glass – against the sign – and still tried to serve the popcorn out of the tray without opening the door first. So I can truly say that I've seen people press their face into the documentation rather than read it. 

I now have a note to myself: “close the serve tray to force people to open the door.” That note is documentation for an audience of one: me. I am the best audience for my documentation. 

Often we are our own best audiences for our documentation

Testers routinely analyze the products they work on. Among our tasks: 

  • Mapping of functional components
  • Historical bugs and risk
  • Heuristics and oracles
  • Test tool research

And we usually find that documenting the results of our research benefits us greatly.  Ultimately these analyses and many others create a solid foundation for testing. 

I heard about a team whose product needed to support multiple web browsers. Unsurprisingly, the costs of automation tooling and manual regression testing for the product ran up quickly. But some analysis of functional components, historical bugs, and customer heuristics revealed that the product needed to support only Chrome; customers did not have an actual need for other browsers. The findings were documented clearly. This saved thousands of dollars in testing. 

You need those documents. You will refer to them often now and later. And you may need them to prove a point you're making. Don't worry that no one else will read them. You will.  

Reason 2: documentation can serve as grounds for policy enforcement

A case of enforcement from road signs

My street has been under construction for months. As part of the construction, they “closed” the street. At first they put up “Local Traffic Only” signs. These were ignored completely. It was not long before a speeding motorist collided with a construction truck. Next they put up “Road Closed” signs. Less ambiguous. These were, shockingly, also ignored. One car jackknifed over the half-finished curb when the driver attempted to swerve around the sign. Finally, large cones were placed at both ends of the street, physically blocking traffic. It took a physical barrier to enforce a critical rule.

Everyone who lived on the street understood the need for the cones, because we had seen the risk. The documentation in the form of road signs was not enough, but it served as the grounds for enforcement by way of the cones. We could point to the prior incidents as to why enforcement is required. 

Documentation can provide evidence for policy enforcement

Documentation often details an accepted process. These documents take forms like:

  • Quality lifecycle management
  • Testing and test automation execution
  • Release requirements / definition of done
  • Clean code / architecture standards

All process documents assert the need for, and commitment to, following a process. Testers document incidents and try to understand the cause and effect. The documents we write often become the bases of rules and processes. 

Until a process is enforced, the documentation itself is key. As we create incident reports, they become harder to ignore in number and scope, and the case for enforcement becomes more clear to key stakeholders. The documentation gives clear evidence to support enforcement.

After a process is enforced, that same documentation serves as a reminder of why we created the rule. Organizations will often return and question the process, fearing it slows them down or produces waste. In some cases this may be true; in others, not so much. The documentation serves as a guide.

One company I know removed the safeguards and allowed merging code directly to production. It was not long before a critical issue brought down a customer environment, the result of a faulty merge and dirty code. It was then people remembered why we had that safeguard in place.

Reason 3: documents help build trust in testing work

A case of trust from a syllabus 

“Why did you miss class today?” an old email from my professor asked. 

Why did I indeed? There was class today? How could I forget, it's every Wednesday! Why wasn't it on my calendar? The panic began to set in. 

After a short panic, I reviewed the syllabus. That day was marked as "no class." Clearly the professor had changed his mind at some point. I didn't go to class because I had read the documentation (the syllabus) and I was pardoned for the offense by providing that proof. 

Quality is built on trust. Unlike software, QA does not produce something tangible, something easily quantifiable. Test automation is only a part of the craft. Quality is something that, to an extent, must be believed in. In many organizations, that trust is tenuous at best.

Documentation can help ensure stakeholder trust in testing

Documentation can build trust by proving due diligence at critical points in the software development life cycle. These documents take the form of:

  • Test plans / strategies / cases
  • Bug reports
  • Risk assessments
  • Requirements / acceptance criteria

When a release date nears, stakeholders must make a choice to release the product. To do so, they need to understand the risks: what was built, how it was tested, any outstanding issues, and the overall risk of releasing now versus later. 

One of a tester's primary responsibilities is to quantify that risk, to ensure that the people making that release decision are well informed. Documentation shows our diligence in testing when something goes wrong in production. When there is a customer escalation, an outage, a critical bug in production, testers are among the first to be questioned. How was this missed?

There's nothing worse than being at the end of a pointed finger. It is something of a rite of passage for a tester. When that root cause analysis, incident report, or inquiry comes their way, testers need to show what was tested, why, and the known risks. Showing that you did your due diligence builds trust over time. 

Documentation: an unreliable AI-powered future

Currently AI is a powerful tool. Its ability to consume large amounts of data across systems is a strong asset. AI can find gaps in test strategies, identify missed requirements, enforce code standards, and link tests to bugs. Will docs generated by AI lead people to read more in general? What about reading testers' documentation?

Will text generated by AI lead people to read more in general?

Can AI boost literacy? 

"Sophomore" is one of my favorite words. It comes from the Greek meaning "Wise Fool." In some places it is used to refer to second-year university students, the implication being they have learned just enough to be overconfident and are prone to jumping to conclusions. 

AI is commonly used as a reading aid, "summarizing" articles into bite-sized bullet points that people appear more willing to read. To that extent, reading a summary is better than not reading at all. Caution is warranted, however: summaries can leave out key pieces of information or present the wrong conclusions. AI is also known to hallucinate, that is, to fabricate content that was never in the document at all. In that way it may make the problem worse: people will read an inaccurate summary of a document and claim to have read the document itself. "Sophomorism" in this environment will be a real issue.

I fear that with the current avalanche of AI-generated text, future documentation will be deeply flawed and untrustworthy. Quality engineering is no stranger to this; we test documentation all the time by applying what a document says to the truth of the system under test. 

Documentation is supposed to bring clarity, but AI only brings more questions. It has the power to alert you when documentation may be relevant or needs to be updated as circumstances change. It does not have the power to replace your reading or writing skills. 

Should we use AI to generate documentation instead of writing it ourselves?

Should we let AI do the "menial labor" of writing documentation, especially since it so often goes unread? Reflecting on the three reasons for writing documentation above, how much can AI assist?

Documents that are helpful for my own future reference 

AI is certainly capable of creating summaries of analysis, and even of performing the analysis itself. My question is: if this documentation is for my own reference, and I used AI to write it, am I going to remember it?

I find I'm able to retain and refer to information better if I write it down myself. By writing it myself, I can focus on what is important to me. What's more, it is written in my own wording, a way I can understand. Coming back and reading an AI-generated analysis document would be the same as reading the original source: a document whose contents I have no memory of. 

Documents that serve as grounds for enforcement

Here the documentation backs decisions to enforce carefully crafted processes. Enforcement requires authority. An AI can certainly give you a definition of "done," for example, but what authority does it have? If someone questions why, I would not accept "because the AI told us to" as an authority. 

Documents that help build trust in testing work

Trust is based on accountability. Someone will trust my test plan because if something fails, they can hold me and the test plan I wrote accountable for that failure. If something fails, and I claim that AI wrote the test plan, I'm failing to take accountability. Indeed, what did I even do if I outsourced my value to AI?

To wrap up

Documentation and quality engineering are tightly linked. As outlined above, documents are one of the top artifacts produced by a quality engineer. As long as our goal is to achieve quality in a product, documentation will need to be written. 

Documentation answers the "why." Why are we using this? Why are we doing this? Why did this happen?

In this way, documentation bridges past and present. If we take a risk-based view of quality, documentation serves as a record for key decisions, trade-offs, and assumptions that may later become critical risks. By remembering these past decisions we can better mitigate today's risks, or explain why there is risk at all. 

Documentation points toward the future. By understanding the "why" of yesterday, we can infer how to improve tomorrow. Old constraints can disappear with new technology, and new ideas can free us from old ways of thinking without rehashing old mistakes.

Documentation helps you see with clarity. In a world where people don't read, someone needs to write. 

What do YOU think?

Got comments or thoughts? Share them in the comments box below. If you like, use the ideas below as starting points for reflection and discussion.

Questions to discuss

  • Have you encountered a situation where someone failed to read a document? 
  • Have you encountered incidents that resulted from lack of documentation? 
  • Some of the most successful and most-read types of documentation are setup and how-to guides, which are not discussed here. What makes them different?

Actions to take

  • Review your documents. What ancient wisdom have you forgotten? 
  • Check your goal. Who did you write this document for? What was the goal? Did it accomplish the goal?
  • Ask for feedback. Do people take the time to provide a critical review? 
  • Ask AI for summaries. Is it accurate? Is it helpful? Does it provide context?
  • Write a helpful document. Ask AI to write the same document. Which is more useful?

For more information

Shawn Vernier
Quality Engineer
He/Him

The answer to quality is context.
