Can hiring software testers actually make things worse for your organization?
I saw a LinkedIn post recently that got me thinking. A recruiter shared a conversation with a startup CEO who argued that software testers “cause more harm than good” when you factor in cost versus benefit. Long story short: he said that if you hire strong developers with quality mindsets, you don't need people in dedicated quality roles. And honestly? That’s not entirely wrong...
Now, before I lose my MoT membership card, I want to be clear: I’m not saying that quality work doesn't matter. It absolutely does! However, some teams simply don't need dedicated quality hires. If you're early-stage, have a simple product, and your engineers are genuinely quality-minded, adding a dedicated tester might just slow you down.
But I've also seen the flip side. I've been a solo tester three times now (which I wrote about here), and I've helped teams make similar hires from the other side. For example, a few years ago, a friend asked me to help his startup with their quality challenges. “Honestly, if we could just get our regression tests automated, our releases would be so much smoother,” he told me as he showed me around their small office space, rented from a coworking company.
His company ultimately didn't make it. (Most startups don't survive, even those with no quality issues.) But that conversation stands out to me now because it’s such a perfect example of what I've seen over and over: teams believe they need to hire someone to solve "a testing problem" when the real problem is something quite different. And in the end, because they hired someone to solve a symptom rather than a cause, teams conclude that "software testing doesn't work."
Let me explain.
What problem are you really trying to solve by hiring software testers?
Testers find problems and talk about them. Are you ready to hear the truth?
Most teams hire software testers reactively. Something is wrong: releases feel risky, rollbacks are common, customer complaints are piling up. So they decide it's time to get someone to “handle quality.” But "handle quality" isn't a job description. It's a wish.
Here's what I wish every hiring manager understood: when you bring in your first software tester, you're hiring someone whose job is to notice everything that's systematically broken about how you build software. Unfortunately, when the person who's doing the hiring isn't familiar with what quality work actually entails, it's easy to misunderstand what you're signing up for.
Root-causing the real problems at your organization
My friend at the startup thought he needed better test automation. What he actually needed was someone to help the team step back and ask hard questions about what they were building and how. But even if he had realized this, it's unlikely that he would have posted a job description for "help us figure out what's actually worth testing and why.”
I've seen this play out in two different ways:
1: You hire to fix a symptom, not the underlying problem
Your test suite is slow and flaky, so you hire someone to “fix test automation.” Or bugs keep slipping through, so you hire someone to “do more testing.” But if your real problem is that you don't have a clear testing strategy, or your requirements are unclear, or your definition of "done" varies by team, then hiring someone to execute more testing won't solve it.
That's exactly what happened with my friend's startup. They had test automation problems, sure. But the deeper issues were unclear requirements and constantly shifting priorities. No number of "better" test scripts was going to fix those fundamental challenges.
2: You hire the right person but have the wrong expectations
You find someone experienced who does exactly what good quality work entails. Great! Then your troubles begin. They'll want to know who decided something was "good enough" and whether anyone validated that assumption. They're going to ask why certain features exist. They'll bring to light gaps in your requirements, inconsistencies in your processes, and risks you didn't know you were taking.
This truly is what they should be doing and what you should want them to do! But that's also where it goes sideways: you hired them to "own quality" but didn't give them any influence over the decisions that affect quality. So they end up doing damage control instead of prevention. They catch some bugs and point out systemic issues, but they can't address the root causes creating those problems in the first place.
And the team that expected easy quality wins now that there’s finally someone dedicated to testing and bug catching? Instead, they get someone who points out organizational problems the new hire has no power to fix. Everyone gets frustrated. And then CEOs tell their recruiters that hiring software testers doesn’t work.
What successful teams do differently: Change their way of thinking before they hire a software tester
In contrast to the teams I described above, some teams get a lot of value from their first software testing hire. They understand that adding a tester to the team means bringing in someone who will shift your team's approach to building software, not simply make your current broken process work better.
Now, I want to be clear: of course your software testing hire will do testing work, like designing test strategies, writing test cases, finding bugs, building automation frameworks, and all that tactical stuff you're expecting. But the teams that get the most value from their testers understand that these tactical activities are in service of a bigger goal: changing how the entire team thinks about and delivers quality.
They start with finding the source of the pain
Instead of deciding "we need software testers," these teams get specific about what's actually broken. This is what my founder friend should have done, and what I tried to help him do once I realized that test automation wasn't actually the problem.
I spent time talking to everyone on the team and found much deeper issues: poor alignment around constantly changing requirements, no shared definition of "done," and scary releases, because the team had no reliable way to validate that features actually worked before going live. The test automation issues were just a symptom of these bigger process and communication gaps.
They don't try to hire a unicorn
Instead of posting a job that asks for someone who can do test strategy AND build automation AND exploratory testing AND coach the team, AND AND AND, teams that succeed in bringing testers aboard pick the one or two things that matter most right now. They understand that having one person alone doing all the "quality" work leads either to burnout or gaps in the areas they thought were covered.
They're honest about organizational readiness to hire testers
Good quality work puts a spotlight on uncomfortable truths. Your new hire is going to point out all sorts of things that, without your setting proper expectations upfront, might feel like undue criticism.
The teams that benefit from this feedback are the ones where the hiring manager has prepared everyone to see quality insights as organizational learning opportunities. Just as important, the team is open to acting on them.
What quickly became clear to me at the startup was that the team wasn't ready for the kind of organizational changes that would actually solve their quality problems. The notes from my discussions with that team say things like "top-down mentality that it's ok to not focus on quality" and "pressure from leadership to do things fast at the cost of doing things right.”
And I understand how they got there. The pressure to meet customer deadlines while also responding to shifting requirements after commitments were made was real. Some quality challenges need actual commitment from leadership in order to fix. And if you're reading this, sorry friend! I write this all with love.
They allow their testers to influence important decisions
This might be the most important factor in successful tester hires. If your hire can’t influence requirements, priorities, or technical decisions, they won’t actually be able to move the needle on quality.
Before I wrapped up my time consulting with my startup friend, he asked whether he should hire someone full time. My response was that someone could definitely help with the tactical stuff: improving the CI/CD pipeline, tightening release processes, building some test automation. But I told him honestly that until he was ready to address the shifting requirements and the pressure to trade quality for speed, even the best quality hire would just end up frustrated.
When you should NOT hire a software tester
Another uncomfortable truth? Sometimes that CEO's "pessimistic" cost-benefit analysis is spot on.
If your organization is a small team of quality-minded engineers with a simple product and tight feedback loops with users, a dedicated quality hire might be overkill. You might be better off investing in better dev practices or tooling, or bringing on a fractional testing leader for a short time to help you build that foundation. But if you have a larger team or a complex product, then dedicated quality expertise is probably worth the investment.
To wrap up
That startup CEO wasn't wrong about software testers sometimes causing more harm than good. However, he may inadvertently have revealed that he himself hired reactively, had unclear expectations, and didn't give software testers real authority to improve things.
But when you hire thoughtfully and bring someone in to improve how you build software, not just find what's broken? That’s when quality work becomes a force multiplier instead of a bottleneck.
For more information
- What’s a red flag for hiring a software tester?, Rosie Sherry
- 10 Misguided Reasons Not To Hire Testers, Kate Paulk
- Onboarding testers: Growing your new hire, AJ Larson
- The Lone Tester: Surviving and Thriving as The First QA Hire, Susanne Abdelrahman