How to write automation that represents issues PMs would care about fixing
26 Mar 2026
The guidelines below are provided as an AGENTS.md file, with commentary.
Descriptive Test Titles

Titles must describe the expected user behaviour, written from the user's perspective:

```typescript
test('As a user, I can create a new project', ...);
test('As a user, I can delete an existing client', ...);
test('As a user, I cannot submit the form without a required field', ...);
```

One Assertion Per Test

Each test asserts one behaviour. If you find yourself writing multiple expect() calls that each test a different thing, split them into separate tests.

Custom Expect Messages

All assertions must include a descriptive failure message as the first argument:

```typescript
expect(response.status(), 'Expected 201 Created when creating a new project').toBe(201);
expect(heading, 'Page heading should confirm project was saved').toBeVisible();
```

No Conditionals

Tests must not branch. No if, switch, or ternary expressions in test bodies. If setup differs between cases, use separate tests or test.beforeEach.

No Magic Variables

All values must be named. Use constants, faker, or values from the .env config:

```typescript
// Good
const testProjectName = faker.commerce.productName();
await projectPage.fillName(testProjectName);

// Bad
await page.fill('input', 'abc123');
```
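The value of the message-first assertion style isn't Playwright-specific. As a minimal sketch (plain TypeScript, a hypothetical `expectValue` helper rather than Playwright's `expect`), here is why a descriptive message matters: the failure text carries the intended behaviour, not just a value diff.

```typescript
// Hypothetical helper mirroring Playwright's expect(value, message) shape.
function expectValue<T>(actual: T, message: string) {
  return {
    toBe(expected: T): void {
      if (actual !== expected) {
        // The descriptive message leads; raw values are only supporting detail.
        throw new Error(`${message} (expected ${expected}, got ${actual})`);
      }
    },
  };
}

// Passing assertion: nothing is thrown.
expectValue(201, "Expected 201 Created when creating a new project").toBe(201);

// Failing assertion: the report explains the behaviour that broke,
// which a PM can read without opening the test code.
try {
  expectValue(404, "Expected 201 Created when creating a new project").toBe(201);
} catch (e) {
  console.log((e as Error).message);
  // → Expected 201 Created when creating a new project (expected 201, got 404)
}
```

A failure report built this way reads as a broken user behaviour ("201 Created was expected when creating a project") rather than a bare `expected 201, got 404`, which is what makes the result meaningful to a PM triaging regressions.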
Emily O'Connor
Principal Quality Engineer
She/Her
Technical leader and QE with a sixth sense for bugs. Avid learner and reader interested in decoding “dev-speak” to enable engineering teams to adopt automation and AI-accelerated quality engineering. I believe good software starts with user-focused problem solving, and that automation should provide information on regression bugs that PMs actually care about fixing.