The Hypocrisy of Hypotheses (Or, How do we test hypothesis driven acceptance criteria) - Sharon McGee

13th January 2023
Talk Description

Hypothesis-driven development (HDD) helps ensure that product design results in business value. Framing product features as hypotheses and conducting mini-experiments allows us to assess whether they will deliver pre-stated, measurable business goals. Future product direction can then be informed by the results of our experiments. Here is an example, borrowed from Thoughtworks…

We Believe that increasing the size of hotel images on the booking page Will Result In improved customer engagement and conversion. We Will Know We Have Succeeded when we see a 5% increase in customers who review hotel images and then proceed to book within 48 hours.
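To make that criterion concrete before going further, here is a minimal sketch in Python of how it is often checked at face value. The visitor and booking counts are hypothetical, invented for illustration, not taken from the talk:

    # Naive reading of the acceptance criterion: compute the relative lift
    # in conversion and pass if it clears the stated 5% threshold.
    # All numbers are hypothetical.

    def conversion_rate(booked: int, viewed: int) -> float:
        """Share of customers who reviewed hotel images and booked within 48 hours."""
        return booked / viewed

    baseline = conversion_rate(booked=400, viewed=10_000)  # before the image change
    variant = conversion_rate(booked=430, viewed=10_000)   # after the image change

    lift = (variant - baseline) / baseline
    print(f"Observed lift: {lift:.1%}")                    # 7.5%
    print("PASS" if lift >= 0.05 else "FAIL")              # PASS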

So we change the size of the image, deploy, test, and observe a 5% increase in customers who proceed to book within 48 hours. We have passed the acceptance criteria. Our hypothesis was correct, and we can conclude that changing the size of the image increased sales. Right?

Maybe

This talk will examine the reasons why the answer to that question can only ever be maybe. Through examples, we will explore the difference between this approach and empirical scientific methods, where there are no stated acceptance criteria and the motivation for the experiment is to falsify the hypothesis. We will walk through scenarios and discover some of the potential pitfalls of expressing requirements in this way. Together, we can identify the types of acceptance criteria that demand greater confidence in our test results. We can then discover how a more rigorous approach can help us ensure that the results of our tests mean what we think they mean.
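As one example of such rigour (a sketch only, using the same hypothetical counts as above; the talk itself may propose different techniques), we can ask whether the observed lift could simply be random noise, using a two-proportion z-test:

    # Two-proportion z-test for H0: the conversion rate did not change.
    # A small p-value is evidence against H0; a large one means the observed
    # lift is consistent with chance. Hypothetical numbers throughout.
    from math import sqrt, erf

    def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
        """Return (z statistic, two-sided p-value) for the difference in proportions."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p2 - p1) / se
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
        return z, p_value

    z, p = two_proportion_z(400, 10_000, 430, 10_000)
    print(f"z = {z:.2f}, p = {p:.2f}")  # z = 1.06, p = 0.29: not significant

Under these made-up numbers the naive check passes while the significance test does not: exactly the gap between "we met the acceptance criteria" and "the change caused the improvement" that the talk explores.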

Join the discussion about TestBash Belfast over at The Club

What you’ll learn

By the end of this talk, you'll be able to:

  • TBA
Sharon's profile

Sharon

I am a software analyst who has recently returned to work after a period of time during which I did lots of other interesting things! These include looking after my children and completing an empirically based PhD on the causes and consequences of software requirements change. I enjoy trying to figure out what makes people tick, why software works and how music is wonderful.


Tags

  • analysis
  • user-experience