Discussions: Reporting Your Exploratory Testing

Description:

In this session, Suman Bala is joined by three guests: Niranjani Manoharan, Monica Arzani and Laveena Ramchandani, who share their different experiences of how they report their exploratory testing.

They discuss and answer the following questions:

  1. Have you changed the way you report your exploratory testing over the years?
  2. What would be the best time and place to report your exploratory testing?
  3. What's the biggest challenge you've encountered while reporting on your testing?
  4. What feedback did developers, POs, managers and product management give about which parts were useful for them, and how was the report used when you presented it?
  5. How do you record progress with Exploratory Testing? With scripted testing, we can give tests passed/failed, but this doesn't seem so easy with Exploratory.
  6. Are there any tools that you found useful for reporting your findings with Exploratory Testing?
  7. What are your tips for effective reporting when Exploratory Testing?
  8. How do you measure test coverage with Exploratory Testing? Can traceability matrices still be used?
  9. Do you include things you didn't test in your reporting, and what do you do with that list?
  10. If you come across issues that you struggle to replicate consistently or things that aren't clear, how long do you spend investigating those before moving on?
  11. As a new tester who has never been asked to report on Exploratory Testing in a small company, what is the best way to start?
  12. Have you done exploratory testing sessions with developers or other non-testers? Any tips on how to help them do good note-taking and debriefing?

What our panelists have to say about themselves:

Niranjani Manoharan

With my increasing curiosity to learn, I wanted to use my exploratory testing skills to understand data analysis. Data analysis is defined as a process of cleaning, transforming and modelling data to discover meaningful information for business decision making. For the purpose of this study, I used the COVID-19 vaccination progress dataset.

To do data analysis, domain knowledge is super helpful - for example, if you are not a wine connoisseur, how would you know what makes a wine good or bad? In this case, I chose the dataset because it was relevant to the pandemic situation we are all in!

While cleaning data, it is critical to know when to drop columns and rows. Identifying missing values and replacing them with the column mean (mean imputation) is a common practice.
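As a minimal sketch of the two steps mentioned above, here is how dropping empty columns and mean imputation might look in pandas. The column names and values are illustrative stand-ins, not taken from the actual COVID-19 vaccination progress dataset.

```python
import numpy as np
import pandas as pd

# Toy data standing in for a vaccination-progress dataset
# (column names are hypothetical, for illustration only).
df = pd.DataFrame({
    "country": ["A", "B", "C", "D"],
    "daily_vaccinations": [100.0, np.nan, 300.0, np.nan],
    "notes": [np.nan, np.nan, np.nan, np.nan],  # entirely empty column
})

# Drop columns that carry no information at all.
df = df.dropna(axis=1, how="all")

# Mean imputation: replace missing numeric values with the column mean.
# .mean() skips NaNs, so the mean here is (100 + 300) / 2 = 200.
mean_value = df["daily_vaccinations"].mean()
df["daily_vaccinations"] = df["daily_vaccinations"].fillna(mean_value)

print(df["daily_vaccinations"].tolist())  # [100.0, 200.0, 300.0, 200.0]
```

Whether mean imputation is appropriate depends on the data: it preserves the column mean but shrinks the variance, so for skewed distributions the median is often a safer default.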

Laveena Ramchandani

Exploratory testing is quite an interesting area for me. It is a type of testing where we are not only confirming the happy paths but also unveiling unknowns, risks and assumptions. Assumptions creep in when we do not know 100% how a piece of functionality works, and by assuming we have added a major risk to the product! So, by exploring the product we can confirm what it should and should not do, which in turn helps the team make better decisions, enhance existing testing patterns and advocate for repair.

Monica Arzani

I have been in testing for five and a half years. I started just after graduating, and system testing was my first job. I work in a traditional industry: we make anchors, drills and tools for construction companies. Testing in my business unit is heavily influenced by hardware testing protocols. We believed we could model every user and system behaviour by writing tons of specifications and testing according to them. As a newbie, I trusted the system and process in place, until I started realising that we had built ourselves a golden cage and would just repeat and optimise over and over what was already known.