
How Ministry of Testing Started My Performance Testing Journey

Got a few spare minutes? Come and read Ali Hill's Ministry of Testing Scholarship story

By Ali Hill

For the month of July, I participated in the Ministry of Testing’s ‘30 Days of Performance Testing’ challenge. As part of the Ministry of Testing Scholarship, I was very kindly presented with a ticket to Joe Colantonio’s PerfGuild conference.

First of all, I’d like to give you a bit of background about myself and how I got into testing. As a History graduate, I do not come from a technical background. Like so many others, I ‘fell’ into testing. In my case, I became a games tester, which gradually developed into a desire to learn more about software testing practices. I soon discovered the software testing community and, of course, Ministry of Testing. It was after discovering the community that I decided I wanted to make software testing my career.

In my current role, I am a manual tester by title but have recently developed an interest in all things DevOps. Having read ‘The DevOps Handbook’ and ‘The Phoenix Project’, I became really interested in the monitoring of software, and in particular web applications. I’m also in the process of becoming more automation focused to aid our continuous delivery pipeline and to help develop test tools for my fellow team members. It was these interests that drew me towards performance testing and were the reason I decided to start the ‘30 Days of Performance Testing’ challenge.

It’s not an exaggeration to say that this challenge has opened my eyes to a whole new world of testing and software development. Performance testing is something I’d flirted with on an ad hoc basis, but never something I’d fully got involved with.

The great thing about the 30 Days of Testing challenge is that it is in no way intimidating. Sometimes when you decide you want to learn something new, you can become overwhelmed with information. One of the benefits of the 30 Days Challenge is that you learn a small piece of information or carry out a small task each day. I found this to be an extremely productive way to learn as I wasn’t overwhelming myself with too much information.

At the end of my challenge, I now feel like I understand the basics of performance testing by taking these bite-size pieces of information and piecing them together as I ticked the days off. I’ve also developed a passion for all things performance and now feel like this is the direction I want my testing to go in.

In this post, I’d like to give an overview of three things I learned throughout my ‘30 Days’ challenge.

Best Practices in Performance Testing

It was ‘Day 2 - Listen to a performance testing podcast’ which alerted me to the potential issue that, as a beginner to performance testing, the stats I could be providing to stakeholders had the potential to be misleading. It was a TestTalks podcast with Joe Colantonio and performance testing expert Rebecca Clinard which highlighted to me the importance of choosing suitable metrics. Rebecca’s last words of wisdom on the episode were ‘do not be in a rush’ whilst learning to performance test and report results.

I’d been an advocate for more visible application performance in our workplace since before the challenge began, but it was this challenge which triggered me to learn exactly which metrics would be suitable to display on my planned dashboard. For example, prior to the challenge I may have displayed the ‘slowest API response’, which could be misleading. The challenge also introduced me to percentiles, which I will now be displaying on my dashboard. Percentiles give us much more information than the single slowest response from our production APIs.
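As a rough illustration of the difference (the numbers and the Python snippet below are my own example, not something from the challenge), here is how a percentile such as the 95th could be calculated from a set of response times and compared with the single slowest value:

```python
import statistics

# Hypothetical API response times in milliseconds (made-up numbers for illustration)
response_times_ms = [120, 135, 128, 140, 131, 125, 138, 122, 3050, 133,
                     127, 129, 141, 136, 124, 132, 130, 126, 450, 137]

def percentile(data, pct):
    """Nearest-rank percentile: the value below which roughly pct% of samples fall."""
    ordered = sorted(data)
    index = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[index]

print(f"Slowest response: {max(response_times_ms)} ms")               # 3050 ms, a single outlier
print(f"Median (p50):     {statistics.median(response_times_ms)} ms")  # ~131 ms, a typical request
print(f"95th percentile:  {percentile(response_times_ms, 95)} ms")     # 450 ms, the slow tail
```

A single 3050 ms outlier makes the ‘slowest response’ figure look alarming on its own, whereas the median and 95th percentile show what typical and worst-case users are actually experiencing.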

This is just one of many examples of a daily task triggering me to investigate a topic in more depth. Hearing a comment on a podcast about reporting suitable metrics made me question what I knew about reporting results from my performance tests and I did some further reading on the topic. The ’30 Days’ challenge gave me the motivation and direction to carry this out.

Tool Evaluation

One of the main reasons I took up the challenge was because I felt that I lacked the proper tools to carry out performance testing and monitoring. My main goal at the beginning of this challenge was to evaluate some load testing and application performance management (APM) tools and feed these back to my managers.

I often find when learning a new skill that it’s easy to feel like you’re drowning in tool choices. I hear people give advice such as ‘decide what it is you want to do and then pick the right tool for you.’ Although people are 100% correct to say that, how can you begin to evaluate a tool when you’re not completely confident in the area that you’re exploring?

This is where I felt the ‘30 Days’ challenge really helped me. As I began to understand the domain, I became more aware of the advantages and disadvantages of the tools I had heard so much about. More often than not, when I was evaluating a tool I found myself saying ‘hold on, I wonder if I can do that in a tool we already have at work.’ In most cases, it turned out I could.

As the challenge went on, I realised that I was lucky to have all of the tools I required at my disposal. From scripting web performance tests to application monitoring, we had them all. I can actually pinpoint the day this realisation hit me: ‘Day 22 – Try an Online Performance Testing Tool.’ I still chose to explore a tool I use in my current role further, but this highlights one of the beauties of the challenge: you can take each daily challenge and personalise it.

For those of you who aren’t as lucky as I am to already have tools available during the challenge, I’d encourage you to take a look at open-source tools. If you look at some of the blog/Twitter posts (search the hashtag #30daysoftesting) for days 14 and 22 in particular, you may find a tool applicable to you.

One of the things which surprised me is that most of the performance testing tools mentioned in the challenge are open source. It’s a lot easier to convince management to implement a new idea or process when it’s not going to cost any more money. It’s not a great idea to focus on open source exclusively, but a lot of paid-for tools offer free trials which give you enough time to evaluate them and present a case to your boss.

Helped Me Engage with the Software Testing Community

It has been just over a year now since I started following fellow software testers and developers on Twitter. I’d replied to the odd person here and there and liked a few tweets, but I was mainly a reader and not someone who engaged too often.

I’d always wanted to become involved in the community and ‘30 Days of Performance Testing’ turned out to be a very easy way to do just that. The amount of interaction I had with the community, especially those also taking part in the challenge, really helped motivate me throughout.

I feel like I learned just as much from reading other people’s blogs and posts as I did through my personal journey. It was really nice to be a part of and I received some really useful feedback from various people who had read my posts.

Of course, the main reason I took part in the challenge was to learn about performance testing; this was just a really nice side effect. I’d recommend that anyone thinking of doing a 30 Days of Testing challenge blog their experiences. It adds an extra dynamic to the challenge, and one I found extremely rewarding. Some of my posts were short and some a bit longer, but the feedback I received on them was extremely valuable.

Conclusion

To conclude – the ’30 Days of Performance Testing Challenge’ allowed me to learn a new area of software testing by carrying out small daily challenges. I owe a lot of thanks to the Ministry of Testing Scholarship. At the time of writing, I’m not even halfway through the PerfGuild conference content but I have already picked up so many pointers from some of the excellent speakers at the conference.

I’m now at a point where I can safely say I’ve developed a new passion and want to become an effective performance tester. This would not have happened without the ’30 Days’ challenge. Ministry of Testing has truly helped me take my performance testing to the next level.

Author Bio

Ali Hill has been a software tester for over three and a half years. Starting off his career as a games tester at Rockstar North, he then moved into a more traditional software testing role at Craneware, testing a web application produced for the U.S. healthcare industry.

Ali graduated with a History degree from the University of Edinburgh but has since developed a passion for software development and is addicted to learning with an interest in all things DevOps, automation and now, thanks to the ‘30 Days of Performance Testing Challenge’, performance testing.

Outside of work, Ali is interested in football, watching films and TV, and playing video games.

You can find him on Twitter and LinkedIn.
