Being On A Support Call

When Support Calls: Helping Your Support Team To Help Your Customers: Article 3 of 4

In this series, we’re exploring what you can do when you’ve been asked to sit in on a support call with a customer. Because we have a deep and long background in testing, we come at this from a testing perspective, but we think our advice holds good for other technical staff members too.

The first article described support calls and gave tips for navigating them successfully, while the second was about the kinds of preparation to consider before the call begins. In this one, we’re talking about the main event, the call itself.

Your Skills

If you can keep your head when all about you (customers and colleagues) are losing theirs and blaming it on you, then, with apologies to Kipling, you'll stand a decent chance of being comfortable with customer support.

You were asked onto the call because you have something which tech support believes could help them to help a customer. Testers might be brought in for advice, for offline investigation, for triage, or to test fixes for issues that start as support tickets.

Testing for a support call, unlike much day-to-day work, adds a particular piquancy, urgency, and pressure to the diagnostic task. You may well be responsible for a hard-to-reproduce, high-value issue, with no buffer between you and the customer who is experiencing the pain.

Furthermore, support is often more constrained than the average test mission. In the support scenario, you know there's an issue and you might even know some of the symptoms. However, because you are interacting with the problem remotely, via a customer, you usually have a distorted lens through which to view it, a delay on your interactions with it, restrictions on the questions you can reasonably ask and an understandably limited supply of patience on the part of the customer who frankly just wants the thing to work.

Another consideration is whether to put time into looking for a diagnosis, a solution, or a workaround. Prioritisation and timeliness are key, along with strong investigative skills, pragmatism, sheer bloody-mindedness, critical thinking, and the ability to view a problem from multiple perspectives.

We think there are often very deep overlaps between technical support, particularly when done in front of a customer, and exploratory testing. The techniques, heuristics, and approaches of exploratory testing can provide options to try.

To get the right results at the right time at the right cost you have to be prepared to iterate, to revisit, to expand and contract your search, to investigate all available resources (people, documentation, code inspection) and assimilate them.

Remember Whose Call It Is

On rare occasions, as testers, we have done solo support. In our experience, though, you’re much more likely to be on a call to assist someone from the support team. Even if you are the expert in the relevant part of the product, even if you have more experience than your colleague, even if you have lots of other things to be getting on with, stay focussed on the mission: help your colleague to help your customer in the way you agreed beforehand.

You will be on a support call with a real customer and someone from your side who has likely invested significant time building up a relationship, and trust, with the customer and whose job is dealing with that customer. Although your mission is primary, there's also a chance for you to start building a relationship, and trust, with the customer for yourself. Depending on the way the issue pans out you might be talking to the customer again later.

Gather Data

There’s a lot you can get from customer interactions, quite apart from making a customer’s day that little bit better. The good news is that you can do this simply by carefully observing everything that’s happening in front of you. You might learn something which is peripheral to the problem at hand, but still incredibly valuable, such as:

  • How some users interact with your software.
  • What some users expect from your software.
  • How someone in your company deals with customers.
  • What some of your users expect from your company and its products.
  • What kinds of environments some of your customers are using.
  • The assumptions your test team can sometimes make.

The following are examples of things you might uncover during this process:

  • The customer doesn’t use context menus, only buttons and menu bars, so they miss 50% of your product’s functionality.
  • The user doesn’t know the unique selling point of your product, nor the extent of the features it offers.
  • Your support staff have to put up with an awful lot of crap.
  • Your support staff never give the impression that they wish the customer would walk into the sea.
  • Your support staff lack written protocols or standard answers to help facilitate their work.
  • Your support staff have things to share about how to diagnose complex issues under pressure.
  • Your support staff could do with more product knowledge.
  • Your company doesn’t provide much product training for your support staff.
  • Your support staff have a lot of ground truth that you don’t know and could be useful for developers and testers alike.
  • You really could do with full-time support staff.
  • Your customers appear to expect a higher level of support than they're entitled to.
  • Your customers run an OS that you don’t support.
  • Your customers run an OS that the manufacturers no longer support.
  • Your customers run a non-standard theme, an unusual resolution, and their desktop is littered with icons.
  • Your customer doesn’t realise that there’s a button to search the manual.
  • You never knew that the fonts used by your product look terrible when screen sharing.
  • Your customer is comparing your product to another and expecting the same functionality.

Stay Alert

The call has started. You’re on the call and you’re letting your support staff set the scene, explain what the call is about, introduce everyone to everyone else and so on.

Don’t zone out.

You are representing your company to the customer, so it’s important that you stay engaged. Really listen and be prepared to answer when called upon or be more actively involved if that’s your role on this call. Turn off any potential distractions such as instant messages, emails, or Twitter and, if you can, shut your laptop down altogether.

Depending on the format of the call, you may get a summary of the issue, or a demo of it, or be dropped straight into the detritus of a failure with consoles and log files open, or a machine wedged or a device exhibiting non-deterministic behaviour. To give yourself an increased chance of assisting your colleague and your customer, pay careful attention to everything you are shown.

Build A Shared Understanding

There’s usually time and goodwill enough at the start of a support call to establish what the point of the call is. The preparation you did can help you here too, and you'll have some idea of the pieces of the puzzle that are currently unknown or unclear. Take the ones that you think are likely to be most important and ask about them now.

It’s a good idea to have the issue stated in clear terms so that everyone is aware of what the perceived problem is and what the desired outcomes of the call are. This may be something that your customer will naturally do, but if not then it’s usually worth taking on yourself.

To build confidence in your customer interactions, you might ask your tech support colleague on the call to run through a list of questions up front. This could reassure you that the customer is receptive and takes your thoughts seriously.

Once a mutual understanding of the problem is reached, the real problem-solving work can begin. It’s better to get it right early on than spend a couple of hours finding out that you started in different places.

It’s worth remembering that your customer is not necessarily a reliable witness. Just like anyone else they will be subject to biases, assumptions about what can be taken for granted, filters on what information is relevant to the problem, and open to reporting their interpretation of what happened over what actually happened. This doesn’t mean that you should belittle or discredit their testimony, but you should apply your critical thinking to it as you would any other data.

Having the customer demonstrate the issue to you can be exceedingly valuable and could provide that aha! moment you need to solve the issue. Be mindful of where the customer is in your support process. If they have already shown the issue to a great number of people, asking for a demo of the issue again might not be so helpful. Joel Spolsky suggests:

Many requests for a customer to check something can be phrased this way. Instead of telling them to check a setting, tell them to change the setting and then change it back “just to make sure that the software writes out its settings.”

And if you feel the need to ask them to retry something, make sure you have a good reason.

Remember too that they may not use your terminology reliably. In fact, they might not use your terminology at all. For example, when a customer says “it crashes” they might mean any of numerous things, including that the thing actually crashes:

  • There’s an error message.
  • There’s a warning.
  • The operation never completes.
  • The operation completes but gives no feedback that it’s done so.
  • The application performs the expected action but pops up a dialog that the user can’t understand.
  • The application has no functional issue that they’re aware of but dumps a stack trace into the log.

You may need to calibrate your understanding of terms for the customer you’re speaking to, and if there are multiple people from the customer on the call, you may need to translate for them. Think carefully before correcting the customer’s terminology while on a call. Is it really important to debate the meaning of a word right now?

The reverse can also be true: the customer may well know more than you in some respects. So be humble and upfront about the limits of your knowledge. In fact, offering them the opportunity to display their knowledge can be a useful tactic in establishing trust. The same benefit can come from phrasing questions in a way that makes it clear you are seeking to understand their problem, or problem space, better. Starting a question with “can you help me to understand …” is a great example of this.

Diagnosis

You might not use the same tactics during a call as you would while testing in your day job. Time is almost always of the essence on support calls, so an approach which painstakingly enumerates permutations and grinds through them will generally not be appropriate. If getting to an understanding of the problem requires a lengthy approach, you should ask to do it offline and offer for your company to do as much of it as possible, e.g. by scripting it, or providing test data or whatever else makes sense.

You may not need to solve the problem on the call; you might only need to direct the next step of the investigation to the right area. Tactics which take a big-picture view and focus in on anomalies are likely to be more productive in the short term. For example, change half of the variables at once and see whether that has an effect, ruling out multiple possibilities in one step. By ruling out avenues that you are pretty certain have nothing to do with the problem, you are not only simplifying the problem but also showing the customer that progress is being made.

Diagnosing an issue in front of a customer is good practice in framing a hypothesis and designing an experiment to test it. You probably do this informally all the time when testing. Support gives you a chance to see how well you can verbalise and communicate it.

While you’re doing it, take care to keep the customer involved in the thought process. For example: “We’d like to try changing five of the ten options now to see whether the outcome changes. If it does we’ll dig into those five in more detail. If not, we’ll try the other five. Of course, there may be some kind of dependency between options, but let’s try the simple experiment first.”
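The halving tactic above is, in effect, a binary search over the changed options, in the same spirit as git bisect. Here’s a minimal sketch, assuming a single culprit option and a hypothetical `reproduces_issue` check that reports whether changing a given subset of options triggers the problem:

```python
def find_culprit(options, reproduces_issue):
    """Binary-search a list of options for the single one that triggers the issue.

    Assumes there is exactly one culprit, and that `reproduces_issue(subset)`
    returns True when changing just that subset is enough to see the problem.
    """
    candidates = list(options)
    while len(candidates) > 1:
        half = candidates[:len(candidates) // 2]
        if reproduces_issue(half):
            candidates = half                               # culprit is in this half
        else:
            candidates = candidates[len(candidates) // 2:]  # must be in the other half
    return candidates[0]


# Hypothetical example: ten options, where "option_7" is the troublemaker.
options = [f"option_{i}" for i in range(10)]
culprit = find_culprit(options, lambda subset: "option_7" in subset)
```

With ten options this takes a handful of checks rather than ten; on a live call each check is an experiment run with, and narrated to, the customer, so every halving saves real time.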

Explaining what you are doing and why creates a sense of trust between you and the customer. It allows the customer to gain a little more knowledge about the inner workings of your product. They may even value the product more now that they know how it works, what was broken and how to fix it.

Don’t be afraid to offer the customer their time back: “This experiment might take ten minutes. Would you like to get a coffee, or can we call you back?”

Wrapping Up

How long should a support call take?

It should be as short as possible, but no shorter. As is the nature of these open-ended tasks, what’s possible is not necessarily known or easy to predict. A call might not end with a solution but instead generate a new set of actions. Sometimes you’ll know that that’s the intention, in which case identifying a suitable end-point is relatively straightforward. Sometimes there’ll be a time box for the call, and then it just has to stop; when the call is constrained in advance, make sure that you monitor the time and give clear indications during the call about how long is left. Sometimes it’ll be apparent that the session is going nowhere.

Whatever the circumstances, though, someone will need to decide to bring the call to a close. The next article will address how to close out a call and suggest follow-up steps for you and those involved.

Key Takeaways

  • Stay humble; your role is to assist.
  • Watch and listen carefully.
  • Try to establish a common understanding of the perceived problem.
  • Your prep provides a checklist of assumptions and things to try.
  • Your prep provides questions for the customer.
  • Your prep provides comparisons — how does your experience differ from what the customer is seeing?
  • Look for efficient routes to diagnosis …
  • … or just a next step.
  • Explain to the customer what you’re doing …
  • … and why.
James Thomas

Quality Engineer

I'm Vice President of the Association for Software Testing, a non-profit organisation dedicated to the advancement of the testing craft. Over the years I've had many roles including developer, technical author, technical support, and manager, but the combination of intellectual, practical, and social challenges in testing is what really excites me. I blogged about my Test.bash() 2022 API Automation Challenge entry at https://qahiccupps.blogspot.com/2022/10/having-testblast.html

Neil Younger

Neil Younger has been helping to build successful products and happy teams in the UK for the last 20 years. From working with desktop applications to databases, banking to browsers, security to silicon, Neil continues to shape his craft and put people first.

Chris George

Chris George has been a software tester and question asker since 1996 working for a variety of UK companies making tools for database development, data reporting and digital content broadcasting. During that time he has explored, investigated, innovated, invented, planned, automated, stressed, reported, loaded, coded and estimated on both traditional (waterfall) and agile software teams. He also presents at software conferences on testing topics and sometimes writes a blog at Mostly Testing.


