Customer experience surveys are widely used in business today, and are generally regarded as an excellent way to gather invaluable feedback from customers.
But whether you’ve been running a customer survey for years or have never run one at all, it’s worth reflecting on the benefits they can offer – and on how to go about getting the most from them.
Why run a customer experience survey?
Quite simply, happy customers are loyal customers, and loyal customers deliver more recommendations and more repeat business.
A well-designed customer satisfaction survey will deliver several important business benefits:
- By monitoring customer opinion, you’ll have early warning of any potential problems that might cause you to lose revenue further down the line.
- It will tell you which improvements will do the most to boost customer loyalty and sales.
- It can also tell you what improvements/changes are unlikely to deliver much value.
- In short, it prioritises what you need to do to nurture a loyal customer base.
So, when a customer survey is well designed and used effectively, it serves as a positive vehicle for tangible business improvements.
However, it is possible to get it wrong. And if this happens you might end up with something that delivers limited actionable insight.
So, how do you ensure your customer survey falls into the former category rather than the latter? Here’s a list of pointers I’ve pulled together, based on over 30 years’ experience of designing and running such programs:
Make sure you set the right objectives to start with
Let’s start at the very beginning by looking at the overall business objectives for a customer survey. Get that wrong and you’ll be lucky to get any real value from the exercise.
The most common reason why such surveys fail is that they were designed as a box-ticking exercise from the very start. If a survey is only used to provide internal reassurance that all is well, it is never going to serve as an effective agent for change.
Fortunately, this kind of problem is rare. A more common issue is that these surveys are sometimes used exclusively in a limited, tactical way. Here, performance scores for each area of the business might be used to directly inform such things as bonuses and performance reviews. That’s all fine, but if this is the only tangible way in which the survey is used, it’s a missed opportunity.
Don’t get me wrong, there’s a value in using surveys to tactically monitor business performance. But their true value lies in using them to guide business improvement at a strategic level. If we lose sight of this goal, we won’t ever get the most out of such surveys.
Takeaway: The primary goal of a customer experience survey should move beyond monitoring performance to providing direction for business improvement.
Using standard templates is just a starting point
Moving on to how we go about designing a survey, the next trap people can fall into is taking the easy option. By that, I mean running a generic customer survey based on standard templates.
It is easy enough to find standard templates for customer experience surveys online. Most DIY survey tools will provide them as part of the service. Standard questions for such things as NPS and generic performance scoring are readily available.
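For reference, the standard NPS question asks customers how likely they are to recommend you on a 0–10 scale; the score itself is simply the percentage of promoters (scores of 9–10) minus the percentage of detractors (scores of 0–6). A minimal sketch of that calculation, with illustrative data:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Scores of 7-8 are 'passives' and count towards the total only.
    Result ranges from -100 (all detractors) to +100 (all promoters).
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical responses: 5 promoters, 3 passives, 2 detractors
print(nps([10, 9, 9, 10, 9, 8, 7, 7, 6, 3]))  # → 30
```

The headline number is easy to compute, which is part of why the question appears in virtually every template – but note that on its own it tells you nothing about *why* customers would or wouldn’t recommend you.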
But standard templates are, as the name implies, standard. They are designed to serve as a handy starting point – not an end in themselves.
There is nothing in any of them that will be unique or specific to any business. As a result, if you rely purely on a standard template, you’ll only get generic feedback.
That might be helpful up to a point, but to receive specific, actionable insight from a survey you need to tailor it to collect specific, actionable feedback from your customers. And that means you need to ask questions about issues that are specific to your business, not any business.
Takeaway: Only ever use standard templates as a starting point. Always tailor customer experience surveys to the specific needs of your business.
Avoid vague measures, focus on actionable ones
It may sound obvious, but it’s important to make sure you are measuring aspects of business performance that are clearly defined and meaningful. That means each measure needs to be specific, so there is no confusion over what it does or doesn’t mean when you come to look at the results.
Leaving these definitions too broad or too generic can make it very hard to interpret the feedback you get.
Let’s take an example – ‘quality’. What exactly does that mean? It might mean slightly different things in different industries. And it might mean different things to different people, even within the same organisation.
If your product is machinery, product quality could refer to reliability and its ability to run with minimal downtime. However, it might also relate to the quality of work the machine produces. Or perhaps, under certain circumstances, it might refer more to accuracy and precision? When you think about it, ‘quality’ could encompass a range of different things.
To avoid potentially confusing outcomes of this sort you need to use more specific phrasing. That way, when you identify an area that needs improvement, it’s clear what needs to be done.
Takeaway: Ensure you’re testing specific measures of business performance.
Always provide a mechanism for open feedback
By no means will everyone answer open-ended questions. Indeed, surveys can fall into the trap of asking too many of them, leading to a poor response.
However, one or two well-targeted open questions will provide invaluable feedback. They are a golden opportunity to pick up on issues and opportunities for improvement that you haven’t thought of – but which your customers have!
Takeaway: Always include one or two well-targeted open questions to elicit feedback from customers. But don’t add too many, or response rates will suffer and the quality of answers will be diluted.
Ensuring insight is actionable
Of course, you might already have a customer experience survey. Perhaps it has been running for years. If it is delivering good value then happy days. However, that’s not always the case.
Sometimes people find that the outputs from an existing customer experience survey are not particularly actionable. If that is the case, then it’s a clear warning sign you’re doing something wrong.
There are only two reasons why this ever happens:

1. Senior management in the business are incapable of driving positive change, even when they are provided with clear direction as to what they should be doing.
2. The survey was poorly designed in the first place and is unlikely ever to deliver anything actionable.
Unfortunately, the first of these problems can’t be solved by a survey – or any other form of market insight, come to that! But it is possible to do something about the second.
The answer is simple – you need to redesign your customer experience survey. Don’t keep re-running it and repeating the same old mistakes.
Takeaway: If your customer experience survey is not delivering actionable insight, stop running it. You need to either re-design it or save your money and not bother!
Legacy questions and survey bloat
Has your customer survey been running for several years now? Does the following pattern sound familiar?
- Every year, the previous year’s questionnaire gets circulated to survey stakeholders for feedback.
- Each stakeholder comes back with feedback that involves adding new questions, but rarely suggests taking any of the old questions away.
- Some of the new questions (perhaps all) relate to some very specific departmental initiatives.
- The questionnaire gets longer.
- The response rate goes down as a result.
- A year goes by and it may not be entirely clear what has been done with the outputs of some of these questions.
- The process repeats itself…
Of course, there is a benefit in maintaining consistency. However, there’s little point in measuring things that are no longer relevant to the business.
It may well be time for a more fundamental review.
Maybe even consider going back to square one and running some qualitative research with customers. Could you be missing something vitally important that a few open conversations with customers could reveal?
Alternatively, maybe you need to run some internal workshops. How well do current priorities really align with legacy questions in the survey?
Takeaway: If you think your customer survey has become overly bloated with legacy questions, don’t shy away from carrying out a full review.
Synchronix Research offers a full range of market research services, polling services and market research training. We can also provide technical content writing services & content writing services in relation to survey reporting and thought leadership.
For any questions or enquiries, please email us: firstname.lastname@example.org
You can read more about us on our website.
You can catch up with our past blog articles here.