How to maximise customer feedback? | What are industry-leading response rates?


Response rates measure the proportion of people who respond, expressed as a percentage of the population who had an opportunity to respond. In the case of surveys and polls, we are concerned with what proportion of people who encounter a survey actually provide a response.
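As a simple illustration of the calculation (the figures below are invented for the example, not taken from any real survey), the response rate is just responses divided by the number of people who had the opportunity to respond:

```python
def response_rate(responses: int, opportunities: int) -> float:
    """Response rate: responses as a percentage of the people who had
    an opportunity to respond (e.g. everyone who saw the survey link)."""
    if opportunities == 0:
        return 0.0
    return 100.0 * responses / opportunities

# Illustrative figures only: 180 responses from 1,200 people who encountered the survey
print(f"{response_rate(180, 1200):.1f}%")  # prints 15.0%
```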
There is no universal standard for this. Whether people are likely to respond to a survey depends on a large number of factors:
  1. Whether they noticed and recognised the link to open the survey
  2. How they felt about responding to a survey
  3. How much time they felt the survey might consume
  4. How much effort (thinking) they felt would be involved
  5. Their expectation as to whether their input would be ‘heard’
  6. Their confidence that their efforts would ‘make a difference’
  7. Their trust in the motives of the surveyor
… among others, both personal and environmental.
A weakness in any one of these factors is sufficient cause for an individual to ignore the survey. Maximising responses is therefore a matter of getting these right for most people most of the time. The following principles will help you to do this.

Make your survey link easy to identify

It is very likely that your intended respondents are busy people. They rush from one task to another, and tend to skim past things until something catches their interest. For this reason, it is easy for survey links to be overlooked. Make your link clear but unobtrusive: a slightly larger font, a different colour, or an appropriate simple graphic can all help. Place it somewhere easy to see, at a point where a survey might reasonably be expected.

Pose your question in an engaging way

Use a short and simple form of words that the respondent can readily understand. For instance, ‘How did I do?’ is an obvious and engaging question at the end of a piece of service delivery. Linking the question to a few simple, easy-to-grasp options for answering it makes it more likely that the reader will respond. Inspirometer uses a simple set of emoticons (faces) which enable people to identify their answer quickly, almost subconsciously.

Make it quick, and make it clear it's quick

People are wary of survey links. Most have had some experience of being ‘involved’ in something longer and more complex than they wanted. Keep it short and simple. The main question to answer is ‘Was it good?’ – all other questions are secondary, and our focus on them can become an obstacle to people answering the primary question. Inspirometer’s approach is to pose one question which can be answered by a single click on one of six faces, right where they are. Make sure they know they can answer in just one click, and that anything more is entirely optional.
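As a rough sketch of how a single-click mechanism of this kind can work (the endpoint URL, parameter names, and faces below are illustrative assumptions, not Inspirometer's actual implementation), each face is simply a link that already carries the answer, so clicking it is the whole response:

```python
# A minimal sketch of a one-click emoticon survey link set.
# The endpoint URL, parameter names, and faces are illustrative assumptions.
FEEDBACK_URL = "https://feedback.example.com/respond"  # hypothetical endpoint
FACES = ["😠", "🙁", "😕", "🙂", "😃", "😍"]            # six faces, worst to best

def single_click_links(survey_id: str) -> str:
    """Build six links; each one already encodes its rating, so a single
    click is a complete response - no form to fill in afterwards."""
    links = [
        f'<a href="{FEEDBACK_URL}?survey={survey_id}&rating={rating}">{face}</a>'
        for rating, face in enumerate(FACES, start=1)
    ]
    return " ".join(links)

print(single_click_links("service-desk"))
```

Because the rating travels with the link itself, the snippet can be embedded wherever the service is delivered – an email footer, a receipt, a closing screen – which is what makes the next principle practical.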

Ask when the answer is on their mind

The advantage of a single click system is that it has much greater potential for being co-located with the service to which it relates. This means that it can survey people at a point when their experience is fresh, and when they are immediately aware of the answer – so there is no additional thinking effort involved. Catching people ‘in the flow’ makes it far more likely that they will respond as part of that flow – particularly if people believe that their response will be valued (see the next two principles).

Explain what happens to their response

Nobody likes speaking into a vacuum. Even a single click is ‘too much’ if people doubt that anybody is listening. While many may simply assume that their response will be heard, others may be more sceptical, and we need to be ready to answer such concerns if they arise. A simple information link, explaining how their response will be used, can help allay such questions. The same is true for concerns around anonymity – could a response adversely influence the service the respondent receives in future? If responses are anonymous, make this clear to people.

Show them the difference it makes

While all of the above can generate a good response rate, it is not uncommon for the response rate to drop off over time. This is usually because even a single click feels like a waste of time if it changes nothing. Finding ways to keep your respondents informed of what is changing as a result of their feedback helps to reinforce the idea that their ‘clicks’ are adding value. This can be achieved by periodic communications to known communities of respondents, or by including a link such as ‘As a result of your feedback …’ which shares the results and what has changed – but you will need to keep this fresh.

Build a relationship of collaboration

Perhaps the most positive impact on response rates comes from actively engaging the respondent community (whether they be customers, employees, partners, or some other group) as partners in ensuring they get the best service possible. Explain to them what you are trying to do and how their ideas and opinions fit into this. Position the feedback mechanism as a quick and easy way of capturing those views over time, and answer any concerns they may have about it. Then set a target for what a healthy and productive relationship might look like in terms of scores and response rates.

What difference does the survey tool make?

Individual survey tools cannot control all of these factors, and so we find that actual response rates vary by application. However, where single-click emoticon-based survey tools (sometimes called micro-polls) have been used to replace existing survey mechanisms, they have enabled a three- to ten-fold increase in response rates. The graph on the right relates to an Inspirometer implementation at Microsoft, where it has achieved response rates as high as 60% in certain service applications.

Is this too simple?

Single-click responses using emoticons are as simple and effortless as it is possible to get. This simplicity is instrumental in achieving industry-leading response rates in whatever situation they are used.
This simplicity may at first glance appear to be an obstacle to gaining a complete understanding. However, if you combine the increased response rates with the ability to deploy these tools almost anywhere, the simple snapshots they capture are easily pulled together into a comprehensive and informative picture of service across the organisation. An example of this is the dashboard on the right, where they have been used to develop a picture of meeting effectiveness across an organisation.
Simple and easy-to-use survey tools are therefore not restricted to ‘customers’ in the conventional sense of the term; they can just as easily be applied to employees and communities – to meeting effectiveness, cross-working, culture, creativity, and teamwork. In fact, they can create a real-time quantification of all of the soft aspects that have traditionally been almost impossible to measure.
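As a rough illustration of how such one-click snapshots might be rolled up into a picture like the meetings dashboard mentioned above (the data, meeting names, and six-point scale below are invented for the example), a simple aggregation per meeting series is enough to start:

```python
from collections import defaultdict
from statistics import mean

# Invented sample data: (meeting series, one-click rating on a 1-6 face scale)
clicks = [
    ("Weekly ops review", 5), ("Weekly ops review", 4), ("Weekly ops review", 6),
    ("Project steering", 2), ("Project steering", 3),
    ("Team stand-up", 6), ("Team stand-up", 5),
]

by_meeting = defaultdict(list)
for meeting, rating in clicks:
    by_meeting[meeting].append(rating)

# A minimal dashboard row per meeting series: average score and response count
for meeting, ratings in sorted(by_meeting.items()):
    print(f"{meeting:20s} avg {mean(ratings):.1f}/6 ({len(ratings)} responses)")
```

A real dashboard would add segmentation by team, time period, and trend, but the underlying principle is the same: many near-effortless responses, aggregated into a live picture.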