Ten Things to Worry About When Writing Your Next Questionnaire

We’ve just completed another wave of text analytics projects for some of the UK’s more innovative Marketing and Customer Service directors. Beyond the standard structured surveys (a.k.a. “tick-box” questions or 1-to-9 scales), unstructured (or freeform verbal) answers allow respondents to give you more information in response to the question you asked.

These kinds of questions also allow the respondents to answer the questions they wished you’d asked them.

As we start to turn up more valuable insights, we’re being asked not just to analyse the responses, but to write surveys which are specifically designed to mix structured and unstructured data, in order to increase the quality and quantity of information our clients are receiving.

Our team’s expertise in communication design, training in sociology, and reading of behavioural economics (plus, of course, many late nights sweating over our results) have led us to formulate this list of ten things to bear in mind when you’re next writing a survey…

1.     Make sure your questions provide real alternatives for the respondent to explore.

While you might think a “scale of one to nine” style question for exploring someone’s opinions on an issue adds a little more dimension to proceedings, it is still structuring. And your structure is their restriction. You’re only providing one emotional axis for your respondent to think about. And since this is usually “good/bad”, you’re opening yourself up to information bias.

2.     Avoid providing answers where any single alternative is dominant.

For example, if you ask “Would you rather have (a) five pounds or (b) a punch in the face?” you should probably not be blown away when 9/10 Investor’s Chronicle readers prefer (a).

 

Hugo Weaving demonstrates this innovative new approach to saving.


Instead, offer a range of alternatives to assess how the customer really feels about what you’re offering. Even better, make the question open-ended, for example: “What monetary remuneration or gift would you like instead of a punch in the face?”

3.     Ensure the survey takes into account the context in which the respondent will be answering the questions.

This includes things like the place and time the questions will be answered (if it’s on Monday evening, do not expect a glowing review). It also includes things like your customers’ average spend and salary. Ask questions relevant to the customer’s circumstances, and you’ll get better results.

4.     Account for Social Desirability Bias.

People tend to answer questions in the way they think others will see as “correct”. This can mean over-reporting good behaviour, under-reporting bad behaviour, or straight-up lying about purchasing habits to maintain “face”. The classic example is the coffee buyer who claims to favour a “rich, dark roast”; actual purchase data shows that the same people almost always buy something weaker and milkier.

You can get around this one by asking questions indirectly and having your analysts join the dots, or by adopting a tone of voice that your audience is likely to respond more honestly to.

5.     Give the survey an intuitive and easy-to-appreciate “flow”.

Many CRM surveys we’ve seen recently follow a chain of logic useful only to the survey designer. If a document is structured in a way that makes sense to the reader, they’ll be more likely to read the next section.

6.     Have methods to measure the implicit associations between answers to different survey questions.

This is one way to spot/eliminate Social Desirability Bias and similar issues prompted by context. The way an individual responds to all the questions in a survey can sometimes tell you more about them than their individual responses. When considered statistically over a large number of respondents, this can reveal trends with potentially enormous commercial value.

We’ve seen this recently in our text analytics work with a leading UK retailer. Respondents giving an NPS score of 5 or below were much more likely to use certain words and phrases in their verbatims which were in no way related to qualities of “good” or “bad”. This allowed our linguistic analysts to draw sensible conclusions about the problems that really underlie these sentiment scores.
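To make the idea in point 6 concrete, here’s a minimal Python sketch using invented toy data. The simple word-rate comparison below stands in for the fuller statistical association measures an analyst would actually use, but it shows the principle: a word like “queue”, which carries no “good/bad” sentiment itself, can still be strongly associated with low NPS scores.

```python
# Toy (NPS score, verbatim) pairs -- illustrative data, not real survey results.
responses = [
    (3, "the queue at the till was far too long"),
    (2, "long queue and no staff on the till"),
    (9, "lovely store, friendly staff"),
    (10, "great layout and friendly staff"),
    (4, "waited ages in the queue"),
    (8, "quick visit, easy parking"),
]

def word_rates(responses, threshold=5):
    """For each word, return the fraction of low-score vs high-score
    verbatims that contain it."""
    low = [set(text.split()) for score, text in responses if score <= threshold]
    high = [set(text.split()) for score, text in responses if score > threshold]
    vocab = set().union(*low, *high)
    rates = {}
    for word in vocab:
        low_rate = sum(word in words for words in low) / len(low)
        high_rate = sum(word in words for words in high) / len(high)
        rates[word] = (low_rate, high_rate)
    return rates

rates = word_rates(responses)
# "queue" appears in every low-score verbatim and no high-score one,
# despite not being a sentiment word in its own right.
print(rates["queue"])  # -> (1.0, 0.0)
```

At scale, an analyst would replace the raw rate comparison with a proper test of association (e.g. a chi-square test) over thousands of verbatims, but the output is the same kind of insight: the operational issues hiding behind the sentiment scores.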

7.     Make sure your survey offsets the “Primacy” problem.

People pay most attention to the first and last items in a list, and neglect everything in the middle (strictly, the serial-position effect: primacy plus recency). We’ve found a good way to counteract this is to leave tags between questions that motivate or direct the respondent’s attention: “Now we’ve talked a bit about our stock, we’d like to ask some questions about the general appearance and layout of our stores. Don’t worry, this won’t take very long…”

8.     Allow full qualitative expression of opinions on complex issues.

“How do you feel about the current Syrian situation?”

“Oh, I’d give it a six. Maybe a seven on a good day.”

If you’ve never had the above conversation, you’ll get where we’re coming from here. Many issues are simply too complicated to deal with from a dropdown menu of pre-written responses or a numerical scale.

Until recently, dealing with these issues was a huge problem for pollsters. Fortunately, we now have the wonderful tools of text analytics and metadata analysis to help us draw these conclusions. There’s simply no way around it: if you want sensible answers to complex questions, you’ll have to use text analytics in conjunction with totally freeform survey questions.

9.     Make use of freeform questioning to ask the questions your customers want to answer.

Another of text analytics’ innovations is the ability it gives you to ask extremely vague questions and still get valuable answers. Traditional polling has generally been geared to asking the questions the pollster wants answered. However, this can itself distort responses, as it leaves the person answering frustrated that they cannot express the opinions they really want heard.

Text analytics can eliminate this distortion by prompting respondents to speak freely about what they want to speak about. These answers can then be taken together to deduce opinions on other, more specific issues.

10. Let your analysts design the questions, but let your copywriters write the survey.

Although its primary purpose is gathering information, your survey also acts as a connection point between you and a customer. It should therefore be working just as hard to engage them as your website does, and as your advertising does. This means it requires just as much attention in its actual composition as it does in the questions asked.

And the best thing about following these ten steps is that you’ll not only have a more engaging and less biased survey. You’ll also have the ability to tap into the almost unlimited potential of text analytics for marketing.

And that’s where we come in.

If you’re far too busy to remember all ten of these points (or have forgotten numbers 4 through 8 because of the primacy effect), please contact Chris and we’ll be happy to share some recent case studies in text analytics and advise on designing your next marketing or customer experience survey.