The Hardest Questions, Part I: Why Surveys Usually Fail

Arthur Poropat
6 min read · Aug 10, 2022

Whether they’re used for concept testing, customer satisfaction, or staff engagement, surveys have long been an extremely useful business tool. There have been millions conducted since the first modern social surveys a couple of centuries ago, so you’d think people have had plenty of time to figure out the best way to use them. Yet, how many times while answering a survey have you asked yourself, ‘What idiot wrote this stupid question?’

Neon question mark.
Photo by Emily Morter on Unsplash

The chances are that many of those ‘idiots’ were highly trained professionals. Just think how bad a survey can be when written by amateurs! Here are just some of the mistakes people make with surveys.

Too many questions

People often include far too many questions in their surveys, an example of the kid-in-the-candy-store effect. For kids in a candy store, someone else (usually a parent) is paying, so it’s easy to ask for more and more. With surveys, the people who ‘pay’ are the respondents, those customers or staff or members of the public who answer or ‘respond’ to a survey. That’s why survey designers keep asking just one more question, then another, and another, until the whole survey seems to go on, and on, and on…

Just like parents who ultimately say no to their candy-monsters, respondents have ways of letting designers know they’re asking for too much. It’s often as simple as not answering the survey or stopping half-way through, but that’s not the worst of it. Many respondents start rushing their answers while others get grumpy enough to take revenge on the designer by giving nonsense responses. Rushed or deliberately misleading answers are hard to spot and create real problems — imagine trying to separate useful responses from nonsense and you’ll have some idea why the few people who try usually fail.

Of course, few survey designers are stupid enough to create the 58-page monster Adam Ramshaw wrote about, but even a one-page survey can be too long in many situations. Respondents will notice if you’re wasting their time, so when it comes to survey questions: when in doubt, leave it out.

Questions you shouldn’t ask

One way to make surveys shorter is by removing questions that should never be asked. It may seem obvious, but surveys often ask about things the survey designer should already know, like asking someone who’s already been looking at your company’s webpage if they know about your company’s products.

More important is avoiding intrusive and irrelevant questions. So, although the recent shift from binary (male–female) to multi-option questions about gender may seem respectful, survey designers should first ask themselves whether they should ask about gender (or age, or race, etc., etc.) at all. In any case, while some things are clearly gender-linked, gender is mostly just associated with choices people make for other reasons.

For example, let’s assume that men are more likely to buy electric drills. Rather than assume gender is the reason, it would be more useful to explore why anyone wants a drill — or doesn’t. Asking a question about gender (or age, or family, etc., etc.) is simply lazy and often stops people asking questions that would be truly useful.

So, figure out what you really need to know instead of asking pointless or intrusive questions.

Biased questions

How would you feel about a question that asks how often you punch your partner each week, then only allows you to respond with the following options: 1–3; 4–7; 8 or more? The first time I came across a question like this was in an experiment in which the researchers wanted to make respondents angry, and it worked, for obvious reasons.

Obviously biased questions annoy people, yet somehow, survey designers still use them. I recently came across a survey asking about a hotel I’d visited, which included a question about my overall rating of the establishment. That would normally be fine, except the responses ranged from ‘satisfactory’ to ‘excellent’. Rather than forcing me to give a positive rating, it made me close the survey. Others may not respond so nicely, so check your questions for bias.

‘Impossible’ questions

Survey designers often assume people are far more self-aware than they are, asking questions for which people usually don’t know the answer. Most people struggle to remember what they had for breakfast, so they are unlikely to accurately recall the true answer to questions like ‘Why did you choose our product?’ or ‘Where did you first hear about our product?’ That won’t stop them trying, but the answers you’ll get will be their best guesses, which are probably no better than yours. Instead, stick to questions people can answer, such as what they like about something or what problems they’ve had.

Confusing questions

It’s important that survey questions are clear, so many survey designers try to make their questions highly specific. While good in principle, this often results in complex questions, like:

‘When compared with the last time you bought a similar product from a similar retailer, how well informed were you about the features and benefits of this product for your personal needs and circumstances over the next few months and years?’

People also write confusing questions when they want to reduce the number of questions without reducing the information they get. That often leads to double-barrelled questions, like:

‘How would you rate our efficiency and service?’

One of the trickiest sources of confusion is the language itself. All languages are somewhat ambiguous, which often leads to questions that meant one thing to the designer and something else to people responding to the survey. For example, a few years ago, I completed a survey that asked about my level of acceptance of various sexual activities. I’m highly accepting of what others do, provided they’re not forcing things on me, but the survey designer was interested in people’s acceptance of personally participating in those activities. Same wording, very different meaning, which resulted in survey feedback that could not have been a more inaccurate description of my lifestyle.

Asking the wrong people

These problems with questions are bad enough, but even the best surveys will give the wrong answers when sent to the wrong people. It’s a lot like trying to measure a Toyota by looking at a Honda — you’ll end up with results that are precisely wrong.

Yet sometimes those measurements will happen to match, because the cars were designed for the same purpose. You’ll just never know until you measure the right one.

Surveys aren’t exactly measurement, but the same principle applies: if you want to know how your staff are thinking, ask them, not your managers; if you’re interested in adolescent attitudes, ask teenagers, not teachers.

Seems obvious, until it’s not. I’m regularly sent surveys about products I’ll never use, designed for interests I don’t have, from organisations I don’t care about. While some of those surveys are spam or worse, many of them are legitimate: just incredibly badly targeted.

I’m aware that many of these surveys are sent on the principle that any information is better than no information, but that is only true if the information is relevant. You’ll never know if it’s relevant without asking the correct people.

Conclusion

These are just some of the more common ways that surveys go wrong. The people who make these mistakes are usually well-intentioned, yet I’ve seen the same problems in surveys from large corporations, news outlets, government agencies, and especially from small businesses and startups. The culprits are often very well-trained: one of the examples above comes from a textbook written by a leading researcher in psychological science. In other words, none of us is safe from writing really bad survey questions, the kind that leave respondents wondering about our mental ability while producing deceptive nonsense.

The main cause for these problems is simply that people are weird: you can’t measure a person or their ideas the same way that carpenters measure a piece of wood or electricians measure electrical current. However, there are some relatively simple ways to fix surveys to make it much more likely that you’ll get something useful. That’s what I talk about in The Hardest Questions, Part II: Making Surveys Work, but feel free to get in touch if you’d like to talk about your questions.


Arthur Poropat

Arthur’s work on personality, leadership, & performance helps people work together, bringing the best out of each other.