When surveys behave badly … and what to do about it

Through a special arrangement, presented here for discussion is an excerpt of a current article from the Joel Rubinson on Marketing Research Consulting blog.

Bad survey results are toxic. They lead to false beliefs, inaccurate measurement and bad consumer segmentation that injure rather than nurture innovation and media strategies. "Fixing" a survey starts with good survey design principles, but it doesn’t end there. In the digital age, we need to "think beyond the survey" and integrate behavioral information.

Here are three reasons that surveys behave badly and what you can do about it.

Problem: Telescoping. Respondents recall purchases as more recent than they actually were, so surveys always elicit overstatement of brands bought over the past year, past three months, etc., leading to inaccurate estimates of market penetration and to misidentified users.

The fix: Anchor in behavior. Minimize telescoping by first asking about longer timeframes, then following up with the shorter timeframe you are interested in. To remove remaining discrepancies, calibrate (rim weight) the data to known brand incidence. Selected suppliers now offer the ability to survey respondents whose behavior is known from scanned purchases, frequent shopper data matching, etc., creating a true single-source approach. In subscription or retail businesses, you can also survey your own customers and match in transaction and clickstream data.
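To make the calibration step concrete, here is a minimal sketch of rim weighting (raking) in Python, assuming hypothetical column names and a known 18 percent brand penetration from scanner data; a production study would typically use a dedicated survey-weighting package and several rim variables at once.

```python
# A minimal sketch of rim weighting (raking); column names and the 18%
# benchmark are hypothetical.
import pandas as pd

def rake(df: pd.DataFrame, targets: dict, max_iter: int = 50, tol: float = 1e-6) -> pd.Series:
    """Return weights so that weighted category shares match known targets.

    targets: {column_name: {category: target_share}}, shares summing to 1.
    """
    w = pd.Series(1.0, index=df.index)
    for _ in range(max_iter):
        max_shift = 0.0
        for col, shares in targets.items():
            current = w.groupby(df[col]).sum() / w.sum()   # weighted shares right now
            ratios = {cat: share / current[cat]
                      for cat, share in shares.items()
                      if cat in current.index and current[cat] > 0}
            factor = df[col].map(ratios).fillna(1.0)       # scale each respondent's weight
            w = w * factor
            max_shift = max(max_shift, (factor - 1.0).abs().max())
        if max_shift < tol:
            break
    return w

# Calibrate reported purchase incidence to the external benchmark.
survey = pd.DataFrame({"bought_brand_x": ["yes", "no", "no", "yes", "no", "no"]})
survey["weight"] = rake(survey, {"bought_brand_x": {"yes": 0.18, "no": 0.82}})
print(survey.groupby("bought_brand_x")["weight"].sum() / survey["weight"].sum())
```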

Problem: Random answering of attitudinal questions when beliefs are weakly held. Attitude and usage studies often ask questions that the respondent doesn't really know how to answer, but guess what? They answer anyway! When you build a consumer segmentation on such data, the segments seem to make sense, but, sadly, a test-retest reliability experiment might show that the same respondent has only a 50 percent chance of falling into the same segment the second time around. Such segmentation is practically useless for media targeting.
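To see why test-retest reliability matters, here is a minimal sketch of how segment stability could be quantified for respondents who were assigned to segments in two waves; the respondent data and column names below are made up for illustration.

```python
# A minimal sketch of a test-retest stability check on segment assignments;
# the data here are invented.
import pandas as pd
from sklearn.metrics import cohen_kappa_score

waves = pd.DataFrame({
    "respondent_id":  [101, 102, 103, 104, 105, 106],
    "segment_wave_1": ["A", "B", "A", "C", "B", "C"],
    "segment_wave_2": ["A", "A", "A", "C", "C", "C"],
})

agreement = (waves["segment_wave_1"] == waves["segment_wave_2"]).mean()
kappa = cohen_kappa_score(waves["segment_wave_1"], waves["segment_wave_2"])

print(f"Same segment in both waves: {agreement:.0%}")  # raw repeatability
print(f"Cohen's kappa: {kappa:.2f}")                   # agreement beyond chance
```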

The fix: Anchor segmentation in digital and shopping behaviors. Survey people on fewer but more meaningful statements, and on media and purchase behaviors that are independently known. Create segments that are maximally different in purchasing behaviors and media habits as well as attitudes. That way, key segments can be targeted and personalized differently in digital media and will hold up over time. Finally, did you know that you can now listen to social media specifically for a segment of interest? The technology now exists to do the profile matching that accomplishes this.
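As an illustration of anchoring segments in behavior, the sketch below clusters respondents on a mix of known purchase and media behaviors plus an attitude item; the data are simulated and the features are hypothetical, but the point is that behavioral variables enter the segmentation directly rather than being profiled afterward.

```python
# A minimal sketch of behavior-anchored segmentation on simulated data.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n = 500
features = np.column_stack([
    rng.poisson(4, size=n),        # category purchases, from shopper data
    rng.exponential(3.0, size=n),  # weekly hours of metered digital video
    rng.integers(1, 6, size=n),    # "I like to try new brands" (1-5 scale)
])

X = StandardScaler().fit_transform(features)  # put behaviors and attitudes on a common scale
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(segments))                  # segment sizes
```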

Problem: Inaccurate recall of time spent on various media behaviors. In my experience, people underestimate the time spent watching TV and overestimate the time spent on various digital behaviors, notably social media.

The fix: Go single source. Conduct survey research among people whose behaviors are metered or recorded in some way. An award-winning study I consulted on, "Seven Shades of Mobile," did this; we surveyed people whose smartphone behavior was being metered.
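Once survey answers and metered records are joined for the same respondents, quantifying the recall gap by medium is straightforward; here is a minimal sketch with made-up numbers.

```python
# A minimal sketch of a claimed-vs.-metered comparison for the same
# respondents; all figures are invented for illustration.
import pandas as pd

merged = pd.DataFrame({
    "respondent_id": [1, 1, 2, 2],
    "medium":        ["TV", "social", "TV", "social"],
    "claimed_hours": [10.0, 14.0, 12.0, 9.0],
    "metered_hours": [16.0, 6.0, 18.0, 4.0],
})

merged["gap"] = merged["claimed_hours"] - merged["metered_hours"]
print(merged.groupby("medium")["gap"].mean())
# Negative gap = understated (often TV); positive gap = overstated (often social).
```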

The new news? In a digital age, think beyond the survey. Conduct surveys among people whose clickstream behaviors, transaction and shopping behaviors, social media profiles or Facebook likes are measured.

Discussion Questions

What do you see as the most common causes of inaccuracies in surveys? Will the use of social media and digital technologies enhance survey effectiveness?

David Biernbaum
9 years ago

At David Biernbaum & Associates LLC, we totally agree with every point made in this article. In fact, I would go as far as to say that most consumer goods surveys today are non-scientific, far too random and, yes, often single-sourced. For example, surveying only around the office, strictly online, or only among those who volunteer to be surveyed are all serious flaws. If your goal is truly to get accurate results, engage a survey expert, ideally from outside your organization, and be sure to apply the criteria mentioned in this article.

Dr. Stephen Needel
9 years ago

Joel’s points above suggest that surveys are inherently inaccurate. It’s not because people are lying (although that happens); it’s that they don’t recall the correct answers to our questions, or we’ve asked the questions badly. So there’s a cautionary note for all you DIYers.

I don’t think social media and digital technology per se help. Measuring behavior unobtrusively does, though, and sometimes that behavior is social media based or digitally based. When we test shopping behavior using virtual reality, we are all about the shopping, not a question about the shopping.

Roger Saunders
9 years ago

Joel’s common causes accurately point to issues that lead to flawed results on surveys, be they custom or syndicated. Other flaws creep into the equation for marketers as well.

Too often they fail to take a longitudinal approach. If an issue is important enough to warrant deep insight into a problem or opportunity, consider building a trackable database to see whether your target audience is behaving consistently over time.

Poorly written questionnaires lead to interview bias and respondent fatigue. Read and follow Princeton psychologist and author Daniel Kahneman’s work on System 1 thinking in “Thinking, Fast and Slow.” Don’t make the questions complex; the respondent is helping you.

Make the survey scalable. A survey of 100 respondents may not be enough to make a bet on a $10 million project, let alone a $100 million-plus one. Very few products or services have such a tight prospect group that you only need one segment.

Try to find out about the “complete” respondent: their sentiment, behavior, AND future spending plans for the category. Everybody knows how many black socks were sold last Christmas. What the smart marketer is seeking is insight into what is going to happen this Christmas.

Social and digital technologies have vastly altered and enhanced survey effectiveness. A short 15 years ago, 95 percent-plus of all surveys were completed via diary, telephone, or mall intercepts. Those methods may still have a place, but the smart money placed its chips on online survey methodology at the turn of the century. The caution on digital: be certain that you are not receiving a machine-to-machine response. We’re all still selling products and services to human beings. Be sure you’re listening to them.

Gajendra Ratnavel
9 years ago

The inherent issue with surveying is that you get the answers a person wants to give you, not necessarily the information you want to know. In some cases surveys work really well, but they need to be designed carefully, with identification of the areas where responses might be inaccurate.

Survey findings should also be backed up or supplemented with other data where possible before any important decisions are made as a result.

Ralph Jacobson
9 years ago

Survey questions must be reviewed by a team of editors to ensure a minimal level of ambiguity in the line of questioning. Only ask open-ended questions if you truly want “open-ended” responses. Otherwise, make the questions tactical and focused, with little or no room for interpretation of intent.

Will the use of social media and digital technologies enhance survey effectiveness? Not necessarily. I see these channels as additional vehicles for survey delivery; however, the content of the survey itself is the key.

Kevin Leifer
9 years ago

Great points in each of the comments here. My biggest word of caution would be to keep the survey focused on a clearly defined objective and avoid the tendency to ask “a few extra questions” someone may be curious about. “Surveys by committee” often grow out of control and dilute the essence of the objective.

Jonathan Marek
9 years ago

Lots of interesting points here, but I think commenters haven’t hit on the most fundamental issue: with respect to most of the surveys we conduct in retail and CPG, most survey respondents don’t themselves know why they behave the way they do or what will change their behavior. This is unlike much simpler surveys about clear decisions, e.g., for whom will you vote? And we all know how even those polls are far from perfect.

As a result, marketers cannot get what they really want from surveys—an understanding of what levers will drive consumer behavior. They might get ideas, but they should treat them with lots of skepticism.

This is why the world is moving towards more data analysis of actual behavior and more quick cadence, real-world testing. The real measure of whether a business idea works is how consumers really truly behave when presented with the idea, not how they say or think they will behave.

W. Frank Dell II, CMC
9 years ago

Wording of the questions is the biggest problem after sample design. Too many surveys lead the respondent. This occurs due to the way the question is formed and then asked. For really critical issues, it is best to ask the question two ways. Another problem with digital surveys is getting a valid sample for projection. The survey must have broad reach to be valid.

Joel Rubinson
9 years ago

Wonderful comments, thanks all. Let me give another dimension, which is the action levers uniquely created by integrating digital and social data. Currently, a CPG marketer might conduct a BASES test to determine the sales potential of a new product and present that to the retailer as evidence for listing, but that is pretty much the extent of it. Suppose a retailer had a research panel of shoppers where surveys could be linked to both frequent shopper data and clickstream behaviors. Now we could test a concept and then passively capture who the triers are for that new product. Then, the data science angle would be to use ALL available data to model who the triers are. This would tell the retailer/manufacturer how to personalize outreach to “lookalikes.”

Just one of the ways you need to think differently about integrating surveys and behavioral data.
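As an illustration only, not Joel’s actual model, the trier/lookalike step described above could be sketched as a simple propensity model trained on shoppers whose trial is observed and then scored against frequent-shopper and clickstream features; all data and feature names below are simulated.

```python
# A minimal sketch of lookalike scoring on simulated shopper data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.poisson(3, size=n),        # category trips, from frequent-shopper data
    rng.exponential(2.0, size=n),  # minutes on the retailer's site, from clickstream
    rng.integers(0, 2, size=n),    # exposed to the launch campaign (0/1)
])
tried = (rng.random(n) < 0.10 + 0.15 * X[:, 2]).astype(int)  # observed trial flag

model = LogisticRegression(max_iter=1000).fit(X, tried)
propensity = model.predict_proba(X)[:, 1]          # score every shopper
lookalikes = np.argsort(propensity)[::-1][:200]    # top prospects for personalized outreach
print(propensity[lookalikes[:5]])
```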

Gordon Arnold
9 years ago

Asking questions that do not allow subjects to express an opinion yields limited or no useful results. This is common, especially in quick surveys. Small and mismanaged samples create large possibilities for error. These issues, combined with management teams that are poorly schooled in statistics, are the reasons marketing teams produce so many surprises.