Customer Satisfaction Surveys Generally Positive

Discussion
Apr 18, 2011
Tom Ryan

A recent survey found that over one quarter of U.S. consumers have
completed a customer satisfaction survey in the last twelve months, with about
half of those responses positive. Over half say they fill out customer
satisfaction surveys to share a good experience (57 percent) or to help improve
the company (50 percent).

The survey of over 1,400 U.S. consumers was conducted
by Chadwick Martin Bailey.

Contrary to the common perception that people respond to satisfaction
surveys only when they are upset with the experience, the custom market
research firm found that 81 percent respond equally for positive and negative
experiences. Just four percent said they respond only when they have had a bad
experience, while 13 percent say they typically do so only when they have had
a very positive experience.

“Looking more deeply at the results, we see some unique segments of people
responding to these surveys,” said Jeff McKenna, senior consultant and
director of customer satisfaction solutions at Chadwick Martin Bailey, in a
statement. “This indicates to me an opportunity for companies to tailor
post-survey messaging and interactions if they choose to engage these people
more fully.”

He
added, “Overall, they are showing devotion to a company and, at the
very least, companies should be thanking the customers who provide insightful
comments, whether they are problems or praises.”

The study also found that
men and women have different motivations. For instance, 60 percent of women
participate in customer satisfaction surveys to share a good experience versus
52 percent of men. Most men (53 percent) fill them out to improve the company
compared to 47 percent of women. The biggest gender differences are around
discounts: 50 percent of women participate to receive a discount compared to
40 percent of men.

Discussion Questions: What do you think of the value of customer satisfaction surveys for retailers or brands? Are there right and wrong ways to conduct customer satisfaction surveys?


14 Comments on "Customer Satisfaction Surveys Generally Positive"


Dick Seesel
Guest
10 years 19 days ago

There are at least two key factors making customer surveys worthwhile. First, the consumer is empowered by the perception that the retailer is actually listening. And, more importantly, the survey results should lead to actual improvement if they show a clear-cut issue.

Perhaps the most revealing number is that slightly more than half of survey respondents responded positively. While the article presents it as a positive, the high “negatives” (almost 50%) show that a lot of retailers have work to do, if they want to satisfy their most vocal and committed customers.

David Livingston
Guest
10 years 19 days ago

The ones online to me are worthless. I click on them as fast as I can, just hitting “excellent” so I can get my miles or free hamburger. Never hit “poor” or say you had a problem. Then they want you to explain. I think most people just want to breeze through them as fast as they can in order to get their small reward.

Bob Phibbs
Guest
10 years 19 days ago

Another reason people fill out customer surveys, not addressed here, is the chance to win prizes, discounts, etc. “Pure” data is still elusive one way or the other.

Carol Spieckerman
Guest
10 years 19 days ago

I wonder if the results differ between surveys that are sent to consumers proactively (email surveys for hotel stays and restaurants that appear in one’s inbox, for example) and those that the consumer must seek out, either online or in print.

The problems that I’ve encountered with some surveys (and that make me bomb out) are:
1. Too long.
2. Tedious (answering a question a particular way triggers unending drill-downs, and/or the survey turns into a research project as I’m asked how many trips I’ve taken, the nature of them, and how many nights a year I spend in a particular hotel brand).
3. Misplaced specificity – I’ve found that the more specific a survey is, the less accurate the specifics (surveys that attempt to cover every scenario but end up missing the most important ones).

Marketers risk low response rates and inaccurate results when they don’t keep it short and easy.

Ed Rosenbaum
Guest
10 years 19 days ago

As mentioned in the article, men respond believing their comments will help the company improve in certain areas. I am one of those who believe, or maybe wish, that were the case. Nonetheless, I continue to respond when asked and remain hopeful I can help lead to change when necessary.

We have to believe these surveys have a meaning and a purpose. If they are done too frequently, we will become jaded and not respond, or the responses we do complete will be more critical than need be. Surveys have to be scheduled periodically and with a different purpose each time. They cannot all be “how did you like the store?” type questions. There must be a plan for acting once the survey is complete and the results are transmitted. If the C-level executives do not read them, nothing demonstrable will happen.

Jeff Hall
Guest
10 years 19 days ago

The inherent value in customer satisfaction surveys is foremost tied to a company or brand’s commitment to thoughtful survey design (efficiently gathering the most salient aspects of customer feedback) and judicious use of the resulting insights for ongoing customer engagement and performance improvement.

A key aspect of this is understanding how to balance the brand’s internal, generally operationally-driven perspective (the company lens) with the consumer’s expectations and perceptions of their actual experience (the customer lens) in order to ensure the resulting business intelligence is both relevant and actionable to every level of company stakeholder.

Optimally designed surveys not only take this balance into consideration, but also accurately identify the core elements of actual customer experiences that will drive advocacy and loyalty. This allows an organization to clearly understand where to focus its time and effort to be most customer-centric.

Eliott Olson
Guest
Eliott Olson
10 years 19 days ago

The survey shows that most people are considerate in their answers. There will always be outliers who like to play games, but they can be eliminated through statistical modeling and the establishment of baseline data, just as caller ID spoiled the fun of asking the drug store if they had Prince Albert in a can.
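The outlier screening mentioned above can be sketched with a simple baseline check. The following is a minimal illustration (hypothetical data, ratings assumed on a 1–5 scale, and a crude z-score cutoff standing in for a real statistical model), in Python:

```python
from statistics import mean, stdev

def filter_outlier_respondents(ratings, z_cut=1.5):
    """Drop respondents whose average rating deviates sharply
    from the baseline mean (a crude z-score screen)."""
    avgs = [mean(r) for r in ratings]
    mu, sigma = mean(avgs), stdev(avgs)
    if sigma == 0:
        return ratings  # no variation, nothing to screen
    return [r for r, a in zip(ratings, avgs)
            if abs(a - mu) / sigma <= z_cut]

# Hypothetical responses: most respondents are considerate;
# one straight-lines "5" on every question to grab the reward.
responses = [
    [4, 3, 4, 4], [3, 3, 2, 4], [4, 4, 3, 3],
    [3, 2, 3, 3], [5, 5, 5, 5],
]
kept = filter_outlier_respondents(responses)
```

A production screen would use a baseline built from historical data rather than a within-batch z-score, but the idea is the same: straight-liners stand out against the distribution of considerate answers.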

Camille P. Schuster, PhD.
Guest
10 years 19 days ago

People may choose to respond to a survey for a number of reasons. Making assumptions about why people choose to respond is not good science and does not provide an accurate foundation for strategy development. There are a LOT of customer satisfaction surveys bombarding consumers, and they probably do not answer all of them. It might be interesting to do a survey to find out why consumers respond to surveys and how they decide which ones to answer.

Ryan Mathews
Guest
10 years 19 days ago

I guess the answer depends on the survey.

Let’s look at customer satisfaction surveys like the ones you find on Amazon and eBay. If you want to give somebody perfect marks, no problem. If you want to give them less than perfect marks, you have to take the time to type in some observations. If you don’t, you can’t submit the survey.

So…my cynical bet is that some percentage of consumers just end up rating everything highly so they can get out. Of course I assume another substantial segment just quits. In either (or both) events you aren’t creating a very balanced sample.

Fabien Tiburce
Guest
Fabien Tiburce
10 years 19 days ago

The problem with solicited feedback is that it tends to be artificial. As others have mentioned, it is generally motivated by self-interest (getting a coupon or discount). For this reason, I believe the vast majority of solicited feedback is biased and generally unreliable. Unsolicited feedback, however (customer complaints, calls or emails to a brand’s customer service), is “raw” and genuine because it was triggered by a positive or negative customer experience worth commenting on. My recommendation would be to pay more attention to social media comments and tickets captured by your customer service department, and less attention to studies based on solicited feedback.

Craig Sundstrom
Guest
10 years 19 days ago

If I understand this correctly, CMB conducted a survey to see how many people participate in surveys (?)…suffice it to say that the type of people who don’t answer “real” surveys probably didn’t answer this one either; so yes, there is a definite problem in selection bias.

M. Jericho Banks PhD
Guest
M. Jericho Banks PhD
10 years 19 days ago

I duck all surveys, but the most difficult to avoid are those at the ends of telephone customer service calls or during in-store purchases. Without being rude, how can you be abrupt and dismissive to a sweet lady who’s helped you solve a problem? At Dell or AT&T, it’s “On a scale of one to five, how would you rate my service to you today?” Whadda’ ya’ gonna’ say? (Too many apostrophes?) You can’t just hang up. You give them a five every time. And at Staples while checking out, when the checker looks at you winsomely and asks why you don’t sign up for their frequent shopper program, don’t you feel compelled to respond? After all, these folks receive bonuses for high performance ratings and for signing shoppers up for frequent shopper cards. Would you deny them their bonuses?! I just can’t. I’m putty in their hands.

Lee Peterson
Guest
10 years 19 days ago

Surveys are good as a piece of the puzzle, but not the entire answer. There should be more in-depth periodic qualitative research and the occasional (annual?) mind-blowing quant study as well. But most important of all, the execs need to be out in the stores, talking to customers and associates all the time (weekly). To me, nothing matters more than that last one.

Even the best surveys can only tell you so much. It’s the concerted, regular effort to hear and see your customers and associates from all angles by all key execs that will have the most impact. The idea is to “know” your customer vs occasionally guessing what it all means. You can understand a little about a customer by reading a survey, but then you can go out and meet them and the people that serve them every day to get a MUCH better idea.

Tim Henderson
Guest
Tim Henderson
10 years 18 days ago

When used properly, customer satisfaction surveys can be useful tools in helping brands improve their products and services. The key is being willing to actually use the data to find out what works, what doesn’t and how the brand can improve. As this survey indicates, customers are willing to tell brands such info, but it’s up to brands to actually use it.

As for right/wrong ways to conduct surveys, my only suggestion is to field the survey – or slight variations of it – among key customer groups, e.g., members of the loyalty/reward program, in-store shoppers, online shoppers, etc. Different groups will have different needs, desires and experiences with the brand that need to be addressed.

One quibble with the Chadwick Martin Bailey folks, who use “devotion” to describe consumers who respond to satisfaction surveys. I’d say it’s more “engagement,” i.e., the respondent has indicated a willingness to engage with the brand, not a devotion to the brand. If the brand actually uses the respondent data to refine the shopping experience, perhaps they’ll then…

Take Our Instant Poll

How often should retailers be asking customers to fill out or participate in a customer satisfaction survey?
