Is there really wisdom in the crowd?

Discussion
Feb 11, 2019

Knowledge@Wharton

Presented here for discussion is a summary of a current article published with permission from Knowledge@Wharton, the online research and business analysis journal of the Wharton School of the University of Pennsylvania.

As predictive analytics become more sophisticated, companies are increasingly relying on aggregated data to help them with everything from marketing to new product lines. But how much should firms trust the wisdom of the crowd?

In his latest research, Wharton marketing professor John McCoy proposes a new solution for crowdsourcing that can help create better, more accurate results: Instead of going with the most popular answer to a question, choose the answer that is “surprisingly popular.”

“If you think about doing something like majority vote, what you’re doing is just taking the most popular answer,” said Prof. McCoy in an interview with Knowledge@Wharton. “You’re taking the most frequent answer that people give. We say instead that the right thing to do is take what we call the surprisingly popular answer.”

His paper, titled “A Solution to the Single-question Crowd Wisdom Problem,” suggests the crowd should be asked for two responses to a query: their own answer and their predictions about the answers of others in the crowd. Prof. McCoy added, “Then, taking the surprisingly popular answer means looking at both the actual vote frequency and the predictive vote frequency, and choosing the answer where the actual vote frequency exceeds predictive vote frequency.”
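The rule described above can be sketched in a few lines of Python for a yes/no question. This is a minimal illustration of the idea, not code from the paper; the function name and sample numbers are assumptions.

```python
# A minimal sketch of the "surprisingly popular" rule for a binary
# question. Each respondent gives two inputs: their own answer and
# their estimate of the fraction of the crowd answering "yes".

def surprisingly_popular(responses):
    """responses: list of (answer, predicted_yes_share) pairs, where
    answer is "yes" or "no" and predicted_yes_share is that person's
    estimate of the fraction of the crowd answering "yes"."""
    n = len(responses)
    actual_yes = sum(1 for answer, _ in responses if answer == "yes") / n
    predicted_yes = sum(pred for _, pred in responses) / n
    # Choose the answer whose actual frequency exceeds its predicted
    # frequency; for a binary question, comparing "yes" is sufficient.
    return "yes" if actual_yes > predicted_yes else "no"

# Illustrative numbers echoing the Philadelphia example: only 35% answer
# "no" (the correct answer), but the crowd predicts ~25% will say "no",
# so "no" is surprisingly popular even though it loses the majority vote.
votes = [("no", 0.75)] * 35 + [("yes", 0.75)] * 65
print(surprisingly_popular(votes))  # prints "no"
```

Note that a simple majority vote over the same data would return "yes"; the second question is what lets the minority answer win.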

As an example, he noted that only a third of a crowd of MIT students correctly answered that Philadelphia is not the capital of Pennsylvania. (It’s Harrisburg.) But even fewer, 23 percent, predicted that other survey respondents would say Philadelphia wasn’t the capital, which made the correct answer the surprisingly popular one.

Prof. McCoy said some companies are adapting their methods for extracting crowd-sourced information by weighting votes by competence and using prediction markets. Seeking two inputs — the individual’s own answers and their predictions about the answers of others — can remove some of the bias that comes from a person recognizing they’re being asked a question.

As shown by the “Is Philadelphia the capital of Pennsylvania?” question, the two-question technique also helps identify the correct answer for queries where the majority of the public answers incorrectly. Said Prof. McCoy, “I think one big lesson is that the crowd is a lot smarter than many people give it credit for.”

DISCUSSION QUESTIONS: Does it make sense that crowd-sourced “surprisingly popular” answers are often more accurate than the majority vote? When is crowdsourced data most useful for research purposes?

Braintrust
"The best way to get accurate crowdsourced datasets is to test your methods and increase your sample size. The more answers, the better."
"Be careful putting too much credibility into “mob mentality.” Focus on real-time personalization and deep shopper analytics..."
"There are plenty of examples where the crowd has out-performed other methods of problem solving."


19 Comments on "Is there really wisdom in the crowd?"


Dr. Stephen Needel
BrainTrust

There has never been much evidence to support the notion of the “wisdom of the crowd.” Social psychologists have known for years that the crowd usually devolves to its lowest common denominator when behaving like a crowd. Behaving individually as a crowd only produces a “better” answer under a very specific set of circumstances, which turn out almost never to be the case. But maybe the bigger question should be, “Aren’t we appalled that only a third of MIT students know that Philadelphia is not the capital of Pennsylvania?”

Craig Sundstrom
Guest

Dare we ask how many know what the capital of Massachusetts is? (Though lucky for them, the easy answer is the correct one … perhaps an example of your “specific circumstances.”)

Jeff Sward
BrainTrust

I think there is a difference between measuring opinions and measuring actions. In apparel, if a customer is surveyed about a given style, then “love it!” is a very easy answer to give. But how many “love it!” responses actually result in a purchase? In my experience, that’s how a lot of merchandising and editing meetings go off the rails. “Love it!” becomes the driver. Nobody wants to be a naysayer to the design team. But that doesn’t necessarily create the best commercial assortment. It’s a tricky balance to strike: great fashion storytelling AND an assortment balanced for risk.

Cynthia Holcomb
BrainTrust

Good point, Jeff, about “great fashion storytelling.” “Love it” is the Pinterest mantra! Aspiration vs. inspiration vs. the reality of the real world: we buy clothes we like to wear on our bodies, matching our fit, look and feel preferences. A lack of preference matching is always evidenced by the price tags still hanging in one’s closet.

Min-Jee Hwang
Guest

As humans, we’re influenced by countless things, most of the time without us even realizing it. The best way to get accurate crowdsourced datasets is to test your methods and increase your sample size. The more answers, the better.

Dr. Stephen Needel
BrainTrust

That increasing your sample size gets you better answers goes against everything we know as market researchers and statisticians.

Ryan Mathews
BrainTrust

I’ve often wondered where the idea of the wisdom of the crowd came from, since crowds are also the source of mass hysteria, mobs, etc. But, on an only slightly more serious note, in an age of social media that spreads memes more rapidly than the flu virus — and often with more harmful effects — the “wisdom” crowds embrace is increasingly a form of amplified feedback, a reflection of what the crowd is hearing the crowd think and say. So, I think the surprisingly popular concept is a valid one. After all, you want to be at the head of a trend, not its echo.

Ian Percy
BrainTrust

Thanks for putting the word “wisdom” in quotes. At least I’m assuming we all use quotes that way when we know the word is being misused.

James Tenser
BrainTrust

Great observations here. Even the clever question, “What would others say?” may be subject to the echo-chamber confound.

The “surprisingly popular” crowd response reminds me a bit of Fred Reichheld’s “likelihood to recommend” question. While crowd wisdom may be convenient to measure in this way, it may also lack diagnostic utility.

Mohamed Amer
BrainTrust

The core assumption here is that there is wisdom in the crowd that can be extracted, and that novel ways of asking questions can increase the utility and predictability of the outcomes. The work by Professor John McCoy and colleagues treats crowdsourcing as having predictive power that can be refined and improved by going beyond simple majority outcomes, asking the right kinds of questions and combining the results.

This research suggests that crowdsourced data — properly collected and analyzed — can be a powerful predictor for new product introductions and new service offerings, provided the contextual setting is taken into account.

Kenneth Leung
BrainTrust

The interesting thing about crowd-driven answers is that they are based on current perception and knowledge. I don’t think crowdsourcing would have predicted the iPod, iPhone or social media, because those were disruptive innovations. If you had crowdsourced the answer to whether the earth goes around the sun in the 15th century, I’m not sure you would have gotten the right answer either. 🙂

Cynthia Holcomb
BrainTrust

Crowdsourced data is generalized data, skewed by individual human interpretation of the question[s]. Layering “surprisingly popular” on top of crowdsourced data might work for static questions, like what is the capital of Pennsylvania, but really, do we need crowdsourcing to discover the capital of Pennsylvania? “Surprisingly popular” and crowdsourced data are a snapshot in time, not forward-looking. Especially fleeting are the “popular” insights of the crowd. Good research queries are based on deep insight into the business question the researcher is truly trying to solve.

Shawn Harris
BrainTrust

There are plenty of examples where the crowd has outperformed other methods of problem solving. Recall Joy’s Law of Management: “no matter who you are, most of the smartest people work for someone else…” — Bill Joy. Even NASA uses crowdsourcing to solve some of its most pressing problems, awarding over $20 million in prize money. There is also an AI firm, Unanimous AI, that has successfully combined human crowdsourcing with AI to surface counterintuitive, valuable insights. It works!

Doug Garnett
BrainTrust

First, what Prof. McCoy is describing is not crowdsourcing – it’s a fundamental principle of marketing research. Too many companies accept the most popular answer. But superb research teams are always looking for the surprises — those things that reveal something unexpected that gives us more insight.

Then, he suggests shifting to a wickedly bad research approach. One- and two-question research does not reveal anything useful, because people are too complex for it.

To give him credit, he’s attempting to reduce a massively complex problem to smaller bits — and is using a common sociological approach. But this approach is rarely successful. Because, as I came to see reading Rick Nason’s book on complexity, when you face a complex problem and believe it can be solved by reduction, those answers always fail you.

Evan Snively
BrainTrust

I’d be interested in understanding the usefulness of this application for factual versus opinion questions. In the given example about Philly, the question asked has a factual “right or wrong” answer, so it is less abstract in the sense that the participant doesn’t need to weigh two or more preferences that might both have positive and valid outcomes. Think: I like style A of a dress the most, although I like style B nearly as much. People might over-index on thinking they are unique and answer the second question more conservatively in that case. Or, if they were made aware that they are comparing their opinion with what they think the opinion of the masses is, that might alter their answer to the first question in order to appeal to their sense of individuality. Those complications aren’t necessarily present in a “true/false” scenario.

Ian Percy
BrainTrust

Let’s get some definitions straight here, because it makes a huge difference. This article and the approach it’s advocating have absolutely nothing to do with wisdom! At best they have to do with information. Apples to oranges for sure.

Wisdom is future-focused. It’s a rare gift that draws on information, experience and intuition to guide us effectively into the future, where there is no data – only possibilities. You’ll never see the word “wise” on the list of qualifications for any job, including … well, you know which one. Wisdom is the rarest of all attributes and the one we should all aspire to.

Ralph Jacobson
BrainTrust

Be careful putting too much credibility into “mob mentality.” Focus on real-time personalization and deep shopper analytics to find what really matters to your audience.

Patricia Vekich Waldron
Staff

Using crowd-sourced data is another great way to inform decisions, but its use at this point is still part art, part science, says this former Philadelphia resident. 🙂

Craig Sundstrom
Guest

I didn’t read the study, but I’m not sure a question about the capital of Pennsylvania — i.e. a “right/wrong” question — has much applicability in retail. People are usually quizzed about their habits or beliefs, not factual issues, and the reason they’re being asked is that we believe they’re the ones most likely to know the answer. Now of course, those familiar with research know responses often aren’t accurate, and use their knowledge to try to translate what people said into what they SHOULD have said, but I’m dubious the “vox populi” will come up with equally good info … so basically it’s either GIGO or “you get what you pay for.” Take your pick.


Take Our Instant Poll

How valuable do you find crowdsourced data versus focus groups and other forms of traditional customer research?
