Group Consensus


Shannon Roddel | March 13, 2019

If you consult Angie’s List before hiring a plumber or a landscaper, Yelp before making a reservation at a new restaurant or Consumer Reports before upgrading your electronics, you’re not alone.

The growing number of crowd-sourcing sites shows the extent to which consumers rely on popular opinion. New research from the University of Notre Dame has found a way to improve the accuracy of such crowd-sourced estimates.

“Harnessing the Wisdom of Crowds” is forthcoming in the journal Management Science. Written by Zhi Da, professor of finance in Notre Dame’s Mendoza College of Business, and Xing Huang of Washington University in St. Louis, the study examines the effect of “herding” on the accuracy of quarterly earnings estimates on the crowd-sourcing platform Estimize.com. Estimize crowd-sources quarterly earnings-per-share estimates for publicly traded companies from some 86,000 professional analysts, amateurs and students.

“We analyzed individuals’ estimates of quarterly corporate earnings and found that the average of their estimations becomes more accurate when these individual estimates are made blindly or concurrently,” Da says. “When people have access to others’ estimates as they make their decisions, they tend to ‘herd’ with the group and the average group estimate can actually become less accurate. In essence, we become ‘individually smarter but collectively dumber.’”
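The intuition behind the quote can be illustrated with a toy simulation. The code below is a minimal sketch, not the authors' model: it assumes each forecaster receives a noisy private signal of the true earnings figure, and that a "herding" forecaster blends that signal with the running average of earlier reports (the blend weight `W` and all parameter values are hypothetical). Averaging blind reports washes out the noise; averaging herded reports overweights early opinions and leaves the consensus less accurate.

```python
import random

TRUTH = 5.0    # hypothetical "true" earnings per share
N = 50         # forecasters per earnings release
SIGMA = 1.0    # noise in each private signal
W = 0.7        # assumed weight a herding forecaster puts on the consensus
TRIALS = 2000  # earnings releases simulated per condition

def blind_mean(signals):
    # Blind/concurrent condition: everyone reports their own signal,
    # and the consensus is the plain average.
    return sum(signals) / len(signals)

def herded_mean(signals):
    # Herding condition: each forecaster after the first blends their
    # private signal with the running average of earlier reports, so
    # early opinions dominate the final consensus.
    reports = [signals[0]]
    for s in signals[1:]:
        consensus = sum(reports) / len(reports)
        reports.append(W * consensus + (1 - W) * s)
    return sum(reports) / len(reports)

def mse(estimator, seed):
    # Mean squared error of the consensus across many simulated releases.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(TRIALS):
        signals = [TRUTH + rng.gauss(0, SIGMA) for _ in range(N)]
        total += (estimator(signals) - TRUTH) ** 2
    return total / TRIALS

# Same seed in both conditions, so both see identical private signals.
blind_err = mse(blind_mean, seed=0)
herd_err = mse(herded_mean, seed=0)
print(f"blind consensus MSE:  {blind_err:.4f}")
print(f"herded consensus MSE: {herd_err:.4f}")
```

With these assumed parameters the blind consensus error is close to the theoretical SIGMA²/N, while the herded consensus error is several times larger, even though each individual herder's report is typically closer to the truth than their raw signal: individually smarter, collectively dumber.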

The researchers worked closely with Estimize to track and randomize the information sets of users, allowing them to cleanly isolate the impact of herding. The data came from 2,516 Estimize users who made estimates ahead of 2,147 earnings releases from 730 firms.


by Daily Domer Staff
