Feel free to thump your chest and exchange high-fives before Sunday's big game, because thanks to crowdsourcing, common folk have outsmarted the ESPN experts. This past summer, a crowdsourcing company called Crowdflower set out to see whether the wisdom of crowds could best ESPN's pundits at predicting the season's most valuable football players. Against the power of crowdsourced labor from Amazon's Mechanical Turk site, the ESPN list didn't stand a chance: the crowdsourcers beat the experts hands down, and the gap is especially clear in the ranking of the top 25 players. New Scientist reports:
Before the season started, Crowdflower had 550 workers vote on which one of a pair of players would be the more valuable member of a fantasy league team. Stats on the players were available for those who wanted help, but complete novices were warned off. "If you think football is a game where you're really only allowed to touch the ball with your feet, this probably isn't the job for you," read the advert.
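The article doesn't say how Crowdflower turned those head-to-head votes into a ranked list, but one simple, commonly used approach is to score each player by the share of matchups they win across all the workers' judgments. Here's a rough sketch in Python, with invented player names and votes purely for illustration:

```python
from collections import defaultdict

def rank_from_pairwise_votes(votes):
    """Rank players by win rate, given votes as (winner, loser) tuples,
    one tuple per worker judgment. This is an assumed aggregation scheme,
    not Crowdflower's actual method."""
    wins = defaultdict(int)
    appearances = defaultdict(int)
    for winner, loser in votes:
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1
    # Sort players by the fraction of their matchups they won.
    return sorted(appearances, key=lambda p: wins[p] / appearances[p], reverse=True)

# Toy example with made-up votes.
votes = [("Player A", "Player B"), ("Player A", "Player C"),
         ("Player B", "Player C"), ("Player C", "Player B")]
print(rank_from_pairwise_votes(votes))  # ['Player A', 'Player B', 'Player C']
```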
But how exactly does crowdsourcing harness such soothsaying powers? From New Scientist:
"The simple answer is that we got answers from a large number of individuals, so the influence of one individual's bias is smaller," says Crowdflower's Josh Eveleth. "People who were uninformed would tend to cancel each other out, so any significant trend would be meaningful. We had a much larger pool than ESPN did. And because our crowd responded independently of each other, they were less likely to be influenced by groupthink than the ESPN experts."
We wonder if the NFL will get the memo, and start crowdsourcing its draft picks...

Related Content:
80beats: Crowdsourced Science: 5 Ways You Can Help the Hive-Mind
80beats: Crowdsourced Astronomy Project Discovers “Green Pea” Galaxies
80beats: Finally: N.F.L. Issues New Concussion Rules To Protect Players’ Brains
DISCOVER: What Happens to a Linebacker's Neurons?
DISCOVER: Who Really Won The Superbowl?
Image: flickr / Ed Yourdon