A Few Citizen Scientists Do Most of the Work

Inkfish
By Elizabeth Preston
Feb 24, 2015 8:56 PMNov 19, 2019 8:37 PM

Newsletter

Sign up for our email newsletter for the latest science news
 

Nothing turns your internet procrastination time into feelings of goodwill and teamwork like a citizen science project. You can click through a set of penguin photos or moon craters and know that your data are contributing to real science. As more citizens take part, and more researchers discover the joys of free labor, these projects are gaining popularity. But not all citizen scientists pull their weight. In fact, most do nearly nothing. Henry Sauermann, a management professor at the Georgia Institute of Technology, is interested in the economics and organization of science. He's also curious about what motivates scientists. Sauermann and his coauthor, Chiara Franzoni of Politecnico di Milano, thought that citizen science would be "a wonderful new context to think about these general issues," Sauermann says. Sauermann and Franzoni gathered data on seven projects at Zooniverse.org, a citizen science web portal. These included six projects for classifying space pictures (Solar Stormwatch, Galaxy Zoo Supernovae, Galaxy Zoo Hubble, Moon Zoo, The Milkyway Project, and Planet Hunters) and one for transcribing handwritten ships' logs (Old Weather). The researchers looked at participation in each project during its first 180 days of existence. How many people had participated? How many hours did those people spend daily? And how many items did they classify? Across all six projects, 100,386 citizen scientists had participated. Not all projects were equally popular, though. The hottest was Planet Hunters, which attracted almost 29,000 users during its first 180 days. The least cool project, Galaxy Zoo Supernovae, had closer to 3,000 people participating. (Judging by the spikes and lulls in users, though, a project's popularity has a lot to do with whether people have seen it recently in a news article or on social media—galaxies aren't necessarily less popular than planets.) These workers contributed almost 130,000 hours of labor across all projects. To estimate the value of their work, Sauermann and Franzoni calculated how much this labor would have cost if it were done by undergrads at $12 an hour. They also calculated the cost for workers on Amazon Mechanical Turk, who work for pennies a task. Either way they crunched the numbers, the researchers found that the labor completed for free by citizen scientists would have otherwise cost more than $200,000 per project, on average. And that's just for the first 6 months. So far, pretty encouraging for the value of citizen science. But there was discouraging news too. Sauermann and Franzoni found a wide range in how much users participated. Across all projects, the top 10% of contributors did nearly 80% of the work. Most people, in fact, contributed just once and never came back. On average, almost three-quarters of citizen scientists were one-timers. And all the projects got less popular over time, even among the most enthusiastic users. "There are limitations" to citizen science, Sauermann says. Researchers may need to think carefully about how they're using the public. If a certain kind of classification has a steep learning curve, for example, researchers should keep in mind that most citizens won't stick around long enough to get good at it. And since volunteers inevitably drop off over time, they might have a harder time keeping up with a stream of real-time data than classifying archival information. "We need to understand how citizen science works and that it's more promising for some projects than for others," Sauermann says. "And organizers need to spend some time thinking about how to build good projects, and getting and keeping users involved." You can find all kinds of projects to dabble in, from kelp to cancer to climate science, at Zooniverse. If you find yourself among citizen science's elite, you'll impress Sauermann while you're helping out. "I have worked on most of the Zooniverse projects," he says. "But I am not getting anywhere near to the contributions of the top contributors there—it's amazing what some of them accomplish."

Image: from zooniverse.org.

Sauermann, H., & Franzoni, C. (2015). Crowd science user contribution patterns and their implications Proceedings of the National Academy of Sciences, 112 (3), 679-684 DOI: 10.1073/pnas.1408907112

0 free articles left
Want More? Get unlimited access for as low as $1.99/month

Already a subscriber?

Register or Log In

0 free articlesSubscribe
Discover Magazine Logo
Want more?

Keep reading for as low as $1.99!

Subscribe

Already a subscriber?

Register or Log In

More From Discover
Recommendations From Our Store
Shop Now
Stay Curious
Join
Our List

Sign up for our weekly science updates.

 
Subscribe
To The Magazine

Save up to 40% off the cover price when you subscribe to Discover magazine.

Copyright © 2024 Kalmbach Media Co.