
Facebook Experiments on Users, Faces Blowback

The social network finds that toying with emotions can be dangerous.

By Michael Fitzgerald
Nov 26, 2014 6:00 AM | Updated Nov 12, 2019 6:22 AM
(Image credit: Ted Soqui/Corbis)


People don’t often get riled up about research, but when Facebook toyed with its members’ emotions without telling them, it stirred up plenty of feelings offline as well.

In June, researchers revealed in the Proceedings of the National Academy of Sciences that they had manipulated the news feeds of unsuspecting Facebook users, influencing whether those users felt positive or negative emotions. News of the experiment angered scores of users and privacy advocates. Before long, the journal distanced itself from its decision to publish the study, issuing an "Editorial Expression of Concern" that acknowledged participants may not have known they were guinea pigs and never got the chance to opt out.

The experiment, conducted over one week in January 2012, looked at “emotional contagion” — whether user emotions would affect other users’ emotions online, as they do in person. Facebook used an algorithm to weed out posts with positive or negative words in the news feeds of 690,000 of its 1.3 billion users. Users who saw positive posts were more likely to post positive things than users who saw negative posts.
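Facebook has not released the experiment's code, but the published paper describes the mechanism: posts were classified by whether they contained positive or negative emotion words (using the LIWC word-count software), and qualifying posts were omitted from a user's feed with a fixed per-user probability. The sketch below is a minimal, hypothetical Python rendering of that filtering step; the word lists, function names, and probability are illustrative assumptions, not Facebook's actual implementation.

import random

# Hypothetical word lists for illustration; the actual study classified
# posts with the LIWC word-count software, not hand-picked sets like these.
POSITIVE_WORDS = {"happy", "love", "great", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "awful"}

def contains_any(post, words):
    """Return True if the post contains any word from the given set."""
    return any(token in words for token in post.lower().split())

def filter_feed(posts, suppressed_words, omit_prob):
    """Omit each post containing a suppressed emotion word with probability
    omit_prob. The paper reports a per-user omission chance of 10-90%;
    omitted posts remained visible elsewhere on the site."""
    return [
        post for post in posts
        if not (contains_any(post, suppressed_words)
                and random.random() < omit_prob)
    ]

# Example: the "reduced positivity" condition applied to one user's feed.
feed = ["What a wonderful day!",
        "Traffic was terrible this morning.",
        "Meeting moved to 3pm."]
print(filter_feed(feed, POSITIVE_WORDS, omit_prob=0.5))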

Privacy advocates demanded that the Federal Trade Commission investigate Facebook's research practices, since the company had experimented on human subjects without their permission. The FTC, per its policy, declined to comment on whether it was looking into the matter. Notably, four months after finishing the experiment, and before the kerfuffle erupted, Facebook had quietly changed its data use policy to allow research on people.

A university institutional review board probably would have signed off on Facebook’s experiment, says Harvard Business School’s Michael Luca, who has received approval to conduct similar covert experiments with other companies’ websites. That’s because informed consent is not needed for research that subtly manipulates people without overtly lying to them and causes only minor harm, he says.

Months after the news broke, Facebook said it would step up training for its scientists and have its senior researchers scrutinize proposed studies on content "that may be considered deeply personal," according to a blog post by Mike Schroepfer, Facebook's chief technology officer. "We were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism," Schroepfer wrote.
