People don’t often get riled up about research, but when Facebook toyed with its members’ emotions without telling them, it stirred up plenty of feelings offline as well.
In June, researchers revealed in the Proceedings of the National Academy of Sciences that they had manipulated the news feeds of unsuspecting Facebook users, influencing whether those users felt positive or negative emotions. News of the experiment angered scores of users and privacy advocates. Before long, the journal backpedaled, issuing an “Editorial Expression of Concern” that acknowledged participants may not have known they were guinea pigs and did not get the chance to opt out.
The experiment, conducted over one week in January 2012, looked at “emotional contagion” — whether user emotions would affect other users’ emotions online, as they do in person. Facebook used an algorithm to weed out posts with positive or negative words in ...