Last month, Facebook announced that it had conducted an experiment in which it purposely showed a group of users only negative posts from their friends' news feeds. The premise was to test what the academics behind the research call "emotional contagion," the notion that moods can spread across networks. Well, everyone was annoyed at being manipulated, and the lead researcher in the study has apologized. The Electronic Privacy Information Center has asked the FTC to investigate, saying Facebook was duplicitous and manipulative and failed to inform users of the experiment. Now, Maryland law professor (and friend of TLDR) James Grimmelmann, along with colleague Leslie Meltzer Henry and the faculty of the Berman Institute of Bioethics at Johns Hopkins University, has asked the Proceedings of the National Academy of Sciences to retract the Facebook study.
From the letter:
The sticking point is that Facebook users were involuntarily enrolled in the Facebook Study. They were not notified of their participation (and have not been to this day); they were not given the opportunity to remove themselves from the experiment. You have written that the research behind the article “may have involved practices that were not fully consistent with the principles of obtaining informed consent.” This is a serious understatement. The Facebook Study violated broadly accepted norms of research ethics. Its publication violated PNAS’s stated editorial policies. Retraction is the only appropriate response.
Participants were not told (and have not been told) that they were part of a study: no one gave them a point of contact for questions or offered them the ability to opt out. No one obtained specific consent for the study, let alone signed forms. Most of all, it was reasonably foreseeable that the Facebook Study would cause discomfort to participants. The study was designed to demonstrate that “emotions expressed by friends, via online social networks, influence our own moods,” and the initial hypothesis was that participants in one of the treatment groups would “express increased negativity.”
This isn't the first time Facebook has run afoul of the FTC. In 2011, Facebook agreed to a settlement with the agency after it kept changing its privacy settings, allowing information users had explicitly set as private to be exposed to the world. That settlement, in the FTC's words, barred Facebook from "making any further deceptive privacy claims, requires that the company get consumers' approval before it changes the way it shares their data, and requires that it obtain periodic assessments of its privacy practices by independent, third-party auditors for the next 20 years." It's unclear whether this instance violates that 20-year consent decree.