Yesterday, Facebook CTO Mike Schroepfer announced updates to the way that Facebook conducts research. This comes on the heels of the overwhelmingly negative reaction to a study in which the company deliberately manipulated the news feeds of nearly 700,000 users to either increase or reduce the number of positive status updates they would see.
Critics saw the attempt to manipulate the psychological state of users without informed consent as a transgression of ethical lines that have long been well drawn in the scientific community. University of Maryland law and technology professor James Grimmelmann, one of the most outspoken critics of the tactic, asked the Proceedings of the National Academy of Sciences to retract the Facebook study.
Schroepfer, in a post on the Facebook newsroom blog, wrote, "although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism." He then outlined the changes the company plans to make:
- Guidelines: we’ve given researchers clearer guidelines. If proposed work is focused on studying particular groups or populations (such as people of a certain age) or if it relates to content that may be considered deeply personal (such as emotions) it will go through an enhanced review process before research can begin. The guidelines also require further review if the work involves a collaboration with someone in the academic community.
- Review: we’ve created a panel including our most senior subject-area researchers, along with people from our engineering, research, legal, privacy and policy teams, that will review projects falling within these guidelines. This is in addition to our existing privacy cross-functional review for products and research.
- Training: we’ve incorporated education on our research practices into Facebook’s six-week training program, called bootcamp, that new engineers go through, as well as training for others doing research. We’ll also include a section on research in the annual privacy and security training that is required of everyone at Facebook.
- Research website: our published academic research is now available at a single location and will be updated regularly.
Our interview with OkCupid's Christian Rudder over a similar issue (ignoring the site's matching algorithm and instead deliberately hooking people up at random as a control group to test the algorithm's accuracy) deeply divided our listenership. Some saw it as a clear violation of the way the site was meant to function, while others complained that attempts to restrict this kind of experimentation would stifle any kind of innovation online.
We don't consider ourselves anti-experimentation, but when experiments have deep psychological or real-world impact, some consideration has to be given to how they might affect the userbase before they launch. While I think Facebook's new operating guidelines are a good first step, there are a couple of noticeable omissions. First, there is no option for people being experimented on to learn that they are part of an experiment until after the fact. Second, there is no discussion of consent or ethics anywhere in this press release.
All research conducted with Facebook will now be published regularly at research.facebook.com, so I will be watching that space to see how the site might have manipulated my user experience after the fact, but I would much rather simply have them tell me beforehand.