Even the Editor of Facebook’s Mood Study Thought It Was Creepy
Catching a glimpse of the puppet masters who play with the data trails we leave online is always disorienting.
But why? Psychologists do all kinds of mood research and behavior studies. What made this study, which quickly stirred outrage, feel so wrong?
Even Susan Fiske, the professor of psychology at Princeton University who edited the study for Proceedings of the National Academy of Sciences, had doubts when the research first crossed her desk.
“I was concerned,” she told me in a phone interview, “until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time… I understand why people have concerns. I think their beef is with Facebook, really, not the research.”
Institutional review boards, or IRBs, are the entities that review researchers’ conduct in experiments that involve humans. Universities and other institutions that get federal funding are required to have IRBs, which often rely on standards like the Common Rule—one of the main ethical guideposts that says research subjects must give their consent before they’re included in an experiment. “People are supposed to be, under most circumstances, told that they’re going to be participants in research and then agree to it and have the option not to agree to it without penalty,” Fiske said. (I emailed the study’s authors on Saturday afternoon, but haven’t reached them yet.)
But Facebook, as a private company, isn’t bound by the same ethical standards as federal agencies and universities, Fiske said.
“A lot of the regulation of research ethics hinges on government supported research, and of course Facebook’s research is not government supported, so they’re not obligated by any laws or regulations to abide by the standards,” she said. “But I have to say that many universities and research institutions and even for-profit companies use the Common Rule as a guideline anyway. It’s voluntary. You could imagine if you were a drug company, you’d want to be able to say you’d done the research ethically because the backlash would be just huge otherwise.”
The backlash, in this case, seems tied directly to the sense that Facebook manipulated people—used them as guinea pigs—without their knowledge, and in a setting where that kind of manipulation feels intimate. There’s also a contextual question. People may understand by now that their News Feed appears differently based on what they click—this is how targeted advertising works—but the idea that Facebook is altering what you see to find out if it can make you feel happy or sad seems in some ways cruel.
Mood researchers have been toying with human emotion since long before the Internet age, but it’s hard to think of a comparable experiment offline. It might be different, Fiske suggests, if a person were to find a dime in a public phone booth, then later learn that a researcher had planted the money there to see how finding it affected people’s moods.
“But if you find money on the street and it makes you feel cheerful, the idea that someone placed it there, it’s not as personal,” she said. “I think part of what’s disturbing for some people about this particular research is you think of your News Feed as something personal. I had not seen before, personally, something in which the researchers had the cooperation of Facebook to manipulate people… Who knows what other research they’re doing.”
Fiske still isn’t sure whether the research, which she calls “inventive and useful,” crossed a line. “I don’t think the originality of the research should be lost,” she said. “So, I think it’s an open ethical question. It’s ethically okay from the regulations perspective, but ethics are kind of social decisions. There’s not an absolute answer. And so the level of outrage that appears to be happening suggests that maybe it shouldn’t have been done…I’m still thinking about it and I’m a little creeped out, too.”