PROFESSOR JAMES GRIMMELMANN: When you start experimenting on people, when you start manipulating their environment to see how they react, you’re turning them into your lab rats.
CHRISTIAN RUDDER: The outrage that greeted that particular experiment far outstripped its practical implications.
DANAH BOYD: The reason the controversy blew up at the time, and in the way that it did, is that we’re not sure we trust Facebook.
LUCKY SEVERSON, correspondent: There were tremors of ethical outrage when a major scientific journal revealed that the social media site Facebook had conducted experiments, altering what customers see on their own pages. The outrage was voiced across all forms of media, both traditional ones and digital outlets like YouTube.
MICHAEL ARANDA (Host of online science program “SciShow” via YouTube): A lot of users somewhat understandably thought it was creepy that Facebook was intentionally trying to manipulate their emotions.
SEVERSON: Social media have grown fast over the past few years. The biggest of them all, of course, is Facebook, with nearly a billion and a half users worldwide, more than 130 million in the U.S. alone. What it knows about us and how it uses that knowledge is troubling to an increasing number of people.
Facebook wanted to find out if emotions are contagious. So the company edited the information that it sends to customers’ pages—their so-called newsfeed—for 700,000 people who were its unknowing test subjects. To see whether a positive, upbeat mood could be contagious, Facebook filtered out information that seemed sad or negative, whether it originated from Facebook friends or from the news in general. It also filtered out positive messages when trying to set a more negative mood. This is James Grimmelmann, professor of law at the University of Maryland.
GRIMMELMANN: The Facebook experiment was explicitly designed to measure whether Facebook could make people happy or sad.
SEVERSON: Happier messages caused users to post happier updates themselves. Sadder posts prompted sadder updates and fewer updates. Unsurprising, you might think. So what’s all the fuss?
GRIMMELMANN: Now imagine: they look at your posts to see whether you use angry words or supportive words in responding to other people’s posts. They could figure out what things really push your buttons and push them hard.
SEVERSON: Danah Boyd is the founder of the Data & Society Research Institute.
BOYD: Where the discomfort lies is what does it mean that we can affect a population, by choice, with the kind of services that they use—with or without them knowing about it?
SEVERSON: Christian Rudder is the president of OkCupid, an online dating service. He doesn’t think Facebook has done any harm.
RUDDER: I just don’t see what Facebook does, for example, that’s fundamentally different from what Fox or MSNBC does.
SEVERSON: He says even traditional and reputable media like the New York Times tinker with their content all the time to attract readers.
RUDDER: They’re trying to figure out news you might like, basically the exact same thing that you see from your friends. They tinker with it all the time, regardless of whether they publish the results.
SEVERSON: Christian Rudder tinkers with his service—like the experiment when OkCupid told individuals they were a good match when in fact they weren’t, something the service’s own blog admitted was a form of human experimentation.
RUDDER: We didn’t like actually make anyone go on a date. We didn’t even make anyone talk. We just threw you up there, said hey, maybe you might like this person. If she clicks on you, sees your picture, sees the words that you’ve written—I mean, the whole idea of what we did got so overblown.
SEVERSON: And guess what? Mismatches connected. That doesn’t mean they stayed connected. Rudder says they don’t keep track of how long matches last.
But some legal experts who study the subject, like Professor Grimmelmann, are disturbed by the potential for such manipulation.
GRIMMELMANN: Both the Facebook and OkCupid experiments show you the directions in which these things could rapidly get quite worrisome.
SEVERSON: Christian Rudder is not worried. He says the critical response was excessive, in part because many people don’t yet fully understand the Internet and because social media companies like his own and Facebook are not interested in personal information.
RUDDER: Nobody looks at any individual user’s behavior. You look at things in the aggregate, like what are men into? What do women want from profiles? What do women 18 to 24 click on?
SEVERSON: For ethical guidance, social media companies like Facebook use what are called institutional review boards, or IRBs, usually at well-known universities, to oversee their research projects. Facebook was working with Cornell on its “emotional contagion” study, but Cornell later said it received details of the study only after the research was too far along. Danah Boyd says IRBs can be helpful, but they’re not perfect.
BOYD: Anybody who’s actually worked with an institutional review board in a university setting knows that they’re a nightmare at best, but the values and goals and efforts that you’re trying to achieve there are really important.
SEVERSON: She says regulation won’t solve the problem; the solution begins with greater ethical education in a technological field that has been moving so fast that ethical considerations haven’t caught up.
BOYD: I’m more concerned about how you get engineers to be thinking about ethical decisions and what it means to be training engineers from the get-go to really think about ethics.
SEVERSON: This is not the first experiment Facebook has conducted. In 2010, it divided 61 million American users into three groups and showed them each a different nonpartisan get-out-the-vote message. Turns out some messages did increase voter turnout.
GRIMMELMANN: They could do it in a way that would possibly swing an election, and unlike blanketing an area with ads, this would be invisible.
RUDDER: There’s no dark story here. It’s just new technology that I think people need to continue to get familiar with. I know on a firsthand basis that certainly at OkCupid there’s no desire to harm anyone at all. But that might not be the case at some other places. I have no idea.
GRIMMELMANN: I think it’s pretty arrogant to say, “I’m a good guy. You can trust me. In fact, I’m so good I’m not even going to let you make that choice.”
SEVERSON: Users agree to Facebook’s research when they accept the terms of service, and they cannot join until they do. But the terms of service, which double as the consent form, run about 9,000 words and use the word “research” only twice.
BOYD: I think that one’s about a 40-minute read, and it requires a level of legal knowledge to understand what a lot of these terms of art even mean. The other thing is that when people are joined into a site like Facebook they’re not thinking, “Hmmm, do I like or not like Facebook? Do I like or not like their terms?” They’re thinking, “My friends are there, and I want to hang out with my friends.”
SEVERSON (to Rudder): You don’t think that they should have to give their explicit permission?
RUDDER: I don’t think—I personally do not think so.
SEVERSON: But calls for greater safeguards continue, and it may be that general unease about the power of big social media companies will end up having greater impact than the findings of any specific experiment, including the “emotional contagion” study.
BOYD: I don’t think it’s really about the study itself. I think it’s about all of the things that surround it and the uncertainty and distrust and discomfort with this whole Big Data phenomenon.
SEVERSON: Facebook eventually apologized—not for the research study itself, which chief operating officer Sheryl Sandberg said is ongoing, but for the way news about the study was communicated. The company has also agreed to start a stronger internal review process.