Facebook Successfully Controls User Emotions In Controversial Study
Hey, did you play Watch Dogs? There’s a minor spoiler in the next paragraph, so skip it if you want. It’s not big enough to spoil your enjoyment of the game.
In Watch Dogs there’s a MacGuffin called the “bellwether.” The evil corporation in the game isn’t content to just spy on everyone and everything, they want to use that gathered info to manipulate future events for their gain. And they do, they take their knowledge to feed certain messages to people and very subtly control a couple of key events in the game. A bellwether is, by the way, literally the leading sheep of a flock, with a bell tied around its neck.
Ha ha, you say. Just a stupid far-fetched thing in a stupid game’s idiotic b-movie quality storyline. And I might have agreed, if not for the fact that Facebook just proved it could do the same thing.
Facebook and researchers from Cornell University and the University of California at San Francisco wanted to see if they could influence users’ emotions by controlling the content Facebook exposed them to, without their knowledge, a process the study calls “emotional contagion” because that doesn’t sound ominous at all.
Nearly 700,000 Facebook users were involved in the study, without any specific consent other than what they agreed to when they signed up for Facebook.
Unlike some social networks such as Twitter, where users can choose what they want to see, Facebook controls which messages from your friends, groups, and fan pages appear in your news feed and which don't. Usually this is governed by an impersonal, nearly impossibly complex algorithm that tries to surface the content most interesting to you, but apparently it can also be manually adjusted.
As part of the study Facebook intentionally showed some users more negative stories in their news feeds for one week in 2012, to see if that would make them sad and make them post more negative posts. It did.
Users who were exposed to more positive posts did not get so blue. The researchers didn’t actually snoop on the posts. Whether they were positive or negative was determined by a computer program that searched for positive or negative words.
According to the study “for people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred. These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks, and providing support for previously contested claims that emotions spread via contagion through a network.”
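The method described above, classifying posts by counting positive and negative words rather than having humans read them, can be sketched in a few lines. This is a minimal illustration, not the study's actual code; the researchers used the LIWC word lists, which are far larger than the toy stand-ins here.

```python
# Minimal sketch of word-list sentiment scoring, as described in the study.
# These tiny word sets are illustrative stand-ins; the actual study relied
# on the much larger LIWC dictionaries.

POSITIVE = {"happy", "great", "love", "wonderful", "fun"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "awful"}

def sentiment_percentages(status_update: str) -> tuple[float, float]:
    """Return (% positive words, % negative words) in a status update."""
    words = status_update.lower().split()
    if not words:
        return 0.0, 0.0
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    return 100 * pos / len(words), 100 * neg / len(words)

# e.g. "I love this wonderful day" -> 2 of 5 words positive, none negative
pos_pct, neg_pct = sentiment_percentages("I love this wonderful day")
```

The appeal of this approach, and presumably why Facebook could run it across hundreds of thousands of accounts, is that no human ever reads the posts: the classification is a blunt word count, which also explains its limits (sarcasm and negation like "not happy" are scored as positive).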
What, if any, lasting impact did having the worst digital week ever have on their lives? Did any of them jump off a bridge or something? Who knows; that wasn't part of the study.
Facebook’s control over its users’ emotions, of course, is only made possible by the deep access we willingly give it to our daily lives. But lots of ethics watchdogs (not the video game kind) immediately cried foul when the study was released, pointing out that even though we technically agree that Facebook can use our info for research, we probably don’t expect to be treated like lab rats.
Some were worried about what else Facebook might do with such power. According to the BBC, Labour MP Jim Sheridan, a member of the Commons media select committee, has called for an investigation into the matter.
“This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people,” he said.
“They are manipulating material from people’s personal lives and I am worried about the ability of Facebook and others to manipulate people’s thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it.”
Jaron Lanier, in an op-ed for the New York Times, took issue with secretly experimenting on people through social media, and said reform is necessary.
“It is unimaginable that a pharmaceutical firm would be allowed to randomly, secretly sneak an experimental drug, no matter how mild, into the drinks of hundreds of thousands of people, just to see what happens, without ever telling those people. Imagine a pharmaceutical researcher saying, ‘I was only looking at a narrow research question, so I don’t know if my drug harmed anyone, and I haven’t bothered to find out.’ Unfortunately, this seems to be an acceptable attitude when it comes to experimenting with people over social networks. It needs to change,” Lanier wrote.
Lanier also worried that other elements of daily life could be manipulated using these techniques.
“Research has also shown that voting behavior can be influenced by undetectable social network maneuvering, for example,” he wrote.
Of course there are other people who called it no big deal, including Facebook, which offered this mild “my bad” through employee Adam Kramer, who coauthored the research.
“We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook.”
However, he admitted that the firm did not “clearly state our motivations in the paper”.
“I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused.”
So this should be the part where I make an objective, balanced closing statement that gives equal credence to both sides and then some kind of “wait and see” blather.
But screw that. I’m a blogger, not a journalist, and this kind of thing is wrong and should be stopped.
We have to demand that social networks treat us ethically as they become a more important part of our lives, and that includes not being experimented on or manipulated. I'm not saying you should stop using Facebook over this yet, but let them know this is the kind of thing you don't like and don't want. And if they don't respond, find other social media options.
I wonder if I can remember my Myspace password?