Facebook has created a lot of noise this week, most of it quite negative. The reason is the publication, in the June edition of the Proceedings of the National Academy of Sciences, of a study on "emotional contagion" by Adam Kramer, a data scientist at Facebook, Professor Jeffrey Hancock of Cornell University, and Jamie Guillory of the University of California, San Francisco. The title "emotional contagion" is clearly a bit worrying when we know the kind of data Facebook owns. And to be honest, the reality is even worse.
During one week in January 2012, roughly 700,000 Facebook users (about 0.053% of the 1.3 billion total) were shown an altered, manipulated news feed. One group was shown a "happy feed" that used happier language than usual, a second group was shown a "neutral feed", and a third group of about 155,000 users was shown a "sad feed" that used sadder language. Pretty simple! The motivation for running such a study on such a large group of people was to see whether positive, neutral or negative status updates had any impact on users' emotions and reactions. Conclusion: they did, and it seems that "the contagion is massive."
Reactions to the study were themselves "massive" and spread quickly, with the help of Facebook itself (note the irony). How could Facebook, the owner of our personal data, run a study that manipulated hundreds of thousands of its users without their consent or knowledge? That question has sparked negative reactions all over the world. And here we move from worrying to shocking.
Journalists, social researchers and basically everyone else have now heard about the study and are questioning Facebook's experiment on its users. Legal experts and researchers have also raised the ethical question:
On Wednesday, Facebook’s second-in-command, Sheryl Sandberg, expressed regret over how the company communicated its 2012 mood manipulation study of 700,000 unwitting users, but she did not apologize for conducting the controversial experiment.
And in the end, Facebook risks very little. Four months after the study was conducted, the company changed its data use policy to explain that it can use users' data "for internal operations, including troubleshooting, data analysis, testing, research and service improvement." This is something we all agree to when we register on Facebook and accept the Terms and Conditions, and it is the consent Facebook needed to justify such a study. By using Facebook we consent to all of its actions, something we tend to forget, hence the strength of the reactions. We may not be aware of what we sign up for, but one thing is sure: we consent to it.