In 2014, a Facebook study made headlines around the world for all the wrong reasons. The study, officially titled “Experimental evidence of massive-scale emotional contagion through social networks,” was conducted by a researcher at the social media giant together with academic collaborators, and had quietly been run on users two years earlier. It investigated how manipulating the content of a user’s news feed could affect their emotional state. However, what Facebook did, and how it did it, raised serious ethical questions and sparked a broader conversation about the responsibilities of tech companies that conduct research on their users.
The Facebook Emotional Manipulation Study, as it is commonly known, involved nearly 700,000 Facebook users as unwitting participants. The methodology was relatively simple: for one week in January 2012, the researchers manipulated what appeared in users’ news feeds. For one group, a portion of posts containing positive emotional content was withheld from the feed; for another, posts containing negative emotional content were withheld. The researchers then monitored whether the emotional tone of the users’ own posts shifted in response.
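The mechanics are easier to picture with a toy example. The Python sketch below is purely illustrative and is not Facebook’s actual code: the word lists, omission probability, and function names are invented for this post, though the real study did rely on LIWC-style counts of positive and negative words to classify posts.

```python
# Toy illustration of the experiment's two moving parts:
# (1) probabilistically withhold feed posts containing targeted emotion words, and
# (2) measure the share of positive/negative words in a user's own posts.
# Word lists and probabilities below are hypothetical stand-ins, not the study's.
import random

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}   # stand-in for LIWC positive terms
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}      # stand-in for LIWC negative terms

def filter_feed(posts, suppress="positive", omission_prob=0.3, seed=None):
    """Return a feed with posts containing the targeted emotion words
    probabilistically withheld (a rough analogue of the manipulation)."""
    rng = random.Random(seed)
    targeted = POSITIVE_WORDS if suppress == "positive" else NEGATIVE_WORDS
    filtered = []
    for post in posts:
        words = set(post.lower().split())
        if words & targeted and rng.random() < omission_prob:
            continue  # withhold this post from the feed
        filtered.append(post)
    return filtered

def emotion_word_share(posts):
    """Percentage of words in a user's posts that are positive or negative."""
    words = [w for post in posts for w in post.lower().split()]
    if not words:
        return {"positive": 0.0, "negative": 0.0}
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return {"positive": 100 * pos / len(words), "negative": 100 * neg / len(words)}

feed = ["what a wonderful day", "traffic was awful", "love this song", "feeling sad today"]
print(filter_feed(feed, suppress="positive", omission_prob=1.0, seed=1))
print(emotion_word_share(["so happy today", "this is great"]))
```

In the real experiment the withholding was probabilistic rather than total, and the outcome measure was the change in emotional word use across hundreds of thousands of users, not individual posts.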
What They Found
The study found that users exposed to more positive content tended to post more positive updates, while those exposed to more negative content posted more negative updates. In other words, the researchers concluded that emotional states could be “contagious” through social networks.
Ethical Problems with the Study
The Facebook Emotional Manipulation Study quickly sparked outrage and concerns from the public and the academic community, primarily due to several ethical issues:
Informed Consent. Users were never asked for, and never gave, informed consent. Few people read Facebook’s lengthy terms of service, and even those who did would not have known they could become subjects of a psychological experiment.
Psychological Impact. Deliberately skewing users’ news feeds toward negative content, without their knowledge, risked harming their mental well-being.
Deception. The researchers deceived users by manipulating their online environment without explicit consent or disclosure. Deception in research is generally discouraged, and any deception that occurs must be justified and minimized.
Lack of Oversight. The study was conducted internally by Facebook without external oversight or review, highlighting the lack of transparency in the way social media platforms handle user data and conduct research.
The fallout brought demands for accountability. While the study didn’t result in immediate legal consequences or fines for Facebook, it did have lasting effects:
1. Public Backlash: The study damaged Facebook’s reputation and eroded trust in the platform. Many users felt violated and betrayed by the company.
2. Regulatory Scrutiny: The study drew the attention of regulatory bodies, leading to discussions about the ethical boundaries of research conducted by tech companies on their users.
3. Ethical Guidelines: In response to the backlash, Facebook introduced an internal review process for its research, and other tech companies were prompted to re-examine their own research policies and guidelines.
If you are interested in reading the original publication, it appeared in the Proceedings of the National Academy of Sciences (PNAS) in June 2014 under the title quoted above. Join us next month as we explore another instance of unethical research in action.