Recent cases—including Sri Lanka, Burma, India, and Ethiopia—have raised alarm about social media’s influence on large-scale, group-targeted violence. From spreading disinformation and hate speech, to helping perpetrators target people associated with certain groups or opinions, to enabling the coordination of mob attacks, there seem to be multiple ways social media platforms—such as Twitter, Facebook, and Instagram—could fuel or exacerbate atrocity risks.
However, many questions remain: What are the most important mechanisms connecting social media to mass atrocity risks? How is social media affecting countries at greatest risk of mass atrocities? What can be done to mitigate atrocity risks posed by social media?
On January 19 and 24, the Simon-Skjodt Center for the Prevention of Genocide brought together a diverse group of scholars, representatives of social media companies, practitioners, and policy makers for the 2023 Sudikoff Interdisciplinary Seminar on Genocide Prevention. The discussions focused on social media platforms, the risks of mass atrocities, and opportunities for atrocity prevention.
The seminar aimed to take stock of knowledge about the relationship between social media and mass atrocities, generate new ideas for future research or practice, and explore how the Simon-Skjodt Center can play a constructive role in this area. Here we distill the seminar’s key themes, which we elaborate on in a background paper and a rapporteur’s report.
Several participants noted the tension between the need for more research and the need for action—a recurring tension in the broader atrocity prevention field. However, participants generally agreed that while researchers need more data to better understand the relationship between social media and mass atrocities, this should not delay responses to the risks social media already poses. Instead, research and policy responses could reinforce one another.
Ideas for future research
Participants called for expanded interdisciplinary research that focuses on:
building consensus about the potential mechanisms by which social media influences mass atrocity risks, recognizing that social media use and its effects on atrocity risks are likely to vary across high-risk contexts;
identifying which actors might use social media platforms to help commit mass atrocities or accelerate atrocity risks, which could aid early warning by focusing attention on potential perpetrators and their incentives;
differentiating social media users more extensively, since the ways users might encourage violence or contribute to risks will likely play out differently across audiences, such as the general public versus military personnel; and
addressing risks posed by social media beyond those caused by hate speech, such as how “Friends” lists could help perpetrators target groups of civilians, and risks from less extreme forms of dangerous speech that could cause offline harm but are difficult to detect.
Additionally, participants indicated it could be useful to consider, in advance of a crisis, the tools available to address potential harms posed by social media and the research or evidence that would be deemed sufficient to justify the use of different tools.
Potential opportunities for atrocity prevention
Participants identified potential opportunities for mitigating mass atrocity risks linked to social media, including the following recommendations:
Social media companies and researchers should seek to build healthy social media ecosystems that support communities by fostering constructive and safe interaction for all users.
Civil society groups should create and share a playbook documenting best practices, response strategies, associated risks, and examples for local actors to address harm linked to social media at different stages.
Social media companies should advance their due diligence efforts before entering markets in at-risk environments. This requires, as one participant indicated, that social media companies consider the human rights effects if everyone everywhere has access to their products.
Social media companies and researchers should strengthen responsible product design and address potential mass atrocity risks associated with product design.
Social media companies should share relevant data with United Nations investigative mechanisms and international courts and work to preserve evidence relevant to mass atrocity cases to ensure it can be used in legal settings where applicable.
Government actors should consider changes in government regulation, such as adding criminal and legal incentives, to promote a “duty of care” standard among social media companies.
Participants agreed that research on prevention approaches should involve bridging the gap between local actors and social media companies in at-risk environments, especially once a crisis begins. They also agreed it is important to account for potentially harmful unintended consequences.
Additionally, participants suggested that better collaboration between social media companies, researchers, and civil society groups could strengthen efforts to prevent potential social media-related harm.