Amnesty International: Obstacles to Autonomy 

By Jane Eklund

In 2022, the U.S. Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization stripped away federal abortion protections and left abortion rights to individual states. In the aftermath, more people than ever turned to the internet for information on abortion. However, U.S. reproductive health and rights organizations working to ensure people could access care and resources online found that the information they shared quickly faced removal on social media platforms. Over two years later, this issue persists.

Content removals 

Since the fall of Roe, removals of abortion content from social media have often lacked adequate justification or appeared to stem from misapplied platform guidelines. Information on medication abortion and how to access it is reportedly the content removed most frequently by social media platforms. While some organizations only face issues with their content being removed, others have also had their accounts temporarily suspended for supposedly violating community guidelines, sometimes without ever being told which guidelines they violated. Advocacy organizations, telehealth abortion providers, and reproductive health nonprofits have sought greater transparency in how platforms moderate abortion content, but many remain in the dark about why their content or accounts have been taken down or temporarily suspended.

Big Tech Business and Human Rights Responsibilities 

The UN Guiding Principles on Business and Human Rights underscore the responsibility of social media companies to neither cause nor contribute to human rights abuses through their activities, and to address any impacts in which they are involved. When social media companies fail to uphold these principles, they risk infringing on their users’ right to access healthcare information, compounding the threats to reproductive rights faced by users, particularly those living in places that restrict access to reproductive healthcare.

The Path Forward 

Stronger transparency around community guidelines and content moderation practices is essential to ensure accountability and to prevent arbitrary removal of vital reproductive health and rights information. To address this issue, social media companies should: 

  • Be more transparent about how their community guidelines apply to abortion content

  • Improve transparency on the use of recommendation systems and content moderation algorithms

  • Proactively identify, prevent, and address any harms arising from their content moderation and the potential suppression of abortion-related content. 

Read Amnesty International’s full research report here
