This year, social media company Meta has removed or severely restricted more than 50 accounts linked to abortion‑access providers, reproductive health organisations and queer groups across Facebook, Instagram and WhatsApp, prompting campaigners to describe the action as one of the most significant waves of censorship on the platforms in recent years. According to the original report, the restrictions began in October and affected groups in Europe and the UK as well as organisations in Asia, Latin America and the Middle East.

Monitoring by the NGO Repro Uncensored shows a sharp rise in enforcement: it recorded 210 incidents of removal or severe restriction in 2025, up from 81 the previous year. Campaigners say the scale of the crackdown has cut off vital support networks that rely on Meta’s services to offer sexual and reproductive health information and LGBTQ+ community outreach.

Specific cases underline the human consequences. The Guardian reported that longstanding pages such as Women Help Women, which had used Facebook for 11 years, were banned, with the company citing alleged breaches related to prescription drugs; the group’s executive director, Kinga Jelinska, warned that the blackout could be life‑threatening by pushing people towards unsafe sources. An Amsterdam‑based queer account, The Queer Agenda, was removed in the same wave. Campaigners in Colombia also described repeated blocking and reinstatement of WhatsApp helplines, leaving providers unable to plan services reliably.

Meta has denied it is targeting particular communities. The company said every organisation is subject to the same rules and rejected claims that enforcement was based on advocacy or group affiliation, adding that its policies on abortion content have not changed. Meta also characterised some of the hashtag restrictions that briefly hid LGBTQ+ tags as a 'mistake' and reversed those changes after public scrutiny. According to the original report, however, activists argue the pattern mirrors a US‑centric approach to women’s health and LGBTQ+ issues that is now being exported globally.

Investigations by news agencies and digital‑rights groups point to a deeper problem with enforcement rather than policy change alone. Reporting for the Associated Press in May and September 2025 documented multiple cases in which legally compliant, informational posts about abortion and reproductive health were removed, or accounts suspended, even in jurisdictions where abortion is legal. Experts cited over‑enforcement by AI moderation systems and reduced human review as a likely cause, noting an uptick in takedowns of medically accurate, potentially life‑saving information. The Electronic Frontier Foundation and other monitors have flagged the chilling effect and the opaque nature of appeals.

Campaigners say coordinated reporting by anti‑abortion actors and the limitations of automated moderation amplify the risk. Reproductive health organisations such as MSI Reproductive Choices have previously argued that platforms are more likely to remove advertising or posts from local providers while failing to act on misinformation that undermines care. In practice, some groups have resorted to creating backup accounts, adopting coded language, or migrating to alternative channels to preserve access to services.

Beyond outright bans, shadow‑banning and the labelling of common LGBTQ+ hashtags as 'sensitive content' have hindered discoverability for younger users and marginalised voices. The original reporting observed that teenagers using default filters encountered blank search results for tags such as #gay, #lesbian, #trans and #queer, revealing how quickly community visibility can be toggled off. Meta says it aims to reduce enforcement mistakes and acknowledges that appeals have become slower, but campaigners remain frustrated by vague explanations and limited cooperation, including closed‑door briefings described as non‑consultative. Source: Noah Wire Services