Stock Markets May 15, 2026 06:33 AM

X Agrees Faster Reviews for Illegal Hate and Terror Content in U.K., Regulator Says

Ofcom secures commitments on review timelines, access restrictions and quarterly reporting after months of regulatory scrutiny

By Priya Menon

Britain's communications regulator, Ofcom, said X has committed to faster review times for suspected illegal hate speech and terrorist content in the U.K., restrictions on accounts tied to proscribed organizations, and quarterly performance reporting over the coming year. The agreement follows months of regulatory pressure and responds to concerns raised by civil society groups about the handling of flagged posts.

Key Points

  • Ofcom secured X's commitment to review suspected illegal hate and terrorism-related posts within 24 hours on average and to assess at least 85% within 48 hours - impacts social media and technology sectors.
  • X will restrict access in Britain to accounts run by or on behalf of organizations banned under U.K. terrorism laws and will submit quarterly performance data to Ofcom over the next year - affects regulatory compliance and digital communications.
  • Platform will engage external experts to improve reporting systems following concerns from civil society that flagged content was not always clearly received or acted on - relevant to moderation operations and trust in online platforms.

Ofcom announced on Friday that X has agreed to tighten protections for users in the United Kingdom against illegal hate speech and content linked to terrorism. The regulator said the commitments come after several months of sustained regulatory pressure on the social media platform.

Under the terms outlined by Ofcom, X will aim to review suspected illegal posts related to hate or terrorism within 24 hours on average, and will ensure that at least 85% of such posts are assessed within 48 hours. The regulator framed these targets as a measurable timetable for handling potentially unlawful material.

In addition to the review timelines, Ofcom said X pledged to restrict access from within Britain to accounts that are operated by or on behalf of organizations proscribed under U.K. terrorism laws. The company also agreed to provide quarterly performance data to Ofcom for the next year, enabling the regulator to monitor compliance and progress against the commitments.

Ofcom added that X will work with external experts to improve its reporting systems. This step addresses concerns raised by civil society groups, which had flagged that reports of problematic content were not always clearly received or followed up on by the platform.

Oliver Griffiths, director of Ofcom's online safety group, said the regulator has evidence that terrorist content and illegal hate speech continue to appear on some of the largest social media services. He emphasized the importance of the issue in the U.K., noting it follows a number of recent hate-motivated crimes affecting the country's Jewish community.

X, which regularly states that it enforces bans on terrorist groups and hateful content, did not immediately respond to a request for comment about the commitments described by Ofcom.

The package of measures agreed with Ofcom centers on three operational elements: faster review metrics for flagged material, geographically targeted restrictions on accounts linked to proscribed organizations, and enhanced transparency through quarterly reporting. The commitment to engage outside expertise targets reported shortcomings in the platform's intake and handling of flagged content.

Ofcom framed these actions as steps to strengthen user protections, but their impact will depend on follow-through: the quarterly performance reports over the next year will allow the regulator to evaluate whether X implements the agreed changes effectively.

Risks

  • Persistent presence of terrorist content and illegal hate speech on major social platforms, as noted by Ofcom, presents ongoing enforcement and safety challenges - impacts social media and public safety oversight.
  • Civil society groups raised concerns that reports of problematic content were not always clearly received or acted upon, indicating potential gaps in the platform's reporting and moderation processes - affects platform operations and compliance.
  • The effectiveness of the agreed measures will depend on implementation and monitoring; quarterly reporting over the next year will determine whether the commitments translate into sustained change - relevant to regulatory and compliance risk.
