Key Facts
- Fine Amount: Ofcom fined 4chan £20,000 ($26,700) for non-compliance with the UK’s Online Safety Act.
- Regulatory Actions: This marks the first fine issued by Ofcom under the new online safety regulations.
- Legal Dispute: 4chan previously filed a lawsuit against Ofcom, claiming the Act infringes on free speech rights.
- Ongoing Penalties: 4chan faces an additional daily fine of £100 ($133) until it complies with information requests.
Ofcom has slapped 4chan with a £20,000 ($26,700) fine for failing to comply with the communications regulator’s requests for information under the UK’s Online Safety Act 2023. The regulator has released an update on 11 of the investigations it opened after the first of its online safety codes became enforceable in March this year. According to Ofcom, 4chan ignored its requests for a copy of the site’s illegal harms risk assessment and for information about its qualifying worldwide revenue. This is the first fine Ofcom has handed down under the new law, which was designed to prevent children from accessing harmful content online and which has prompted websites like Reddit and X to put up age verification measures.
What prompted Ofcom’s investigation into 4chan?
In June, Ofcom launched an investigation into 4chan after receiving multiple complaints about illegal content on the site and reports about the risks associated with its platform.
Ofcom’s probe highlights concerns regarding online safety and the responsibility of platforms to manage harmful content.
What are the implications of 4chan’s fine?
This fine serves as a warning to other online platforms about the importance of compliance with online safety regulations. It underscores the necessity for companies to take proactive measures in removing illegal content.
Ofcom’s actions reflect a broader regulatory push to ensure that digital environments are safe for users, particularly children.
How have other platforms responded to similar investigations?
Ofcom has also identified “serious compliance concerns” with two file-sharing services, which have since implemented automated tools to detect and remove uploads containing child sexual abuse material (CSAM). In contrast, four other investigated services opted to geoblock access from UK IP addresses.
These divergent responses show platforms taking markedly different paths to compliance: some are investing in content moderation, while others are withdrawing from the UK market altogether.