Ofcom dropped its first £1 million hammer on a porn site on December 4th, punishing it for failing to verify users’ ages, and headlines exploded. Finally, many thought, the UK’s Online Safety Act is doing something tangible. A dangerous website was punished! Children protected! Moral victory declared!
Well… not quite.
In fact, the Act just won the easy battle: the low-hanging, publicly visible, adult-site fruit. Meanwhile, the real war—the one involving children, predators, encryption, and the limits of the law—is raging behind digital closed doors.
Welcome to the part of the Online Safety Act that the government would rather you didn’t think too hard about.
What It Is: The Online Safety Act 2023 is comprehensive UK legislation regulating online platforms, particularly social media and search engines, to protect children and adults from harmful content.
Timeline:
- October 26, 2023 – Received Royal Assent, became law.
- March 17, 2025 – Illegal content duties became enforceable.
- July 25, 2025 – Child protection duties and age verification requirements for pornography sites came into force.
- 2026 – Expected full implementation.
The Encrypted Messaging Loophole Nobody Wants to Own
Recent reporting showed that UK child-protection charities are panicking over what they call an enormous loophole in the Online Safety Act.
In short:
Encrypted messaging services can claim they can’t remove harmful content because… they literally can’t see it.
This isn’t a bug; it’s the whole point of end-to-end encryption (E2EE). Only the sender and receiver can view messages. Not the platform. Not the government. Not even Ofcom with the biggest legal crowbar available.
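To make “we literally can’t see it” concrete, here is a minimal sketch of the E2EE guarantee, assuming Python and the PyNaCl library (my choice for illustration; real messengers such as Signal and WhatsApp layer key ratcheting and authentication on top, but the core property is the same):

```python
# Minimal E2EE sketch using PyNaCl (pip install pynacl). Illustrative only:
# production protocols add forward secrecy, ratcheting, and authentication.
from nacl.public import Box, PrivateKey

# Each keypair is generated on the user's own device; the private half
# never leaves it, and in particular never reaches the platform's servers.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts with her private key plus Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"only Bob can read this")

# The platform relays `ciphertext` but holds neither private key, so it
# has nothing to "scan": this is the infeasibility claim, in code.

# Bob decrypts with his private key plus Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"only Bob can read this"
```

Note that there is no knob the platform, or Ofcom, can turn to read that ciphertext after the fact; any inspection has to happen on-device before encryption, or through a deliberately weakened scheme.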
That’s amazing for privacy. That’s catastrophic for child protection.
Under the Act, platforms must remove illegal content when it is “technically feasible.” And if your entire business model is “we cannot technically read anything you send”—well, congratulations. You just found the Act’s soft underbelly.
In a letter to Home Secretary Yvette Cooper and Technology Secretary Peter Kyle, charities including the NSPCC and Barnardo’s warn that this creates a digital safe harbor for abusers, especially those grooming children or sharing illegal material via private chats, closed groups, or ephemeral channels.
In other words:
Encrypted messaging apps may never have to remove child-abuse content because the law allows them to claim they can’t.
Why This Isn’t a Tech Problem; It’s a Policy Time Bomb
The government wanted two things:
- Unbreakable privacy (so people feel secure using messaging apps).
- Unbreakable child protection (so parents and charities stop shouting at them).
The problem? You can pick one. You cannot pick both. Not without inventing magic.
Ofcom’s Illegal Harms Codes of Practice, published on December 16, 2024, tried to thread the needle by qualifying obligations with “where technically feasible,” but that simply kicks the problem back to the platforms.
Encrypted platforms say: “We can’t scan messages — that breaks encryption.”
Charities say: “Then predators will hide there.”
Government says: “…Have you tried turning it off and on again?”
Meanwhile, Ofcom’s porn-site fine is paraded around like proof that the Online Safety Act is a modern miracle, despite the fact that public adult websites were never the hardest part of this law.
The difficult part is this: How do you stop criminals in encrypted spaces without watching everyone, all the time?
And nobody has given a satisfying answer.
What Makes Encryption Such a Nightmare for Regulators?
1. End-to-End Encryption Is Designed to Lock Everyone Out
Platforms don’t have access to message contents. This is a feature, not a flaw.
2. Scanning Encrypted Messages = Breaking Encryption
Any “content detection” system requires:
- breaking encryption,
- scanning on-device before encryption (client-side scanning), or
- inserting government-approved backdoors.
All three options terrify privacy experts, and rightly so.
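Of the three, client-side scanning is the one governments float most often, so it is worth seeing how it would slot in. Below is a deliberately toy sketch: the function name, the blocklist, and the use of exact SHA-256 matching are all my inventions for illustration (real proposals, like Apple’s shelved NeuralHash or Microsoft’s PhotoDNA, use perceptual hashes that survive resizing and re-encoding):

```python
# Toy sketch of client-side scanning: check content against a blocklist
# on-device, BEFORE encryption. All names here are hypothetical; deployed
# systems use perceptual hashes, not exact SHA-256 digests.
import hashlib
from typing import Callable

# Stand-in for a vendor-shipped database of known-illegal content hashes.
BLOCKLIST: set[str] = {hashlib.sha256(b"known illegal image bytes").hexdigest()}

def scan_then_send(payload: bytes,
                   encrypt: Callable[[bytes], bytes],
                   transmit: Callable[[bytes], None]) -> bool:
    """Refuse to send (and return False) if the payload matches the blocklist."""
    if hashlib.sha256(payload).hexdigest() in BLOCKLIST:
        return False  # flagged on-device, before encryption ever happens
    transmit(encrypt(payload))  # from here on, E2EE proceeds untouched
    return True
```

Notice what this buys and what it costs: encryption itself is never broken, which is why proponents call client-side scanning “compatible” with E2EE; but a scanner with an opaque, updatable blocklist now runs on every phone, which is why privacy experts call it mass surveillance with extra steps.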
3. Predators Exploit the Blind Spot
Private groups and encrypted DMs are prime channels for grooming, extortion, CSAM sharing, and trafficking networks.
Charities warn that the Act’s ambiguity lets platforms avoid accountability exactly where the risk is highest. The Internet Watch Foundation has called the current wording a “blatant get-out clause” that may enable platforms to sidestep compliance with online safety laws.
4. The Law Pretends It’s Solved This (It Hasn’t)
Politicians talk about being “tough on tech companies,” but the real conflict — privacy vs. protection — is unresolved and probably technically impossible to solve without collateral damage.

The Scale of the Problem
The statistics paint a disturbing picture:
The Internet Watch Foundation reports that in 2024 alone, they acted to remove images or videos of children suffering sexual abuse on 291,270 webpages—the most in the organization’s 29-year history. This represents an 830% increase since they began proactively hunting child sexual abuse imagery in 2014.
The National Crime Agency estimates there are between 680,000 and 830,000 UK-based adult offenders who pose varying degrees of risk to children—equivalent to 1.3% to 1.6% of the UK adult population.
Meanwhile, the NSPCC reports that Snapchat is the platform that comes up most often in grooming cases.

The Questions That Stay Unanswered
Don’t get distracted by the £1 million headlines. The porn-site crackdown is the shiny surface. The encrypted-messaging loophole is the sinkhole underneath.
If the UK wants to enforce meaningful child protection, it must answer questions lawmakers have been dodging for years:
- Should encryption be weakened in the name of safety?
- Should companies be required to scan messages on devices?
- Should we accept that some online spaces will always be opaque?
- Or should the government admit the Online Safety Act cannot do what it promises?
Until those questions are answered, we’re left with a paradox: The safest digital spaces for adults may be the most dangerous ones for children, and the law, as written, cannot resolve that contradiction.
A Global Problem, Not Just a UK One
This isn’t Britain’s problem alone. The encryption paradox is playing out on multiple continents simultaneously, and the stakes are global.
In the European Union, the controversial “Chat Control” proposal has been debated for years. After intense pushback, EU member states reached a political compromise in November 2025 that dropped mandatory scanning of encrypted communications, but critics warn the “voluntary” approach could still create indirect pressure on platforms. Signal’s president Meredith Whittaker has warned the company would leave the EU market rather than compromise encryption.
In the United States, multiple bills threaten encryption under the banner of child protection. The EARN IT Act (Eliminating Abusive and Rampant Neglect of Interactive Technologies Act) has been introduced three times since 2020, each time facing fierce opposition from privacy advocates who argue it would force platforms to abandon end-to-end encryption to avoid liability.
The STOP CSAM Act (S. 1829), introduced in 2023 and reintroduced in May 2025, creates civil liability for platforms that “promote or facilitate” child exploitation—language critics say could punish encrypted services for simply existing. Senator Ron Wyden has repeatedly opposed these bills, warning they would force companies to weaken encryption.
The pattern is unmistakable: democracies worldwide are grappling with the same impossible equation. Protect children or protect privacy. Choose safety or choose security. Break encryption or accept blind spots.
And everywhere, the answer is the same: There is no good answer.
When the UK debates this issue, the rest of the world watches. When the EU compromises, it sets precedents. When the US passes legislation, tech companies operating globally must adapt. The Online Safety Act’s encryption loophole isn’t just a British policy failure—it’s a preview of a crisis that no democratic government has figured out how to solve.
The question isn’t whether other countries will face this dilemma. They already are. The question is whether any of them will find a solution that doesn’t sacrifice either children’s safety or everyone’s security.
So far, the answer appears to be no.