Some recent commentary on the Online Safety Act seems to treat child protection online as an abstract policy preference. The evidence reveals something far more urgent. By age 11, 27% of children have already been exposed to pornography, and the average age of first exposure is just 13. Twitter (X) alone accounts for 41% of children's exposure to pornography, ahead of dedicated pornography sites at 37%.
The consequences are profound and measurable. Research shows that 79% of 18- to 21-year-olds had seen content involving sexual violence before turning 18, and young people aged 16-21 are now more likely to assume that girls expect or enjoy physical aggression during sex. Close to half (47%) of respondents aged 18-21 had experienced a violent sex act, with girls the most affected.
When we know that children's accounts on TikTok are shown harmful content every 39 seconds, with suicide content appearing within 2.6 minutes and eating disorder content within 8 minutes, the question is not whether we should act, but how we can act most effectively.
This is not “micromanaging” people’s rights – this is responding to a public health emergency that is reshaping an entire generation’s understanding of relationships, consent, and self-worth.
Abstract arguments about civil liberties need to be set against the voices of bereaved families who fought for the Online Safety Act. The parents of Molly Russell, Frankie Thomas, Olly Stephens, Archie Battersbee, Breck Bednar, and twenty other children who died following exposure to harmful online content did not campaign for theoretical freedoms – they campaigned for their children's right to life itself.
These families faced years of stonewalling from tech companies who refused to provide basic information about the content their children had viewed before their deaths. The Act now requires platforms to support coroner investigations and provide clear processes for bereaved families to obtain answers. This is not authoritarianism – it is basic accountability.
To repeal the Online Safety Act would indeed be a massive own-goal and a win for Elon Musk and the other tech giants who care nothing for our children’s safety. The protections of the Act were too hard won, and are simply too important, to turn our back on.
The conflation of regulating pornographic content with censoring legitimate information is neither accurate nor helpful, but we must remain vigilant against mission creep. As Victoria Collins MP and I have highlighted in our recent letter to the Secretary of State, supporting the Act’s core mission does not mean we should ignore legitimate concerns about its implementation. Parliament must retain its vital role in scrutinising how this legislation is being rolled out to ensure it achieves its intended purpose without unintended consequences.
There are significant issues emerging that Parliament must address:
Age Assurance Challenges: The concern that children may use VPNs to sidestep age verification systems is real, though it should not invalidate the protection provided to the majority who do not circumvent these measures. We need robust oversight to ensure age assurance measures are both effective and proportionate.
Overreach in Content Moderation: The age-gating of political content and categorisation of educational resources like Wikipedia represents a concerning drift from the Act's original intent. The legislation was designed to protect children from harmful content, not to restrict access to legitimate political discourse or educational materials. Wikimedia's legal challenge regarding its categorisation illustrates this tension. While Wikipedia's concerns about volunteer safety and editorial integrity are legitimate, its challenge does not oppose the Online Safety Act as a whole, but rather seeks clarity about how the site's unique structure should be treated under the regulations.
Protecting Vulnerable Communities: When important forums dealing with LGBTQ+ rights, sexual health, or other sensitive support topics are inappropriately age-gated, we risk cutting off vital lifelines for young people who need them most. This contradicts the Act’s protective purpose.
Privacy and Data Protection: While the Act contains explicit privacy safeguards, ongoing vigilance is needed to ensure age assurance systems truly operate on privacy-preserving principles with robust data minimisation and security measures.
The solution to these implementation challenges is not repeal, but proper parliamentary oversight. Parliament needs the opportunity, through post-legislative scrutiny, to examine whether Ofcom is interpreting the legislation in line with its original intent and whether further legislative refinements may be necessary.
A cross-party committee drawn from both Houses would provide the essential scrutiny needed to ensure the Act fulfils its central aim of keeping children safe online without unintended consequences.
Fundamentally, this approach aligns with core liberal principles. John Stuart Mill's harm principle explicitly recognises that individual liberty may legitimately be constrained to prevent harm to others.