There is no greater priority for the NSPCC than making sure our children are safe from abuse online. Children are being groomed at scale on social networks, child abuse images are being freely produced but not consistently taken down and vulnerable young people are being exposed to highly disturbing content that promotes or glorifies self-harm and suicide. Traumatised families are left to try and pick up the pieces.

This is a crucial moment.

Ministers are expected to set out plans for an Online Harms Bill within weeks and we will find out whether the Government is committed to delivering a bold and ambitious piece of legislation that sets the global standard for protecting children, or whether its proposals will fall short.

Since 2017 we have been campaigning for the Government to bring an end to self-regulation. It has failed, because for far too long, tech platforms have considered children’s safety to be outside their business models.

But these firms are not, and have never been, neutral actors. Their decisions exacerbate the risks to children. That’s why we’re calling for a Duty of Care that will bring in a legal responsibility to identify and act on the reasonably foreseeable risks that result from how their services are designed and run.

The stakes couldn’t be higher, and the argument for regulation has never been stronger. Lockdown has created a perfect storm for online abuse. We don’t yet know the true scale, but we do know young people spent longer on platforms with fewer moderators. We also know that offenders viewed Covid-19 as an opportunity to escalate abuse against young people.

Over recent months, we have seen an increase in calls about online child abuse to both Childline and our adult helpline. We have heard harrowing stories of young people being emotionally manipulated into sending sexually explicit images and then coerced into sending more.

Tech firms have been left utterly exposed. Their failure to design basic child protection into their services, and to invest in technology that could identify and disrupt abusers, meant that social networks could be exploited ruthlessly.

But, as with so much, the pandemic’s impact on online safety is a long-term one. Many more children will routinely use livestreaming and video chat services, which present sharply increased child abuse risks, and are often poorly moderated. We’re likely to see an increase in demand for child abuse material, and the grooming that fuels it, because of changes to working patterns that will result in more abusers working in unsupervised conditions from home.

We urgently need legislation that holds tech firms firmly to account.

I raised these risks with the Prime Minister at a summit on the hidden victims of Covid-19. He was adamant that this should not continue. He signalled his personal determination to legislate for ambitious regulation that could combat the growing scale and complexity of online abuse.

It’s now time for the Government to translate this vision into action – the age of self-regulation must come to an end through an online harms law that delivers meaningful and lasting change.

Today we are setting out six tests that the Government must meet if it is to deliver on its ambition to make the UK the safest place in the world to go online. These tests provide a roadmap to world-leading regulation that can protect children from entirely preventable online harm.

New laws must force companies to take a proactive approach to tackling child abuse, and set clear expectations that tech firms must work together to respond to a constantly evolving threat.

The Government must treat suicide and self-harm imagery with the seriousness it deserves, not just leave it to the platforms to set their own rules.

The regulator needs powers to hold companies to account. This must include criminal and financial sanctions on tech firms and on senior managers who make decisions that put children at risk.

The Government should commit to a user advocate for children, funded by the industry levy, to ensure a level playing field for children against powerful corporate interests. This reflects the well-established ‘polluter pays’ principle: that the companies responsible for harm should bear the financial costs of addressing it.

The chance to change the world for the better for our children doesn’t come around very often. In the Online Harms Bill, the Government has an opportunity to do just that. Produce a paper tiger and ministers will leave young people increasingly exposed to preventable abuse.

However, if the Government brings forward bold and ambitious legislation it will allow a generation – whose future is in many ways uncertain – to grab the opportunities of our digital future with both hands.

Peter Wanless is chief executive of the NSPCC