Britain’s culture secretary has said she is “not ruling out” changing the online safety bill to allow regulators to prosecute social media bosses found to have failed to protect the safety of children.
Michelle Donelan told the BBC on Friday that she was open to making the changes demanded by dozens of Conservative MPs, saying she would take a “sensible approach” to their ideas.
Her comments came a day after Downing Street said it was considering measures backed by 36 Conservative MPs that could see executives jailed for up to two years for breaches of the law.
MPs who have signed the amendment include former Home Secretary Priti Patel and former Work and Pensions Secretary Iain Duncan Smith.
Labour also confirmed on Friday that it would back the amendment, increasing pressure on the prime minister to back down.
Lucy Powell, the shadow culture secretary, said: “Labour has been calling for criminal accountability for those who run these companies throughout the passage of the bill, and we will join forces across the house to strengthen it in this way.”
The amendment would give Ofcom, the communications watchdog, the power to prosecute executives of social media companies found to have broken the law. If ministers include it in the bill, it will be the third time the prime minister, Rishi Sunak, has bowed to demands from his own MPs, after U-turns on planning and onshore windfarms.
A Downing Street spokesperson said on Thursday: “Our aim is to hold social media platforms accountable for harmful content, whilst ensuring the UK remains a great place to invest in and grow a tech business. We are confident that we can achieve both. We will carefully consider all proposed amendments to the online safety bill and will set out the position when the report stage continues.”
The bill aims to crack down on a variety of online content that ministers believe is causing serious harm to users and was informed in part by testimony from Frances Haugen, a former Facebook employee who accused the company of repeatedly putting profits before user safety.
The bill will force companies to remove any content that promotes self-harm, depicts sexual violence, or facilitates suicide. It will also require companies to impose and enforce strict age limits and publish assessments of the risks their platforms pose to young people.
As currently drafted, the bill gives Ofcom the power to fine companies up to 10% of their global turnover for breaching the law. Ofcom will be able to prosecute executives only if they fail to cooperate with an investigation. However, this has angered many Conservative MPs, who believe the regulator should have more stringent powers.
The amendment, which has been signed by 37 MPs in total, would allow Ofcom to prosecute individual executives if they were proved to have colluded in or consented to breaching elements of the bill designed to protect the safety of children. Judges could impose prison sentences of up to two years.
The NSPCC has backed the amendment, demanding that “responsibility stops with senior management for the safety of our children”.
The father of Molly Russell, the 14-year-old girl who took her own life in 2017 after viewing harmful content related to suicide and self-harm on social media, has also backed the amendment.
Ian Russell said: “At Molly’s inquest, the world saw the scale of the incredibly distressing content she was exposed to as a vulnerable child suffering from mental health issues. No one has yet taken any personal responsibility for how social media contributed to her death.
“Including senior management accountability in the online safety bill is an opportunity to prevent this from happening again and focus the minds of tech bosses on ensuring their platforms are safe online spaces for children.
“I urge the culture secretary and the prime minister to listen to activists and a growing number of their own parliamentarians and accept this crucial amendment to the proposed legislation.”
Other changes to the bill, which has its report stage and third reading in the House of Commons next week, include modifying previous plans to address content viewed by adults that is harmful but falls below the threshold of criminality, such as cyberbullying and sexist and racist material.
Tech companies will need to clearly state in their terms and conditions how they will moderate such content. Users will also have the option to request that such content be filtered out when they use social media platforms.