Tech firms work with Samaritans on purging harmful content
Facebook, Google, Snapchat and Instagram should develop technology that can identify and tackle harmful content, including material promoting suicide, Health Secretary Matt Hancock will say today.
Representatives from the four social media giants have been summoned by the government to meet with the Samaritans to discuss how to purge self-harm videos and other content from the internet.
It comes three weeks after the government announced plans to make tech giants and social networks more accountable for harmful material online.
The behind-closed-doors meeting today will be the second involving social media firms, but will mark the first time the Samaritans have been involved.
The first summit in February resulted in Instagram agreeing to ban graphic images of self-harm from its platform.
Speaking ahead of the latest meeting, Mr Hancock said: "I want the UK to be the safest place to be online and give parents the confidence to know their children are safe when they use social media.
"As set out in our Online Harms white paper, the government will legislate to tackle harmful content online, but we will also work with social media companies to act now.
"I was very encouraged at our last summit that social media companies agreed normalising or glamorising of eating disorders, suicide and self-harm on social media platforms is never acceptable and the proliferation of this material is causing real harm."
Social media companies and the government have been under pressure to act following the death of 14-year-old Molly Russell in 2017.
The schoolgirl's family found material relating to depression and suicide when they looked at her Instagram account following her death.
In a statement, a spokesman for Facebook, which also owns Instagram, said: "The safety of people, especially young people, using our platforms is our top priority and we are continually investing in ways to ensure everyone on Facebook and Instagram has a positive experience.
"Most recently, as part of an ongoing review with experts, we have updated our policies around suicide, self-harm and eating disorder content so that more will be removed.
"We also continue to invest in our team of 30,000 people working in safety and security, as well as technology, to tackle harmful content. We support the new initiative from the government and the Samaritans, and look forward to our ongoing work with industry to find more ways to keep people safe online."
Ruth Sutherland, chief executive of the Samaritans, said there has been "a worrying growth of dangerous online content".
"There is no black and white solution that protects the public from content on self-harm and suicide, as they are such specific and complex issues," she added.
"That is why we need to work together with tech platforms to identify and remove harmful content whilst being extremely mindful that sharing certain content can be an important source of support for some.
"This partnership marks a collective commitment to learn more about the issues, build knowledge through research and insights from users and implement changes that can ultimately save lives."
(Sky News)