Social media firms face fines in harmful content crackdown


2019-04-08 01:24

New rules will see social media companies being legally required to protect users, with bosses potentially held personally liable if they do not comply.

The proposed measures form part of a government plan to make the UK one of the safest places in the world to be online.

Charities and campaigners have been calling for greater regulation, following concerns over the growth of violent content, material encouraging suicide, disinformation and the exposure of children to cyberbullying and other inappropriate material.

Both groups have welcomed the plans, although a trade body has warned they may be too broad in scope to be effective.

The proposals on online harms, drawn up by the Home Office and Department for Digital, Culture, Media and Sport, say a regulator will be appointed to ensure companies meet their responsibilities.

These will be laid out in a new mandatory duty of care, which will require firms to take more responsibility for the safety of users and be more proactive in tackling the harm caused on their platforms.

The regulator - either a new body or an existing one such as Ofcom - will have the ability to hit companies with "substantial" fines, block access to their sites and "potentially impose liability on individual members of senior management".

Prime Minister Theresa May said the plans, contained in a government white paper that will now go out for a 12-week consultation, demonstrate that the age of self-regulation is over.

"The internet can be brilliant at connecting people across the world - but for too long these companies have not done enough to protect users, especially children and young people, from harmful content," she said.


"That is not good enough, and it is time to do things differently.

"We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe.

"Online companies must start taking responsibility for their platforms, and help restore public trust in this technology."

Home Secretary Sajid Javid said firms had a "moral duty" to protect the young people they "profit from".

"Despite our repeated calls to action, harmful and illegal content - including child abuse and terrorism - is still too readily available online," he said.


"That is why we are forcing these firms to clean up their act once and for all."

Children's charity NSPCC, which has campaigned for tough regulations, welcomed the proposals.

Chief executive Peter Wanless said they would mean the UK is a "world pioneer" in protecting children online.

Javed Khan, chief executive at Barnardo's, said they had "long called for new laws to protect children online, just as we do offline, so they can learn, play and communicate safely".

But Daniel Dyball, UK executive director at trade body the Internet Association, sounded a warning about the "extremely wide" scope of the proposals.


"The internet industry is committed to working together with government and civil society to ensure the UK is a safe place to be online," he said.

"But to do this, we need proposals that are targeted and practical to implement for platforms both big and small."

The proposals will cover any company that allows users to share or discover user-generated content or interact with each other online.

Ministers also want the regulator to be able to force social media companies to publish annual transparency reports on harmful content and how they are addressing it.

The likes of Facebook and Twitter already do this.

Responding to the proposals, Facebook's UK head of public policy Rebecca Stimson said: "We have responsibilities to keep people safe on our services and we share the government's commitment to tackling harmful content online.

"As Mark Zuckerberg [Facebook's founder] said last month, new regulations are needed so that we have a standardised approach across platforms, and private companies aren't making so many important decisions alone."


She added that although Facebook had trebled the number of staff tasked with identifying harmful material and continued to review its policies, "we know there is much more to do".

"New rules for the internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech," she continued.

"These are complex issues to get right and we look forward to working with the government and Parliament to ensure new regulations are effective."

The government's final proposals will be published after the consultation has ended.

(SKY NEWS)