Social media firms have ‘lost control’ of self harm material, Children’s Commissioner warns

In the wake of his comments, the Health Secretary Matt Hancock has written to social media and tech companies saying he is “appalled” at how easy it is to find suicide material on their sites.

He also warned that Parliament could “ban” access to them if they did not act.

In her letter, Ms Longfield said: “The tragic suicide of Molly Russell and her father’s appalled response to the material she was viewing on social media before her death have again highlighted the horrific amount of disturbing content that children are accessing online.

“I do not think it is going too far to question whether even you, the owners (of tech companies), any longer have any control over their content.

“If that is the case, then children should not be accessing your services at all, and parents should be aware that the idea of any authority overseeing algorithms and content is a mirage.”

She said that in recent years she had discussed with tech firms ways they could protect children, but that she was not convinced they were taking those efforts seriously.

“I have been reassured time and time again that this is an issue taken seriously,” said Ms Longfield.

“However, I believe that there is still a failure to engage and that children remain an afterthought.”

The Children’s Commissioner for England has statutory powers to protect children, including to demand data and information from public bodies.

Although her powers do not extend to private companies, Ms Longfield called on social media companies to voluntarily reveal the extent of self harm material being published on their sites as well as how many children and teenagers are viewing it.

Earlier this week Sir Nick Clegg, who has recently joined Instagram’s parent company Facebook as vice president of global affairs and communications, defended its approach to suicide and self harm material.

He said the company had been advised not to take down all such images as a matter of course, and that Facebook had saved thousands of lives by highlighting users’ suicidal behaviour to mental health charities and authorities.

Responding to the letter from the Children’s Commissioner, Facebook said: “We have a huge responsibility to make sure young people are safe on our platforms and working together with the Government, the Children’s Commissioner and other companies is the only way to make sure we get this right.”