LONDON: Britain has urged all social media platforms to join Instagram and curb self-harm posts after a UK teen who went online to read about suicide took her own life. The Facebook-owned picture-sharing platform's ban on "graphic" content was announced following a meeting Thursday between its global chief Adam Mosseri and UK Health Secretary Matt Hancock. British teen Molly Russell took her own life in 2017. The 14-year-old's social media history revealed that she followed accounts about depression and suicide. The case sparked a vigorous debate in Britain about parental control and state regulation of children's social media use. British Prime Minister Theresa May welcomed the move "and encourages other platforms to take the same approach," her spokeswoman said Friday.

Molly's parents did not directly blame Instagram for the loss of their daughter. But her father Ian cited the easy access to such posts on Instagram and Pinterest, a newer site also built around images, as a contributing factor that platforms must not ignore. "The more I looked (into her online accounts), the more there was that chill horror that I was getting a glimpse into something that had such profound effects on my lovely daughter," Ian Russell told The Times last month. He called Instagram's decision Thursday "encouraging". "I hope that the company acts swiftly to implement these plans and make good on their commitments," Ian Russell said.

Cry for help

Instagram's Mosseri said the changes followed a comprehensive review involving experts and academics on children's mental health issues. "I joined the company more than 10 years ago and we were primarily focused on all of the good that came out of connecting people," Mosseri told The Telegraph newspaper. "But if I am honest we were under-focused on the risks of connecting so many people. That's a lesson we have learned over the last few years."

Instagram has never allowed posts that promote or encourage suicide or self-harm. But it will now ban "graphic self-harm" images and remove references to less explicit posts about people hurting themselves from its searches and recommendations. It will also clamp down on hashtags (words featuring a "#" that mark a trending topic) relating to suicide. The measures are meant to make such posts more difficult to find for depressed teens who might have suicidal tendencies. "We are not removing this type of content from Instagram entirely," said Mosseri. "We don't want to stigmatize or isolate people who may be in distress and posting self-harm related content as a cry for help."

'Careful regulation'

Social media platforms are coming under increasing scrutiny as they expand in reach and cultural influence. Facebook founder Mark Zuckerberg said last year that he thought more regulation of the industry was "inevitable" because of the internet's size. "My position is not that there should be no regulation," Zuckerberg told an April 2018 US congressional hearing. "But I also think that you have to be careful about regulation you put in place." Instagram's Mosseri told The Telegraph that he supported statutes being considered by the British government "as a concept".

"There is alot of regulation already," said Mosseri. "We think it's important tocollaborate with policymakers so that ... whatever legislation or processesthey put in place work, make sense." The UK government will this monthpublish a "white paper" on harmful online behavior that will be usedas guideline for possible oversight rules. "The task is to design a systemof oversight that preserves what is best and most innovative about the onlinecompanies but which also insists that they do what they can to keep the usersof their services safe," Culture Minister Jeremy Wright wrote in TheTimes. - AFP