LONDON: Britain
has urged all social media platforms to join Instagram and curb self-harm posts
after a UK teen who went online to read about suicide took her own life. The
Facebook-owned picture sharing platform's ban on "graphic" content
was announced following a meeting Thursday between its global chief Adam
Mosseri and UK Health Secretary Matt Hancock. British teen Molly Russell took
her own life in 2017. The 14-year-old's social media history revealed that she
followed accounts about depression and suicide. The case sparked a vigorous
debate in Britain about parental control and state regulation of children's
social media use. British Prime Minister Theresa May welcomed the move
"and encourages other platforms to take the same
approach," her spokeswoman said Friday.
Molly's parents
did not directly blame Instagram for the loss of their daughter. But her father
Ian cited the easy access to such posts on Instagram and Pinterest, a newer site
also built around images, as a contributing factor that platforms must not
ignore. "The more I looked (into her online accounts), the more there was
that chill horror that I was getting a glimpse into something that had such
profound effects on my lovely daughter," Ian Russell told The Times last
month. He called Instagram's decision Thursday "encouraging". "I
hope that the company acts swiftly to implement these plans and make good on
their commitments," Ian Russell said.
Cry for help
Instagram's
Mosseri said the changes followed a comprehensive review involving experts and
academics on children's mental health issues. "I joined the company more
than 10 years ago and we were primarily focused on all of the good that came
out of connecting people," Mosseri told The Telegraph newspaper. "But
if I am honest we were under-focused on the risks of connecting so many people.
That's a lesson we have learned over the last few years."
Instagram has
never allowed posts that promote or encourage suicide or self-harm. But it will
now ban "graphic self-harm" images and remove references to less
explicit posts about people hurting themselves from its searches and
recommendations. It will also clamp down on hashtags (words preceded by a
"#" that mark a trending topic) relating to suicide. The measures are
meant to make such posts harder to find for depressed teens who might
have suicidal tendencies. "We are not removing this type of content from
Instagram entirely," said Mosseri. "We don't want to stigmatize or
isolate people who may be in distress and posting self-harm related content as
a cry for help."
'Careful regulation'
Social media
platforms are coming under increasing scrutiny as they expand in reach and
cultural influence. Facebook founder Mark Zuckerberg said last year that he
thought more regulation of the industry was "inevitable" because of
the internet's size. "My position is not that there should be no
regulation," Zuckerberg told an April 2018 US congressional hearing.
"But I also think that you have to be careful about regulation you put in
place." Instagram's Mosseri told The Telegraph that he supported statutes
being considered by the British government "as a concept".
"There is a
lot of regulation already," said Mosseri. "We think it's important to
collaborate with policymakers so that ... whatever legislation or processes
they put in place work, make sense." The UK government will this month
publish a "white paper" on harmful online behavior that will be used
as a guideline for possible oversight rules. "The task is to design a system
of oversight that preserves what is best and most innovative about the online
companies but which also insists that they do what they can to keep the users
of their services safe," Culture Minister Jeremy Wright wrote in The
Times. - AFP