Instagram has agreed to ban graphic images of self-harm after objections were raised in Britain following the suicide of a teenager whose father said the photo-sharing platform had contributed to her decision to take her own life.
Instagram chief Adam Mosseri said Thursday evening that the platform is making a series of changes to its content rules.
He explained: "We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable in our community."
Mosseri said further modifications will be made.
"I have a responsibility to get this right," he said. "We will get better and we are committed to finding and removing this content at scale and working with experts and the wider industry to find ways to support people when they are most in need."
The call for changes was backed by the British government after the family of 14-year-old Molly Russell discovered material related to suicide and depression on her Instagram account following her death in 2017.
Her father, Ian Russell, said he believes the content Molly saw on Instagram played a leading role in her death, a charge that received broad attention in the British press.
The changes were announced after Instagram and other tech companies, such as Facebook, Snapchat, and Twitter, met with British Health Secretary Matt Hancock and representatives from the Samaritans, a mental health charity that works to prevent suicide.
Instagram will also remove non-graphic images of self-harm from search results.
Facebook, which owns Instagram, said in a statement that independent experts advise that it should "allow people to share admissions of self-harm and suicidal thoughts but should not allow people to share content promoting it."