Recently in the UK, a father blamed Instagram for the suicide of his 14-year-old daughter, Molly Russell. In the wake of that allegation, the app announced it would roll out ‘sensitivity screens’ to block images and videos of self-harm. The UK’s Health Secretary, Matt Hancock, had also asked tech companies to purge their platforms of content showing people cutting themselves. The new changes go a step further.
In a blog post, Adam Mosseri, head of product at Instagram, said the platform is banning all images of cutting and hiding non-graphic images of self-harm, such as scarring, from search results, hashtags, and Explore. These images will not be removed entirely, because the platform does not want to stigmatise or isolate people who may be in distress and post self-harm-related content as a cry for help. “The changes were made following a comprehensive review with global experts and academics on youth, mental health and suicide prevention,” the post said.
The company acknowledges that it is difficult to determine whether an image depicts self-harm, encourages it, raises awareness, or is a cry for help. “It’s further complicated by the fact that not all people who self-harm have suicidal intentions, and the behaviour has its own nuances apart from suicidality,” Mosseri wrote, adding, “Right now, it’s not hard to find images of cutting or scarring on Instagram. Up until now, we’ve focused most of our approach on trying to help the individual, who is sharing their experiences around self-harm. We have allowed content that shows contemplation or admission of self-harm because experts have told us it can help people get the support they need. But we need to do more to consider the effect of these images on other people who might see them. This is a difficult, but important balance to get right. It will take time and we have a responsibility to get this right. Our aim is to have no graphic self-harm or graphic suicide-related content on Instagram while still ensuring that we support those using Instagram to connect with communities of support.”
He added that, in addition to investing in technology that helps detect such content, the app also relies on users to report self-harm posts.