When you search the hashtag #자해 (the Korean word for 'self-injury') on Instagram, you will find more than five million pictures of human body parts covered in blood. Recently, as self-harm posts have become more widespread, social media companies have succumbed to calls for greater regulation of this potentially harmful content. For example, Adam Mosseri, the head of Instagram, announced a plan to delete all self-harm images on the platform. Such images were previously allowed as a form of free expression for the site's users, but they will now be banned. This is a step in the right direction: social media companies should be responsible for the content on their sites, since it often shapes the way people think and behave.
Instagram's decision to block self-harm images came after the UK Health Secretary, Matt Hancock, met with social media companies to discuss greater protection for the mental health of teenagers using their platforms. The meeting was prompted by the tragic death of 14-year-old Molly Russell, a British teenager who took her own life in 2017. Ian Russell, Molly's father, stated in an interview with the BBC that Instagram bore responsibility for her death, because after she died the family found search records related to depression and suicide on her account.
The problem is that it is not difficult to find posts on Instagram that could harm users' mental health. For example, if you search 'self-harm' on Instagram, a warning message appears, saying 'Posts with the words or tags you're searching for often encourage behavior that can cause harm and even lead to death.' But anyone can access the blurred content behind the warning message simply by clicking 'See Posts Anyway.' In other words, the warning messages are only nominally present and rarely serve as a real barrier to harmful posts. Furthermore, Instagram's sharing system can perpetuate the problem. Its search engine is designed to show images and videos related to a user's previous searches, so users who have looked up self-harm content before are constantly exposed to more of it. According to a Korean national campaign on suicide prevention, conducted last year in cooperation with the Ministry of Health and Welfare, the National Police Agency, and the Korean Suicide Prevention Center, Instagram hosted the largest share of suicide-related posts: 7,607 of the 13,338 posts identified.
First, protecting vulnerable users from an imperfect system should come before freedom of expression. Some may think that such posts are not harmful enough to warrant a sharing block at the cost of curtailing free expression, and it is fair to argue that images depicting self-harm are just a few photographs of deviant behavior among the millions posted by users, and therefore do not justify limiting the search system for everyone. However, although social media is a platform where people can express their thoughts freely, it is also true that harmful content can damage the mental health of vulnerable users and trigger negative thoughts and dangerous responses. According to Dr. U Virek, a consultant psychiatrist at Renai Medicity in Kochi, Kerala, teenagers may start emulating something they think they can relate to without understanding its consequences. And Dr. Gangadhar BN, Director of the National Institute of Mental Health and Neuro Sciences (NIMHANS) in Bengaluru, explained that teenagers' stress and pressure are amplified when such photos are shared on a digital platform. Therefore, it is crucial to strike a balance between respecting users' freedom of expression and protecting vulnerable people from potentially triggering posts, based on a social agreement about a minimum standard of harm.
Furthermore, social media companies should act responsibly regarding their content, given the significant influence they have on the decision-making of ordinary people. Regarding the plan to delete self-harm posts on Instagram, Adam Mosseri said, "We are not yet where we need to be on the issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe."
With the rapid development of technology, social media has become a powerful medium used by many people in modern society. Social media platforms should therefore do their best to make the online environment a safe place for young and vulnerable people. Social media executives have a special responsibility to provide sound and sensible content to the public; for example, they should strengthen regulations on inappropriate content to protect their users. Images related to suicide or violence should also be blocked, since they can trigger vulnerable users to act impulsively. For a safe online community, providers should always be aware of the potential influence of their social media products. Users, too, should take responsibility for their posts instead of hiding behind the banner of freedom of expression.