Instagram tackles self-harm and suicide with new reporting tools, support options

While Twitter continues to struggle with rampant bullying and abuse, Instagram is stepping up its game when it comes to keeping users safe from harm. The Facebook-owned photo-sharing network this week began rolling out a new reporting tool that lets users anonymously flag friends' posts about self-harm. Flagging a post prompts Instagram to send the user in question a message offering support, including tips and access to a help line, all available directly in the app.

The system is fairly simple to use, but it addresses a serious need on a social network that continues to have a large teenage and young adult audience.

When you see a post where a friend is threatening self-harm or suicide, you may not always feel comfortable broaching the subject with them. Or you may not know what to say. In other cases, you may be following accounts of people you don’t know that well (or at all), and don’t feel it’s your place to speak out.

Instagram is now offering a different option. When you anonymously flag the post, your friend will be sent a support message that reads, "Someone saw one of your posts and thinks you might be going through a difficult time. If you need support, we'd like to help."

The recipient can then click through to see a list of support options: message or call a friend, access more general tips and support, or contact a help line, which varies based on the user's location. Forty organizations around the world are partners on the help line side of the system.

The company also says it worked with the National Eating Disorders Association, Dr. Nancy Zucker (Associate Professor of Psychology and Neuroscience at Duke) and Forefront, which is led by academic researchers at the University of Washington, on the new tools. Other organizations, including the National Suicide Prevention Lifeline and Save.org (US), Samaritans (UK), and beyond blue and headspace (Australia), provided input, too.

[Image: updated self-injury reporting flow]

What's interesting about Instagram's tool is that it isn't only triggered by anonymous reporting. Instagram's app will also direct users to the support message when they search the service for certain hashtags, such as the banned search term #thinspo, which is associated with eating disorders.

Dealing with abuse (and self-directed abuse) 

The move is one of several changes Instagram has made in recent days to limit abuse on its network. In September, it also made it possible for anyone to filter their comments using customizable blocklists, meaning users can keep certain explicit words or bullying terms from appearing in the comments on their Instagram posts.

Steps like these are critical to establishing the community's tone. Unregulated free speech can devolve into anonymous bullying, which is now so prevalent on Twitter that it has at least partially impacted the network's ability to find an acquirer. (Reportedly, Disney stepped away from an acquisition because of this problem.)

Posts about self-harm are a different matter than bullying, of course, but they fall under the broader umbrella of user protection and safety.

[Image: updated self-injury support tools]

These tools are about establishing a community where it's safe to share, but also one where certain types of sharing can be moderated for potential problems, whether that's someone posting harmful comments directed at another person or someone posting harmful comments directed at themselves.

Not having these sorts of limitations and policies in place can lead to dangerous results, especially when networks cater to a younger audience. For instance, the teen Q&A network Ask.fm became so well-known for bullying back in the day that it ended up being a contributing factor in a series of teen suicides.

Over the years, the major social media companies have learned from past tragedies to offer better ways to protect their users. Many partner with support organizations to craft and run PSAs when users search for harmful terms — Tumblr, Pinterest and Instagram have all done so. And most today use a combination of automated systems, human moderators and flagging tools to address problems that range from user-led searches for concerning terms and tags (like thinspo or suicide) to prompts that are triggered by users’ posts themselves.

Instagram's new flagging tools are modeled after those already in place on parent company Facebook, and they're an example of how the smaller network can benefit from the infrastructure Facebook already has in place to address problems related to self-harm.

Last year, Facebook launched an almost identical tool — right down to the wording — for Facebook users in the U.S. And earlier in 2016, it rolled this out to international users as well.

Along with the launch of the new tools, Instagram also partnered with Seventeen on National Body Confidence Day, a campaign focused on body image and self-confidence that is running now under the hashtag #PerfectlyMe. The November issue of Seventeen will also include 16 pages in support of body confidence and #PerfectlyMe.