Cheese, pasta, butter, jam, “liquid calories.” In one of her first Instagram posts, 14-year-old Ashleigh Ponder lays out all the foods she is afraid to eat. It was September 2013, and the beginning of her effort to document her struggle with anorexia — and her recovery — on Instagram.
Ponder says she owes her life to Instagram. At her lowest weight, she was just over 79 pounds. Now at 154 pounds, 17-year-old Ponder — under the username @balancednotclean — has become something of a healthy-lifestyle role model, promoting healthy eating and exercise to her more than 23,700 followers.
Ponder is among countless individuals who have turned to social media to document eating disorders and other kinds of self-harm. On Instagram alone, the hashtag #anorexia has more than 4.6 million posts. #suicide has more than 5.7 million.
Instagram recently joined parent company Facebook in taking steps to help its users suffering from eating disorders. The initiative borrows from a tool created by Facebook in 2015 to address self-harm and suicide. Instagram is the first social media platform to specifically address eating disorders.
With its new tool, Instagram allows users to report posts that they feel suggest self-harm. Instagram then reviews the flagged posts. (The company says there are hundreds of employees working around the clock, seven days a week, to field the reports.) If a post is deemed to qualify as “self-injury,” a message is sent to the reported user that offers three suggested options: “Talk to a friend,” “Contact a helpline” and “Get tips and support.”
Instagram has created two different pop-up windows, one for general self-injury and one specifically for eating disorders. Both windows carry the same three header messages, but the content is geared toward either general self-injury or eating disorders.
Users who seem to be at risk of self-injury are led to a page that suggests options: “Get outside,” “Be creative,” “Soothe your senses” and “Just relax.” Users who may be facing eating disorders are led through a series of images paired with phrases — a photo of a sunset reads “Take three deep breaths;” a picture of two cats looking out a window says “Invite someone to watch a movie.”
A message is also sent to the person who reported the concern, letting him or her know Instagram has reached out.
Certain search terms like “anorexia,” “suicide” and “cutting” will also trigger the messages. But this feature seems fairly basic. Hashtags such as #killme (with more than 2 million posts) and #killmenow (with more than half a million) do not trigger the warning.
Janis Whitlock, director of Cornell University’s Research Program on Self-Injurious Behaviors, has studied the interaction between mental health and social media-linked behavior. She says it’s not uncommon for people suffering from self-injury to write about it via social media.
“One of the things that’s abundantly clear is that people will disclose in social media and internet-based venues things that a lot of other people don’t know — maybe nobody in their life knows,” she said. She commends Instagram for attempting to help its users suffering from self-injury behaviors. “I applaud [Instagram] for making an effort to really effectively interact, to identify and capture people at the moment of their crisis,” she said.
But how effective is Instagram’s new tool, really, at addressing severe mental health issues like anorexia and suicide?
Whitlock says the tool is well thought out: giving people a moment to pause and breathe is often an effective way of preventing self-harm.
“For someone who self-injures, oftentimes if they can just pause the urge for even just 15 minutes, then the urge to injure will pass,” she said. “I would imagine for some subset if you provide these suggestions in just the right moment that it would get them through an urge.”
But Whitlock says she worries about user fatigue. “I also wonder what would happen if I’m a poster and I see this flow multiple times if it’ll start to just lose kind of efficacy with me,” she said. “So my one hypothesis would be that it would have short-term spikes in effectiveness because of the novelty effect.”
Ponder had similar concerns. “If someone is really trying and they keep getting reported for struggling, that can be really demotivating,” she said, noting any follower can report, including people who may not have any understanding of that person’s illness, history or experiences.
Terry Sandbek, a clinical psychologist specializing in eating disorders, says the tool is certainly better than nothing. It will at least help users not feel so alone, he says, which he noted was a “huge” problem among those with eating disorders.
As for the specific treatment, he says it’s right out of the psychology handbook. “It sounds like they’re taking a page out of the DSM-5 playbook in terms of emotional regulation,” he said. But he noted “it’s pretty thin,” and “a very simple start.” He also said it would probably only have the potential to influence users who are already interested in recovery.
Sandbek and Whitlock both expressed a wish to see Instagram conduct research on how the tool is working, and adapt it based on the findings.
Instagram needs “to be very cautious and see where this is going to go and what some of the results are going to be,” said Sandbek. And he suggested the company use “pretty strict monitoring and grabbing data to see what’s happening so that it doesn’t rebound in the wrong direction.”
Dr. Nancy Zucker, an associate professor of psychiatry and behavioral sciences at Duke University who collaborated with Instagram on the eating disorders tool, says Instagram initially came to her for advice about censoring content. Instead they collaborated on an interactive self-help tool.
Does censorship work as a prevention strategy? Research has been mixed, says Zucker. For people who are involved in communities focused on eating disorders, “it’s not a simple story. They’re getting support from other people, they feel very lonely, you know, there’s no one who understands what they’re going through.”
So at Zucker’s suggestion, Instagram collaborated on a different approach. “We wanted to be able to offer a moment of peace to these people that are suffering in the hopes that by having an emotional connection with someone out there in the world, that they might not even know, that might just give them a few seconds to stop and breathe and maybe get one step closer to enacting change,” said Zucker.
It does appear that Instagram has censored some hashtags — #selfharm and #selfinjury noticeably have “no posts.”
To develop the tools, Instagram says it worked closely with experts and organizations including the National Eating Disorders Association, University of Washington’s Forefront: Innovations in Suicide Prevention program, the National Suicide Prevention Lifeline, Save.org, Samaritans, BeyondBlue and Headspace, as well as with people who suffer from self-injury. Facebook cites the same organizations in creating its 2015 tool.
Nicky Jackson Colaco, director of public policy at Instagram, says the platform is committed to increasing safety. “This idea that Instagram can truly be an innovator when it comes to safety and well-being is something that is incredibly important from high-up levels of the company … So I think 2017 will be a year where we actively move that mission forward,” she said.
While some psychologists see Instagram’s new tool as “a good start” that remains somewhat thin, Zucker sees social media reporting as the future of mental health treatment. “As a mental health professional, you get overwhelmed by the reach that they have in the Instagram community. So if you can get a positive message out there, it is such a powerful medium to help people,” she said. “I think that it’s kind of where mental health treatment needs to go.”
The post Can Instagram’s new tool really help users who self-harm? appeared first on PBS NewsHour.