Instagram wrestles with mental health concerns

- March 15, 2019
- Updated: July 2, 2025 at 5:13 AM


Earlier this month, Instagram announced it would be cracking down on self-harm imagery. Or rather, the company is doubling down on the sensitive content filters it implemented in 2017.
The filters, or “sensitivity screens,” blur content related to suicide or self-harm; users can tap the screen to view the image. The idea is to reduce the likelihood of stumbling upon unwanted or harmful content.
Since then, Instagram has been dubbed the worst social media platform for mental health, particularly among young people.
From body dysmorphia to eating disorders, self-harm, and low self-esteem — the platform is an undeniable minefield for all kinds of mental health issues.
The initial filters didn’t ban images of cutting or other forms of self-harm, as Instagram didn’t want to ban posts from those who were struggling.
While most of the dialogue surrounding social media and mental health has focused on things like FOMO and body image issues, the conversation has shifted toward what content should be banned in an effort to protect vulnerable users.
Here’s a little more about Instagram’s push toward a healthier platform, and why it’s happening.
Why is Instagram increasing controls on sensitive content?
The decision to ramp up controls over self-harm posts came two years after 14-year-old Molly Russell took her own life. Molly’s parents believe that content found on Pinterest and Instagram may have played a role in her death. Her father, Ian Russell, has been vocal about the failure of social media companies to protect their young users.
In the wake of the tragedy, UK health secretary Matt Hancock warned Facebook (which, of course, owns Instagram) that he would take legal action to protect young people from harmful content. In a letter to “social media bosses,” Hancock said it is appalling how easily young people can access content that “leads to self-harm and suicide.”
Adam Mosseri, the head of Instagram, responded by announcing that they would roll out sensitivity screens blocking images that depict self-harm.

“We still allow people to share that they are struggling,” Mosseri stated. The app’s policy only bans images that “glorify” self-harm. The platform does, however, hide posts that reference self-harm or include non-graphic depictions of it.
As it stands, the app bans content that promotes self-harm or suicide. However, Instagram has struggled to control what its recommendation algorithm surfaces, and graphic imagery slips through the filters anyway.
Instagram’s algorithm is set up so users receive personalized recommendations based on past activity. So, despite Instagram’s efforts, the platform guides users seeking self-harm content toward other posts/accounts that align with their “interests.”
Mosseri admits in an op-ed that the platform hasn’t done enough to keep these images out of feeds. While the 2017 effort was a step in the right direction, this latest move involves allocating more resources to fighting back — content moderators and engineers.
Is it Instagram’s responsibility to ban sensitive content?

Early reports of the tool in action showed that Instagram did censor certain hashtags. Obvious ones like #selfharm or #selfinjury came with the sensitive content label, but users could simply coin new hashtags and circumvent the filters.
Like some of the biased algorithms that have made headlines recently, these problems aren’t evident until something bad happens. A tool isn’t programmed to have racist inclinations or built-in gender bias, but once it’s out in the wild, dealing with real people, it doesn’t have the “training” to respond appropriately to every situation.
It’s troubling that the platform has been recommending sensitive content to a vulnerable audience. But it’s unclear if Instagram’s ban will extend to accounts that raise awareness for mental health issues or speak out about their experience with depression.
As we’ve seen with Facebook’s crackdown on fake news and political ads, many accounts may find themselves unfairly penalized. Still, it’s a step in the right direction.
Grace is a painter turned freelance writer who specializes in blogging, content strategy, and sales copy. She primarily lends her skills to SaaS, tech, and digital marketing companies.