New Report: Instagram Teen Accounts Fail to Deliver Promised Safety Features, Exposing Teens to Harmful and Distressing Content

Design It For Us partnered with Accountable Tech to release a new investigative report revealing that Instagram’s Teen Accounts do not reliably or adequately protect young users from harmful content, despite Meta’s public assurances. Using five new test Teen Accounts, we assessed the kinds of content that Instagram served. All of the accounts were shown “sensitive content,” including videos or posts that are sexual, promote negative body image and disordered eating, or are otherwise disturbing. Four out of five participants reported distressing experiences while using the Teen Account. The report was covered by the Washington Post.

READ THE REPORT.

Design It For Us co-chair Zamaan Qureshi released the following statement:

“The findings were staggering and extremely disheartening, and serve as yet another reminder that we cannot rely on Big Tech to self-regulate. Despite Meta’s promises, teens continue to be recommended addictive and damaging content, with virtually no accountability.

This research confirms what so many of us already know from experience: Instagram’s so-called protections for teens are not working. We have given Meta every opportunity to prove their system works, and what we found instead was a platform that continues to push disturbing and distressing content to young users under the guise of safety.”

Design It For Us co-chair Arielle Geismar released the following statement:

“This is exactly why we need our government to step up. Tech companies have had over two decades to self-regulate, and they have repeatedly failed to prioritize safety over profits. Teens are particularly vulnerable to algorithmic manipulation, and when those algorithms are repeatedly found to recommend sexual content or disordered eating content to 13-year-olds, that’s not a bug, it’s a business model. Federal legislation that requires safety by design is urgently needed.

We cannot trust Big Tech to hold itself accountable. That’s why we need lawmakers to act: to set real standards, enforce transparency, and protect our generation and future generations from the harms of unregulated digital platforms. The time to act is now.”

###
