Just last week, Design It For Us, in partnership with the Heat Initiative and Parents Together Action, released damning new research revealing the continued failure of Meta’s Teen Accounts to protect young users from harm. Our polling found that teens are still regularly exposed to content promoting suicide, eating disorders, and drug paraphernalia, as well as sexually explicit and violent material. We also found that adults could still message young users and that teens were frequently funneled toward inappropriate or unsafe accounts.

This follows evidence we released in May 2025 finding that Meta did not always uphold its own policy standards for Teen Accounts. In fact, our testing revealed that 5 out of 5 of our test Teen Accounts were algorithmically recommended sensitive content, despite Meta’s default sensitive content controls being enabled. In response, Meta dismissed our concerns, brushing off this evidence as “unobjectionable” or consistent with “humor from a PG-13 film.”

Now, Instagram is making that dismissal their official policy. They’ve announced that Teen Account standards will align with “PG-13 movie ratings,” a rating that cautions some material may be inappropriate for children under 13. Compared to the protections Teen Accounts were meant to provide for users ages 13 to 17, the PG-13 framing itself is a downgrade. Meta’s Teen Account protections for users as young as 13 were intended to be the minimum level of protection, not the maximum. Now, Meta is codifying inadequate protections as the upper limit of what young users can expect.

Let’s be clear: Meta is not fixing the problem; they’re rebranding it. Instead of releasing data that shows how their product actually works for teens, Meta offers PR stunts and superficial tweaks: parent focus groups, vague content labels, and now movie ratings as policy. Even the Motion Picture Association, which administers the PG-13 rating, has distanced itself from Meta since this announcement.

“These platforms are designed to maximize engagement at all costs because that’s what drives profit. As long as that’s the foundation, safety will always be an afterthought – no matter how many performative measures they announce,” said Kendall Schrohe, Campaigns Lead at Design It For Us. “Meta is not interested in protecting teens. They’re interested in protecting their business model.”

Our work continues to lead this conversation, not just exposing the harm, but forcing the world’s biggest social media company to respond until meaningful change is made.

###

Design It For Us has been driving the fight to hold Big Tech companies like Meta accountable: 

  • In April 2025, Design It For Us joined Parents Together and the Heat Initiative to host a rally at Meta’s offices in New York City. We delivered a petition with more than 10,000 signatures from young people and parents demanding meaningful accountability from Meta and Mark Zuckerberg.
  • In May 2025, Design It For Us partnered with Accountable Tech to test the efficacy of Meta’s Teen Account protections. Youth volunteers from the coalition created test accounts and recorded their experiences using Teen Accounts over the course of two weeks. Our testing found that Meta’s policies did not always work, including that:
    • 5 out of 5 of our test Teen Accounts were algorithmically recommended sensitive content, despite Meta’s default sensitive content controls being enabled.
    • 5 out of 5 of our test Teen Accounts were algorithmically recommended sexual content.
  • On the one-year anniversary of the Teen Accounts rollout this past September, DIFU advocate Alia ElKattan stood alongside Meta whistleblowers and parents at a press conference to call out Meta’s persistent refusal to be transparent and accountable to its users.
  • This month, Design It For Us collaborated again with the Heat Initiative and Parents Together to release a study finding that Meta’s Teen Accounts were still failing to protect teens from harm:
    • Nearly 60% of the kids who received unwanted messages said they came from users they believed to be adults.
    • Nearly 40% of the kids who received unwanted messages said they came from someone who wanted to start a sexual or romantic relationship with them.