    Instagram’s Teen Accounts still being served graphic content, investigation finds

    The app’s tighter controls are apparently not blocking explicit posts.

    By Chase DiBenedetto

    Instagram’s teen guardrails may be missing the mark.

    Credit: Jakub Porzycki / NurPhoto via Getty Images

    Following years of criticism for its handling of the youth mental health crisis, Instagram has invested heavily in beefing up its teen safety features, including an entirely new way for underage users to post, communicate, and scroll on the app. But recent tests of these new safety features suggest it may still not be enough.

    According to an investigation conducted by the youth-led nonprofit Design It For Us and watchdog Accountable Tech — later corroborated by Washington Post columnist Geoffrey Fowler — the platform continues to surface age-inappropriate, sexually explicit, and generally harmful content despite content control safeguards.

    In the study, “Gen-Z aged” representatives from Design It For Us tested five fake teen accounts on the app’s default Teen Account settings over a two-week period. In every case, the youth accounts were recommended sensitive and sexual content. Four of the five accounts were recommended content related to poor body image and eating disorders, and only one account’s algorithm surfaced what the nonprofit deemed “educational” content.

    The individual algorithms additionally recommended descriptions of illegal substance use, and sexually explicit posts using trendy, coded language slipped through the filters. Not all protections faltered, however: the platform’s built-in restrictions on messaging and tagging held up.

    “These findings suggest that Meta has not independently fostered a safe online environment, as it purports to parents and lawmakers,” the report reads. “Lawmakers should compel Meta to produce data about Teen Accounts so that regulators and nonprofits can understand over time whether teenagers are actually protected when using Instagram.”

    In a response to the Washington Post, Meta spokeswoman Liza Crenshaw said the test’s limited scope doesn’t capture the true impact of the app’s safety features. “A manufactured report does not change the fact that tens of millions of teens now have a safer experience thanks to Instagram Teen Accounts. The report is flawed, but even taken at face value, it identified just 61 pieces of content that it deems ‘sensitive,’ less than 0.3 percent of all of the content these researchers would have likely seen during the test.”

    Addressing an ongoing, platform-wide issue

    A June 2024 experiment by the Wall Street Journal and Northeastern University found that minor-aged accounts were frequently recommended sexually explicit and graphic content within the app’s video-centered Reels feed, despite being automatically set to the platform’s strictest content settings. The phenomenon was a known algorithmic issue for parent company Meta, which, according to internal documents, was flagged by employees conducting safety reviews as early as 2021. In a response, Instagram representatives said the experiments did not reflect the reality of how young users interact with the app.

    At that time, Instagram had yet to launch its new tentpole product, Teen Accounts, introduced as a more closely monitored way for younger users to exist and post online, with stronger content controls. Minor users are automatically placed into Teen Accounts when signing up for Instagram, which sets their page to private, limits messaging and the ability to stream live, and filters sensitive content out of feeds and DMs. Teens between the ages of 13 and 15 face even tighter reins on their app usage, and accounts that fall through the cracks are now being spotted and flagged by Meta’s in-house AI.

    More than 54 million teens have been moved into a restricted Teen Account since the initial rollout, according to Meta, and the vast majority of users under the age of 16 have kept the default, stricter security settings. And while the numbers show a positive shift, even Meta CEO Mark Zuckerberg admits there may be limits to how effectively the company can monitor its vast user base and complex algorithm.

    Chase joined Mashable’s Social Good team in 2020, covering online stories about digital activism, climate justice, accessibility, and media representation. Her work also captures how these conversations manifest in politics, popular culture, and fandom. Sometimes she’s very funny.
