Nearly 20% of Young Teens Encounter Explicit Content on Instagram

A recent internal Meta survey reveals that 19% of Instagram users aged 13 to 15 have been exposed to unwanted sexually explicit images on the platform. The data emerged during a federal lawsuit in California, where Meta faces growing accusations of harming young users through addictive designs and contributing to a mental-health crisis.

Internal Meta Survey Data

The previously undisclosed figures come from a 2021 Meta user survey made public on Friday as part of ongoing legal proceedings. Instagram CEO Adam Mosseri acknowledged in a deposition that the company does not routinely share these results, citing the inherent unreliability of self-reported data.

The findings are particularly alarming given that Instagram’s own policies ban nudity and explicit content for teen users, with limited exceptions for educational or medical purposes. Despite this, nearly one in five young users report encountering such images, with most of these exposures occurring in private direct messages.

Company Response and Legal Pressure

Meta spokesperson Andy Stone insists the company is “proud of the progress” made in addressing harmful content. However, the data suggests that a significant proportion of teen users are still exposed to unwanted material.

The lawsuit reflects a broader wave of legal scrutiny facing Meta, with thousands of similar cases filed in U.S. courts by plaintiffs who argue that the company prioritizes engagement over user safety.

Additional Disturbing Findings

The survey also revealed that roughly 8% of 13- to 15-year-old users have witnessed self-harm or threats of suicide on Instagram. The platform’s moderation challenges are further complicated by privacy concerns; as Mosseri stated, many users would object to Meta actively monitoring their private conversations.

The data underscores the ongoing struggle to protect young users from harmful content online, despite the stated intentions of social media companies. The legal pressure on Meta may force further changes in content moderation and platform design to mitigate these risks.