
Instagram apparently recommends sexual content to teenagers aged 13 and over

During the course of the investigation, investigators also came across videos containing nudity and, in one case, a series of videos depicting explicit and graphic sexual acts within minutes of the account being set up. Image credit: Pexels

Instagram is reportedly suggesting explicit reels to teenagers as young as 13, even if they are not actively looking for such content.

According to an investigative report by Laura Edelson, a professor at Northeastern University, and the Wall Street Journal, the Meta-owned social media platform has been recommending sexually explicit videos to teenagers. During the tests, which were conducted mainly between January and April of this year, both parties created new accounts registered as 13-year-olds to study Instagram’s recommendation behavior.

The results show that immediately after the accounts’ first login, Instagram began suggesting moderately suggestive videos, such as women dancing sensually or posing to emphasize their bodies.
Accounts that engaged with these videos by watching them while skipping others soon began receiving recommendations for more explicit content.

Some of the recommended Reels featured women miming sexual acts or offering to send nude photos to users who commented. Within minutes of an account being set up, investigators also came across videos containing nudity and, in one case, a series of videos depicting explicit and graphic sexual acts.

Within just 20 minutes of the first interaction, the recommended Reels section was dominated by creators of sexual content.

In contrast, similar tests conducted on TikTok and Snapchat found no recommendations for sexual content for teen accounts created on those platforms.

Even after the test accounts actively searched for age-inappropriate content and followed known creators of such videos, neither TikTok nor Snapchat recommended similar content to them.
The Wall Street Journal points out that Meta, Instagram’s parent company, had previously identified similar problems through internal investigations.

Despite these findings, Meta spokesperson Andy Stone dismissed the report, calling the tests “artificial experiments” that do not accurately reflect how teens use Instagram. He pointed to Meta’s efforts to reduce the amount of sensitive content teens see, claiming that significant reductions have been achieved in recent months.

In January, Meta rolled out major updates to protect teen users, automatically placing them in the platform’s most restrictive content control settings, which they cannot opt out of. Meta had introduced these updates shortly after an earlier experiment by the Journal.
Despite these measures, follow-up testing by The Wall Street Journal in June was still able to reproduce the worrying results.
