
Instagram Algorithm Recommends Sexualized Content To Adult Accounts Following Teen Influencers

By Kate Anderson

Instagram’s algorithms recommended a stream of sexualized content to adult users who primarily follow preteen and teen influencer accounts, according to an investigation by The Wall Street Journal published Monday.

A June WSJ investigation revealed that Instagram’s algorithms enable child predators to connect with each other by promoting content to users with similar interests.

This time around, the WSJ set up several adult test accounts that followed young gymnasts, cheerleaders and influencers and found that Instagram’s algorithms suggested a mix of sexually explicit adult videos and questionable content featuring children.


The test accounts were also shown ads for companies like Walmart and Pizza Hut after videos of sexually explicit content, according to the WSJ. One Walmart ad was reportedly shown to a test account after a video of a woman exposing her genital area.

Many companies require that their advertising not run next to sexual or explicit content, according to the WSJ.

When the test accounts also followed users who followed similar accounts, the WSJ found that the volume of risqué content suggested increased, including a video of a clothed young girl touching her torso and another of a child mimicking a sex act.

Instagram also allegedly served the test accounts ads for dating apps, massage parlors offering “happy endings” and cybersex chatbots, according to the WSJ. An ad for the dating app Bumble appeared before a video of a person stroking a life-size latex doll and after another video of a girl, her face digitally obscured, lifting her shirt over her stomach.

A spokesperson for Bumble told the Daily Caller News Foundation that the company “would never intentionally advertise adjacent to inappropriate content” and that they had taken steps to ensure their ads were not “appearing in violation of our agreement with Meta.”


“We take this matter very seriously and will continue to take every measure to ensure our brand advertising is consistent with our brand values,” the spokesperson said. “Bumble is a platform for people ages 18 and up. Bumble has a zero tolerance policy towards any form of child sexual exploitation and abuse.”

One ad urging users to visit Disneyland was followed by a video of a woman initiating sexual acts with her father, according to the WSJ. Another ad, for the erectile dysfunction medication company Hims, appeared not long after a reel of a woman in a sexual position with a link to the “full video.”

Meta told the DCNF that the results of the WSJ’s investigation were a “manufactured experience” and that they do “not represent what billions of people around the world see every single day” on Instagram.

“We don’t want this kind of content on our platforms and brands don’t want their ads to appear next to it,” a Meta spokesperson said in a statement. “We continue to invest aggressively to stop it – and report every quarter on the prevalence of such content, which remains very low. Our systems are effective at reducing harmful content and we’ve invested billions in safety, security and brand suitability solutions.”

Disneyland, Walmart, Hims and Pizza Hut did not immediately respond to the DCNF’s request for comment.
