Instagram – New research into Instagram’s algorithms has revealed that the platform pushes questionable weight-loss content toward users who interact even briefly with fitness-related content online.
The study, called Pathways, was commissioned by the 5Rights Foundation, a charity that campaigns for tighter online controls for children, with the involvement of Revealing Reality, a multidisciplinary social research agency.
The research used a “mystery shopper” technique in which fake Instagram profiles, or avatars, were created to mimic real accounts held by children and teenagers. The fake accounts followed the same pages as real-life volunteer teenagers.
The avatars then “liked” a selection of posts so the researchers could measure how quickly Instagram’s algorithm pushed potentially damaging material into the platform’s Explore page.
The study found that when a teenage girl engaged, even briefly, with dieting-related content, her Explore tab would almost immediately begin displaying before-and-after weight-loss photos, along with content about weight-loss tips, exercise, and body sculpting.
The material often featured “noticeably slim” bodies and, in some cases, seemingly edited or distorted body shapes.
The research found similar results for teenage boys. When the study replicated the behavior of a real 14-year-old boy, the corresponding avatar’s Explore tab was eventually flooded with pictures of heavily edited and retouched models.
A spokesperson for Facebook, which owns Instagram, said the company was already taking steps to keep teens safe on the platform, including preventing adults from sending direct messages to teens who do not follow them.
However, the inherent design of the recommendation algorithms used by social networks such as Instagram can exacerbate such issues for teenagers.
“Especially considering that just one ‘like’ can generate thousands of pieces of related content,” said Lady Beeban Kidron, who chairs the 5Rights Foundation.
In a podcast, Alex Cooney, CEO and co-founder of Cyber Safe Kids, said that social media platforms are not deliberately trying to cause harm, but that their business models are designed to maximize engagement, which can lead to harmful consequences.
Cooney also said there are thousands of accounts belonging to children under the age of 13 who register with fake ages and are similarly subjected to this flood of potentially damaging content.
From September, social media companies operating in the UK will have to comply with stringent new rules aimed at creating a safer digital environment for children, including a requirement to present a child-friendly version of their service by default.