
Exclusive: Instagram shows more ‘eating disorder adjacent’ content to vulnerable teens, internal Meta research shows

By Jeff Horwitz

NEW YORK (Reuters) - Meta researchers found that teens who reported that Instagram regularly made them feel bad about their bodies saw significantly more “eating disorder adjacent content” than those who did not, according to an internal document reviewed by Reuters. The posts shown to those users featured “prominent display” of chest, buttocks, or thighs, “explicit judgement” about body types and “content related to disordered eating and/or negative body image.”

While such material is not banned on Instagram, the researchers noted that parents, teens and outside experts have told Meta they believe it is potentially harmful to young users.

Meta surveyed 1,149 teens over the course of the 2023-2024 academic year about whether and how often they felt bad about their bodies after using Instagram. Researchers then manually sampled the content those users saw on the platform over a three-month period. The study showed that for the 223 teens who often felt bad about their bodies after viewing Instagram, “eating disorder adjacent content” made up 10.5% of what they saw on the platform. Among the other teens in the study, such content accounted for only 3.3% of what they saw.

“Teens who reported frequent body dissatisfaction after viewing posts on Instagram… saw about three times more body-focused/ED-adjacent content than other teens,” the authors wrote, referring to eating disorders, according to a summary of the research reviewed exclusively by Reuters.

In addition to seeing more eating disorder-adjacent content, researchers found, the teens who reported the most negative feelings about themselves saw more provocative content more broadly: content Meta classifies as “mature themes,” “risky behavior,” “harm & cruelty” and “suffering.” Cumulatively, such content accounted for 27% of what those teens saw on the platform, compared with 13.6% among their peers who hadn’t reported negative feelings.
META RESEARCHERS EXPRESSED CONCERN

The researchers stressed that their findings did not prove that Instagram was making users feel worse about their bodies. “It is not possible to establish the causal direction of these findings,” they wrote, noting the possibility that teens who felt bad about themselves could be actively seeking out that material.

The Instagram research, a redacted replica of which is published here and has not been previously reported, demonstrated that Instagram exposed teens who “report frequently experiencing body dissatisfaction” to high doses of content that Meta’s own advisors “have expressed support for limiting,” the writeup said.

In a statement, Meta spokesperson Andy Stone said the document reviewed by Reuters demonstrates Meta’s commitment to understanding and improving its products. “This research is further proof we remain committed to understanding young people’s experiences and using those insights to build safer, more supportive platforms for teens,” Stone said, noting the company’s recent announcement that it would attempt to show minors content in line with PG-13 movie standards.

In the study, Meta said that its existing screening tools – which were designed to catch violations of platform rules – failed to detect 98.5% of the “sensitive” content that the company believes might not be appropriate for teens. The finding was “not necessarily surprising,” the researchers wrote, because Meta had only recently begun building an algorithm to detect the potentially harmful content they were examining.

'ASSOCIATION' BETWEEN CONTENT AND 'FEELING WORSE ABOUT ONE'S BODY'

The study, which is marked “Do not distribute internally or externally without permission,” is the latest internal research to demonstrate an “association” between viewing fashion, beauty and fitness content and “reporting feeling worse about one’s body,” the document states.
The research was conducted as part of Meta’s efforts to understand interactions between users and its products, which rely on algorithms to determine which content to show to users.

In the United States, the company has faced state and federal investigations of Instagram’s effects on children, as well as civil suits by school districts alleging harmful product design and deceptive marketing of its platforms as safe for teens. Those suits have prominently cited past leaked internal research from Meta in which researchers expressed their belief that the platform’s content recommendations might be harmful to youth with existing body image issues.

Since July of this year, Stone said, the company has reduced the amount of age-restricted content shown to teenage Instagram users by half.

CONTENT WARRANTED TRIGGER WARNING

Jenny Radesky, a University of Michigan associate professor of pediatrics who reviewed Meta’s unreleased research at Reuters’ request, called the study’s methodology robust and its findings disturbing. “This supports the idea that teens with psychological vulnerabilities are being profiled by Instagram and fed more harmful content,” Radesky said. “We know that a lot of what people consume on social media comes from the feed, not from search.”

Past internal research at Meta “has demonstrated an association between frequency of reporting feeling worse about one’s body” and consuming Instagram fitness and beauty content, the writeup says.

Meta’s researchers wrote that teens, parents, pediatricians, external advisors and Meta’s own Eating Disorder & Body Image Advisory Council have urged Instagram to limit how much of that content it shows to teenagers, warning that it “may be detrimental to teen well-being, specifically by precipitating or exacerbating feelings of body dissatisfaction.” Such content nonetheless stops short of violating Meta’s rules against content that overtly promotes eating disorders.
The report included samples of problematic content, including images of skinny women in lingerie and bikinis, fight videos and a drawing of a crying figure scrawled with phrases including “how could I ever compare” and “make it all end.” One of the example posts showed a closeup of a woman’s lacerated neck. Though the content did not violate Meta’s rules, the researchers found it disturbing enough that they issued a “sensitive content” warning to colleagues reading their work.

(Reporting by Jeff Horwitz in Oakland, Calif. Editing by Kenneth Li and Michael Learmonth)

(The article has been published through a syndicated feed. Except for the headline, the content has been published verbatim. Liability lies with original publisher.)

Published by Inkhabar webdesk
