YouTube is to limit recommendations of certain health and fitness videos to teenagers, including those which "idealise" certain body types.
It says 13 to 17-year-old users will still be able to search for and view fitness-related content - but will not be encouraged into repeated viewing of similar videos.
YouTube says it is acting because of concerns that repeated exposure to such material can lead young people to develop "negative beliefs" about themselves.
Experts have welcomed the measure but say it needs to be accompanied by a "broader discussion" about fitness and health for young people.
YouTube's algorithm will usually recommend similar content for users to watch once they have finished a particular video, as well as displaying related videos on a sidebar.
The platform says this will no longer be offered to teens when they view certain types of content, such as videos which idealise particular body types or fitness levels.
YouTube said the measures were being taken after its Youth and Families Advisory Committee found that "teens are more likely than adults to form negative beliefs about themselves when seeing repeated messages about ideal standards in content they consume online."
However, the restrictions on what videos are offered will only be possible if the user is logged in to a YouTube account - and if they have registered an accurate date of birth.
The platform has no means of verifying the age users claim to be.
Dr Petya Eckler, a senior lecturer at the University of Strathclyde who studies the relationship between body image and social media, said she welcomed the announcement given "the link between use of social media by young people and perceptions of their bodies."
But she told the BBC more needed to be done.
"This should go hand in hand with a broader discussion of fitness and health within families and the idea that exercise is a great way to enhance our overall health and wellbeing and should not be done only for appearance reasons."
YouTube has also announced new ways for parents to keep track of their children’s activities on the platform.
Parents will be able to link their accounts with those of teenagers in their household in order to see their uploads, subscriptions and comments, and receive emails when they upload videos or start livestreams.
In May, Ofcom told tech firms to reformulate their algorithms to steer children away from what it called "toxic" material.