People under 18 could be banned from using social media apps including TikTok, Instagram, and Snapchat if the tech firms don't follow new Ofcom rules to protect children.
But what is Ofcom and why does it want to change what people see online? Here's a quick guide to get you up to speed.
Ofcom's draft guidelines follow the government's Online Safety Act, which aims to make technology companies take more responsibility for children's safety online.
Social media users have to be over 13, but Ofcom says children regularly see content promoting self-harm, pornography or violence on social media. Such content is banned by the biggest social networks, like TikTok and Instagram, but an increasing number of children have been hurt or have died after seeing harmful content online in recent years.
If tech firms don't change what children see, Ofcom says it will ban children under 18 from using social media altogether.
Ofcom is short for the Office of Communications. It makes sure companies and businesses such as TV channels, radio stations, postal services, social media apps and online platforms are treating people fairly, legally and safely.
Ofcom has published more than 40 guidelines it says sites and apps should follow to keep children safe online.
These include bringing in new ways to check how old users are, changing what young users see, removing harmful content more effectively, and helping users report it.
“Young people are fed harmful content on their feed again and again and this has become normalised, but it needs to change,” said Ofcom boss Dame Melanie Dawes, taking aim at “toxic” social media algorithms.
People post billions of messages, pictures and videos on social media every day, so companies like TikTok and Instagram use software called an algorithm to decide what users see.
It recommends content based on users' past behaviour on the app, their age and gender, or other information.
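To make the idea concrete, here is a deliberately simplified sketch of how a feed-ranking algorithm can work. Every name, topic and weight below is an illustrative assumption for this toy example, not any real platform's code, but it shows the key point: posts similar to what a user engaged with before get boosted, so feeds quickly narrow around past behaviour.

```python
# Toy sketch of a feed-ranking algorithm (illustrative only; not any
# real platform's implementation).

def score_post(post, user_history):
    """Score a post for a user: start from overall popularity, then
    boost topics the user has engaged with before."""
    score = post["popularity"]                      # baseline popularity
    score += 2.0 * user_history.count(post["topic"])  # personalisation boost
    return score

def rank_feed(posts, user_history):
    """Return posts ordered from highest to lowest score."""
    return sorted(posts, key=lambda p: score_post(p, user_history),
                  reverse=True)

posts = [
    {"id": 1, "topic": "cooking",  "popularity": 5.0},
    {"id": 2, "topic": "football", "popularity": 3.0},
    {"id": 3, "topic": "music",    "popularity": 4.0},
]
# This user has watched football content twice before.
history = ["football", "football"]
feed = rank_feed(posts, history)
# The football post jumps to the top despite lower raw popularity.
```

The feedback loop critics call “toxic” comes from exactly this kind of boost: each view adds to the history, which raises the score of similar posts, which produces more views of the same kind of content.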
Ofcom says the new rules will be a reality in the second half of 2025. But experts believe it will be very difficult to make sure both users and social media companies are following the rules, and some people are concerned the new rules could affect users' privacy or limit free speech.
There are also questions about how verifying someone's age will work, such as whether users will need to give their photo ID.
Snapchat and Meta, which owns Facebook, Instagram and WhatsApp, have both released statements saying they have extra protections for users who are under 18 and tools to help parents control what their children see.
But many companies have not responded to the BBCâs requests for comment.
The families of 12 children whose deaths have been linked to harmful online content spoke to the BBC after Ofcom published the rules.
They said the rules didn't go far enough to protect children and change happened too slowly, and another group of parents complained that Ofcom did not do enough to listen to parents.
More than anything, these campaigners are worried that more children will die if social media companies and the government don't do more.