What the Online Safety Act is - and how to keep children safe online
Technology companies will have to do more to protect children in the UK from harmful content online, under new safety measures announced by the media regulator as part of the Online Safety Act.
The new rules come into force during 2025 - but critics say they do not go far enough. Ofcom's own research found that 59% of 13 to 17-year-olds surveyed had seen "potentially harmful content" online in the previous month.
What does the Online Safety Act mean for children?
As part of implementing the Online Safety Act, the regulator has finalised a series of child safety rules which will come into force for social media, search and gaming apps and websites on 25 July 2025.
Ofcom says the rules will prevent young people from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography.
They are also designed to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.
Firms which wish to continue operating in the UK must adopt more than 40 practical measures, including:
changing the algorithms which determine what is shown in children's feeds to filter out harmful content
implementing stricter age verification methods to check whether a user is under 18
removing identified harmful material more quickly, and supporting children who have been exposed to it
identifying a named person in their company who is "accountable for children's safety", and annually reviewing how they manage risk to children on their platforms
Failure to comply could result in businesses being fined £18m or 10% of their global revenues, or their executives being jailed.
In very serious cases Ofcom says it can apply for a court order to prevent the site or app from being available in the UK.
Why has the Online Safety Act been criticised?
A number of campaigners want to see even stricter rules for tech firms, and some want under-16s banned from social media completely.
Ian Russell, chairman of the Molly Rose Foundation - which was set up in memory of his daughter who took her own life aged 14 - said he was "dismayed by the lack of ambition" in the codes.
Molly Russell took her own life in 2017 after being exposed to suicide and self-harm content on Instagram and Pinterest
The Duke and Duchess of Sussex are also calling for stronger protection from the dangers of social media, saying "enough is not being done".
They unveiled a temporary memorial in New York City dedicated to children who have died due to the harms of the internet. "We want to make sure that things are changed so that... no more kids are lost to social media," Prince Harry told BBC Breakfast.
The NSPCC children's charity argues that the law still does not provide enough protection for children on private messaging apps. It says the end-to-end encrypted services these apps offer "continue to pose an unacceptable, major risk to children".
On the other side, privacy campaigners complain the new rules threaten users' freedom.
Some also argue age verification methods are invasive without being effective enough. Digital age checks can lead to "security breaches, privacy intrusion, errors, digital exclusion and censorship," according to Silkie Carlo, director of Big Brother Watch.
How much time do UK children spend online?
Children aged eight to 17 spend between two and five hours online per day, according to research by the communications regulator Ofcom. Time spent online increases with age.
It found that nearly every child over 12 has a mobile phone and almost all of them watch videos on platforms such as YouTube or TikTok.
Four in five teenagers who go online say they have used artificial intelligence (AI) tools such as ChatGPT or Snapchat's MyAI.
About half of children over 12 think being online is good for their mental health, according to Ofcom.
But one in eight children aged eight to 17 said someone had been nasty or hurtful to them on social media or messaging apps.
The Children's Commissioner said that half of the 13-year-olds her team surveyed reported seeing "hardcore, misogynistic" pornographic material on social media sites. Children also said material about suicide, self-harm and eating disorders was "prolific" and that violent content was "unavoidable".
What online parental controls are available?
Two-thirds of parents say they use controls to limit what their children see online, according to Internet Matters, a safety organisation set up by some of the big UK-based internet companies.
It has a list of parental controls available and step-by-step guides on how to use them.
These include advice on how to manage teen or child accounts on social media, video platforms such as YouTube, and gaming platforms such as Roblox or Fortnite.
On YouTube - the most popular platform for young people in the UK - parents who want to try to prevent their children seeing unsuitable material can set up the "kids" version, which filters out adult content. Parents can also set up supervised accounts to review the content that older children using YouTube's main site can find and watch.
Supervision can also be set up on Facebook Messenger, via its Family Centre. Its parent company Meta provides parental controls across its social media apps, such as daily time limits, scheduled break times and information about the content their child interacts with.
Instagram, also owned by Meta, has introduced teen accounts for under-18s which set them to private by default - although some researchers have claimed they were able to circumvent the promised protections. Instagram does not let 13 to 15-year-old users make their account public unless they add a parent or guardian to their Teen Account.
Similarly, under-13s on Roblox must get parental consent in order to have private, in-game conversations.
TikTok says its family pairing tool lets parents decide whether to make a teenager's account private.
But such controls are not fool-proof - Ofcom data suggests that about one in five children are able to disable parental controls.
What controls are there on mobile phones and consoles?
Phone networks may block some explicit websites until a user has demonstrated they are over 18.
Some also have parental controls that can limit the websites children can visit on their phones.
Android and Apple phones and tablets also have controls for parents. These can block or limit access to specific apps, restrict explicit content, prevent purchases and monitor browsing.
Apple has Child Accounts and Google has Family Link. There are similar apps available from third-party developers.
Later in 2025, Apple says it will let parents share the age range linked to their child's account - rather than their date of birth - with app developers, to help them provide age-appropriate experiences.
Broadband services also have parental controls to filter certain types of content.
Game console controls also let parents ensure age-appropriate gaming and control in-game purchases.
Parents can limit purchases and access to age-restricted games in Nintendo Switch consoles.
How should you talk to your children about online safety?
Talking to children about online safety and taking an interest in what they do online is important, according to the NSPCC.
It recommends making discussions about it part of daily conversation, just like a chat about their day at school, which can make it easier for children to share any concerns they have.
What are the UK's child safety rules for tech companies?
The Online Safety Act aims to make social media firms and search engines protect children and adults in the UK from illegal, harmful material.
It became law in 2023, with duties for platforms coming into effect in 2025.
The Act will require platforms to show they are committed to removing illegal content, including:
Child sexual abuse
Controlling or coercive behaviour
Extreme sexual violence
Promoting or facilitating suicide or self-harm
Animal cruelty
Selling illegal drugs or weapons
Terrorism
Pornography sites will have to stop children viewing content by checking users' ages.
Duties to protect children from harmful content also include addressing harms that disproportionately affect women and girls, such as intimate image abuse and harassment.
The Act has also created new offences, such as:
Cyber-flashing - sending unsolicited sexual imagery online
Sharing "deepfake" pornography, where artificial intelligence is used to insert someone's likeness into pornographic content
It also makes it easier for bereaved parents to obtain information about their children from technology companies.
Ofcom, the regulator tasked with enforcing the Act, has been given additional powers to ensure companies comply with the rules.
It requires online platforms to assess if and where users - particularly children - may be exposed to certain types of illegal or harmful content on their services.
Platforms must then detail measures to prevent this, in accordance with Ofcom's codes and guidance.
Ofcom's chief executive Dame Melanie Dawes has warned that services failing to follow the rules could have their minimum user age raised to 18.
What other criticisms has the Online Safety Act faced?
Some parents of children who died after exposure to harmful online content have called the Online Safety Act "insufficient" and criticised its delays.
Bereaved parents including Ian Russell, father of Molly, and Esther Ghey, mother of Brianna, have said the legislation should impose tougher rules and a duty of care on tech firms.
Esther Ghey wants technology companies to make it harder for young people to access potentially harmful material online
In August 2024, following riots across the UK, London Mayor Sadiq Khan said the Act did not adequately tackle misinformation.
Some civil liberties groups meanwhile believe its content measures risk stifling free expression online.
Requirements for porn sites to use technology for "robust age checks" have also raised privacy and security concerns.