You can find the current article at its original source at https://www.bbc.co.uk/news/technology-68225707


What the Online Safety Act is - and how to keep children safe online
Technology companies will have to take more action to keep children in the UK safe on the internet, following the introduction of the Online Safety Act.
The new rules come in during 2025 - but critics say they do not go far enough.
How much time do UK children spend online?
Children aged eight to 17 spend between two and five hours online per day, research by the communications regulator Ofcom suggests. Time spent online increases with age.
Nearly every child over 12 has a mobile phone and almost all of them watch videos on platforms such as YouTube or TikTok.
Four in five teenagers who go online say they have used artificial intelligence (AI) tools such as ChatGPT or Snapchat's MyAI.
About half of children over 12 think being online is good for their mental health, according to Ofcom.
But one in eight children aged eight to 17 said someone had been nasty or hurtful to them on social media or messaging apps.
The Children's Commissioner said that half of the 13-year-olds her team surveyed reported seeing "hardcore, misogynistic" pornographic material on social media sites.
What online parental controls are available?
Two-thirds of parents say they use controls to limit what their children see online, according to Internet Matters, a safety organisation set up by some of the big UK-based internet companies.
It has a list of parental controls available and step-by-step guides on how to use them.
On YouTube - the most popular platform for young people in the UK - parents who want to try to prevent their children seeing unsuitable material can set up the "kids" version, which filters out adult content.
Parents can also set up supervised accounts to review the content that older children using YouTube's main site can find and watch.
Supervision can also be set up on Facebook Messenger, via its Family Centre.
Its parent company Meta provides parental controls across its social media apps, such as daily time limits, scheduled break times and information about the content their child interacts with.
Instagram, also owned by Meta, has introduced teen accounts for under-18s, which are set to private by default.
Instagram does not let 13- to 15-year-old users make their account public unless they add a parent or guardian to their Teen Account.
Similarly, under-13s on Roblox must get parental consent in order to have private, in-game conversations.
TikTok says its family pairing tool lets parents decide whether to make a teenager's account private.
But such controls are not fool-proof. Ofcom data suggests that about one in 20 children uses workarounds.
What controls are there on mobile phones and consoles?
Phone networks may block some explicit websites until a user has demonstrated they are over 18.
Some also have parental controls that can limit the websites children can visit on their phones.
Android and Apple phones and tablets also have controls for parents.
These can block or limit access to specific apps, restrict explicit content, prevent purchases and monitor browsing.
Apple has Child Accounts and Google has Family Link. There are similar apps available from third-party developers.
Later in 2025, Apple says it will let parents share the age range linked to their child's account - rather than their date of birth - with app developers, to help them provide age-appropriate experiences.
Broadband services also have parental controls to filter certain types of content.
Game console controls also let parents ensure age-appropriate gaming and control in-game purchases.
How should you talk to your children about online safety?
Talking to children about online safety and taking an interest in what they do online is important, according to the NSPCC.
It recommends making discussions about it part of daily conversation, just like a chat about their day at school, which can make it easier for children to share any concerns they have.
What are the UK's child safety rules for tech companies?
The Online Safety Act aims to make social media firms and search engines protect children and adults in the UK from illegal, harmful material.
It became law in 2023, with duties for platforms coming into effect in 2025.
The Act will require platforms to show they are committed to removing illegal content, including:
Child sexual abuse
Controlling or coercive behaviour
Extreme sexual violence
Promoting or facilitating suicide or self-harm
Animal cruelty
Selling illegal drugs or weapons
Terrorism
Pornography sites will have to stop children viewing content by checking ages.
Duties to protect children from harmful content also include addressing harms disproportionately affecting women and girls, such as intimate image abuse and harassment.
The Act has also created new offences, such as:
Cyber-flashing - sending unsolicited sexual imagery online
Sharing "deepfake" pornography, where artificial intelligence is used to insert someone's likeness into pornographic content
It also makes it easier for bereaved parents to obtain information about their children from technology companies.
Ofcom, the regulator tasked with enforcing the Act, has been given additional powers to ensure companies comply with the rules.
The Act requires online platforms to assess if and where users - particularly children - may be exposed to certain types of illegal or harmful content on their services.
Platforms must then detail measures to prevent this, in accordance with Ofcom's codes and guidance.
Ofcom's chief executive Dame Melanie Dawes has warned that services failing to follow the rules could have their minimum user age raised to 18.
Why has the Online Safety Act been criticised?
Some parents of children who died after exposure to harmful online content have called the Online Safety Act "insufficient" and criticised its delays.
Bereaved parents including Ian Russell, father of Molly, and Esther Ghey, mother of Brianna, have said the legislation should impose tougher rules and a duty of care on tech firms.
Esther Ghey wants technology companies to make it harder for young people to access potentially harmful material online
In August 2024, following riots across the UK, London Mayor Sadiq Khan said the Act did not adequately tackle misinformation.
Some civil liberties groups meanwhile believe its content measures risk stifling free expression online.
Requirements for porn sites to use technology for "robust age checks" have also raised privacy and security concerns.
What have technology companies said about the new rules?
Meta and Snapchat said they already had extra protections for under-18s and highlighted their existing parental tools.
"As a platform popular with young people, we know we have additional responsibilities to create a safe and positive experience," a Snapchat representative said.
A Meta representative said it wanted young people "to connect with others in an environment where they feel safe".
"Content that incites violence, encourages suicide, self-injury or eating disorders breaks our rules - and we remove that content when we find it," they said.
A number of other technology companies contacted by BBC News declined to respond to the draft measures.