You can find the current article at its original source at https://www.bbc.co.uk/news/technology-68225707

What is the Online Safety Act and how can you keep children safe online?
Technology companies will have to take more action to keep children safe on the internet, following the introduction of the Online Safety Act.
But the new rules will not come in until 2025 - and critics say they do not go far enough.
So what steps can parents take now to make their children's digital lives as safe as possible?
How much time do UK children spend online?
Children aged eight to 17 spend between two and five hours online per day, research by the communications regulator Ofcom suggests. Time spent online increases with age.
Nearly every child over 12 has a mobile phone and almost all of them watch videos on platforms such as YouTube or TikTok.
Four in five teenagers who go online say they have used artificial intelligence (AI) tools such as ChatGPT or Snapchat's MyAI.
About half of children over 12 think being online is good for their mental health, according to Ofcom.
But there is a significant minority for whom that is not the case. One in eight children aged eight to 17 said someone had been nasty or hurtful to them on social media or messaging apps.
The Children's Commissioner said that half of the 13-year-olds her team surveyed reported seeing "hardcore, misogynistic" pornographic material on social media sites.
What online parental controls are available?
Two-thirds of parents say they use controls to limit what their children see online, according to Internet Matters, a safety organisation set up by some of the big UK-based internet companies.
It has a list of parental controls available and step-by-step guides on how to use them.
For example, parents who want to reduce the likelihood of their children seeing unsuitable material on YouTube - the most popular platform for young people in the UK - can set up the "kids" version, which filters out adult content.
For older children using the main site, parents can set up supervised accounts, which let them review the sites their children visit.
Supervision can also be set up on Facebook Messenger, via its Family Centre.
TikTok says its family pairing tool lets parents decide whether to make a teenager's account private.
Instagram's parental controls include daily time limits and scheduled break times, and can list accounts their children have reported.
But these controls are not fool-proof. Ofcom data suggests about one in 20 children uses workarounds.
What controls are there on mobile phones and consoles?
Phone networks may block some explicit websites until a user has demonstrated they are over 18.
Some also have parental controls that can limit the websites children can visit on their phones.
Android and Apple phones and tablets have apps and systems parents can use.
These can block or limit access to specific apps, restrict explicit content, prevent purchases and monitor browsing.
Apple has Screen Time and Google has Family Link. There are similar apps available from third-party developers.
Broadband services also have parental controls to filter certain types of content.
Game console controls also let parents ensure age-appropriate gaming and control in-game purchases.
How should you talk to your children about online safety?
Talking to children about online safety and being interested in what they do online is also important, according to the NSPCC.
It recommends making discussions about it part of daily conversation, just like a chat about their day at school, which can make it easier for children to share any concerns they have.
What are the new rules for technology companies?
The government says the Online Safety Act - due to come into force in the second half of 2025 - puts the onus on social media firms and search engines to protect children from some legal-but-harmful material.
Platforms will also have to show they are committed to removing illegal content, including:
child sexual abuse
controlling or coercive behaviour
extreme sexual violence
promoting or facilitating suicide or self-harm
animal cruelty
selling illegal drugs or weapons
terrorism
Pornography sites will have to stop children viewing content, by checking ages.
Other new offences have been created, including:
cyber-flashing - sending unsolicited sexual imagery online
sharing "deepfake" pornography, where artificial intelligence is used to insert someone's likeness into pornographic content
The act also makes it easier for bereaved parents to obtain information about their children from technology companies.
The regulator Ofcom has been given extra enforcement powers to ensure companies comply with the new rules and has published draft codes for them to follow.
It says the companies must reconfigure the algorithms deciding which content users see, to ensure the most harmful material does not appear in children's feeds and reduce the visibility and prominence of other damaging content.
Chief executive Dame Melanie Dawes warned any company failing to follow the rules could have their minimum user age raised to 18.
And Technology Secretary Michelle Donelan urged big tech to take the codes seriously:
"Engage with us and prepare," she said.
"Do not wait for enforcement and hefty fines - step up to meet your responsibilities and act now."
What have critics said about the new rules?
Some parents of children who died after exposure to harmful online content have called the new rules "insufficient" and criticised the delay before they come into force.
Esther Ghey wants technology companies to make it harder for young people to access potentially harmful material online
Ian Russell, father of Molly, and Esther Ghey, mother of Brianna, are part of a group of bereaved parents that signed an open letter to Prime Minister Rishi Sunak and Leader of the Opposition Sir Keir Starmer, calling for more action.
They want a commitment to strengthen the Online Safety Act in the first half of the next Parliament and mental health and suicide prevention added to the school curriculum.
"While we will study Ofcom's latest proposals carefully, we have so far been disappointed by their lack of ambition," they add in the letter.
What have technology companies said about the new rules?
Meta and Snapchat said they already had extra protections for under-18s and highlighted their existing parental tools.
"As a platform popular with young people, we know we have additional responsibilities to create a safe and positive experience," a Snapchat representative said.
A Meta representative said it wanted young people "to connect with others in an environment where they feel safe".
"Content that incites violence, encourages suicide, self-injury or eating disorders breaks our rules - and we remove that content when we find it," they said.
A number of other technology companies contacted by BBC News declined to respond to the draft measures.
Related Topics
Online Safety Bill
Social media
Internet privacy
Mobile phones
Young people
Parenting