This article is from the source 'guardian'.

You can find the current article at its original source at https://www.theguardian.com/technology/2025/jul/24/uk-should-act-to-stop-children-social-media-addiction-dopamine-loops


UK should act to stop children getting hooked on social media ‘dopamine loops’
Peer Beeban Kidron says it is not ‘nanny state’ to stop firms that invest billions in making their platforms addictive from targeting under-18s
A leading online safety campaigner has urged the UK government to “detoxify the dopamine loops” of addictive social media platforms as tech companies prepare to implement significant child protection measures.
Beeban Kidron, a crossbench peer, asked the technology secretary, Peter Kyle, to use the Online Safety Act to bring forward new codes of conduct on disinformation and on tech features that can lead to children becoming addicted to online content.
“The secretary of state has a power under the Online Safety Act to bring forward new codes of conduct,” said Kidron. “We have urgently asked him to do so, but so far we have been rebuffed.”
Kidron added it was not “nanny state” behaviour to prevent companies that invest billions in making their platforms addictive from targeting under-18s. “It is up to ministers to use their powers to detoxify those dopamine loops – they have the power – so why not to do so right now?”
“Dopamine-like” measures identified by 5Rights, a campaign group founded by Kidron, include displaying the number of times a user’s post has been liked or shared, mobile phone notifications and showing content with expiry dates, such as Instagram’s stories feature.
Kidron spoke to the Guardian before the 25 July deadline for online platforms – including Facebook, Instagram, TikTok, YouTube, X and Google – to introduce child safety measures, and for pornography sites to bring in stringent age-checking.
Age-checking measures could also be required for social media sites that allow harmful content, such as X, which is the most popular source of pornography for young people, according to research published by the children’s commissioner for England, Dame Rachel de Souza. X announced on Thursday that if it was unable to determine whether a user was 18 or over, they would be defaulted into sensitive content settings and would not be able to view adult material.
Dame Melanie Dawes, Ofcom’s chief executive, said: “Prioritising clicks and engagement over children’s online safety will no longer be tolerated in the UK. Our message to tech firms is clear – comply with age-checks and other protection measures set out in our codes, or face the consequences of enforcement action from Ofcom.”
The changes mean that social media companies must, as a priority, prevent children from seeing pornography and harmful content that encourages suicide, self-harm or eating disorders. They must also suppress the spread of harmful content, such as violent, hateful or abusive material and online bullying.
Companies that breach the act face fines of up to 10% of global turnover, which in the case of Instagram’s parent company, Meta, would amount to $16.5bn. In extreme cases, sites or apps could be blocked in the UK. Tech executives could also be prosecuted if they ignored Ofcom demands to comply with child safety duties.
Ofcom has outlined a series of measures that comply with the child safety requirements. Those include: sites and apps having procedures for taking down dangerous content quickly; children having a “straightforward” way to report harmful content; and algorithms, which recommend content to users, filtering out harmful material.
X gave details of its age-checking measures on Thursday, including checks on whether a user has previously indicated that they are under 18 or whether the account was created in 2012 or earlier. Bluesky, Discord, Grindr and Reddit have also committed to age-gating measures. Ofcom will assess whether these approaches comply with the act.
Meta, the owner of Instagram and Facebook, says it has a multilayered approach in place that complies with age-checking requirements, including its teenager account feature – a default setting for users under 18 – that it says already provides an “age appropriate” experience for young users. TikTok, which argues it already prohibits the vast majority of content prohibited to children, is introducing new age-checking measures for certain restricted material from Friday.
Pornography providers such as Pornhub have committed to introducing stringent age checks from Friday. Measures recommended by Ofcom include: facial age estimation, where technology assesses a person’s likely age through a live photo or video; checking a person’s age via their credit card provider, bank or mobile phone network operator; photo ID matching, where a passport or similar ID is checked against a selfie; or a “digital identity wallet” that contains proof of age.