Smartphones in schools
Following the announcement of a new Safer Phones Bill calling for smartphones to be banned in schools in England by law, Lucy Blake and William R Weaver, partners at law firm Jenner & Block, offer their analysis
A Labour MP and former teacher, Josh MacAlister, has introduced a new private member’s bill – the Safer Phones Bill – which has now been debated in Parliament. The bill calls for parental consent to be required before platforms can obtain data from children aged under 16 – a move designed to shield children from personalised algorithms and addictive content. It also calls for a legal ban on phones in schools and for Ofcom to be given enhanced powers to protect children’s interests, as well as committing the government to a review of the sale of phones to children aged under 16.
The bill is intended to strengthen the existing Online Safety Act, which was passed by the UK Parliament last year. The Act imposes compliance duties on in-scope platforms to make them more responsible for users’ safety, and in particular to prevent children from accessing harmful and age-inappropriate content. The UK regulator Ofcom is still consulting on compliance guidance for in-scope companies, so most provisions of the law have yet to take effect. The new Labour government has promised a full review of the law, following criticism of its efficacy in tackling disinformation, which played a role in the riots that took place across the UK over the summer. The new bill amplifies the pressure on the UK government to take action, focusing in particular on children’s online safety.
Policing the internet, however, requires a global effort, and the UK is not alone in passing and considering new legislation. The EU adopted the Digital Services Act in 2022, which imposes compliance and transparency obligations on in-scope platforms designed to curb the spread of illegal content and protect users’ fundamental rights.
Over the summer this year, the US Senate passed the Kids Online Safety Act (KOSA), which is intended to create a duty of care requiring platforms to take reasonable steps to prevent harm to minor users, including limiting addictive features and offering the ability to opt out of personalised algorithmic recommendations. The US House of Representatives Energy and Commerce Committee advanced a version of KOSA out of committee in September, but the House version differs significantly from the Senate bill. Furthermore, the Republican leadership in the House has continued to express concerns about KOSA, so the House bill may stall over partisan (and Republican intra-party) disagreement about some of its provisions.
Even so, child online safety will continue to be a focus of interest and proposed legislation in the 119th Congress, which begins in January. Some states have also passed important child online safety legislation; California, for example, recently enacted a law designed to curb social media addiction among minors.
With countries around the world looking to legislate to tackle online harms, consistency will be key. In the past month, the UK and US governments have formed a joint working group on child online safety, which represents a renewed commitment by the governments of two of the world’s leading tech ecosystems to protect children online.
The working group’s aim to develop “common solutions, shared principles and global standards” is to be welcomed, particularly as global platforms will look to develop consistent systems and controls around the world. However, while noble in their intentions, the online safety laws in the UK, EU, US (and elsewhere) have been criticised for a lack of clarity and specificity as to the obligations on platforms, leading to a risk that companies may limit free speech by over-moderating.
Content moderation and platform governance decisions require a complex and fine balance between users’ rights to free speech, privacy and safety. As well as consistency, the working group should look to provide tangible and actionable direction to help global companies navigate their legal requirements, so they can work with the governments to ensure “a more secure digital world for young people”.
