
The focus of this article is the Online Safety Act 2023. Because the Act was so recently enacted, there is little research on its effectiveness; however, it is an important topic to explore when considering the media's influence on young people and their vulnerability to criminal activity. The passing of the Online Safety Act demonstrates the Government's awareness of the internet's role in shaping crime patterns, such as knife carrying, amongst young people. This edition of the magazine focuses on how early investment in young people can help reduce crime rates and, in turn, save public money by lowering spending on expensive rehabilitation. Given the current lack of evidence on the Act's effectiveness as a preventative measure, this article takes a future-facing outlook. I will explore how the Act currently keeps young people safe online and how effective investment could support the future of young people's safety online. Overall, I will discuss how effective investment in safe online spaces for young people could help keep many away from harmful trends such as knife carrying.
Why there is a need for initiatives that protect young people online
The ‘Draft Statement of Strategic Priorities for Online Safety’ states that young people are becoming increasingly present online.1 This, along with many published statistics about young people’s behaviour online, shows how these spaces can have a large impact on their behaviour in real life. In 2023, 96% of surveyed 3- to 17-year-olds had gone online; furthermore, by the age of 11, 9 in 10 of these young people owned a phone.2 Half of children aged 3 to 12 have at least one social media app, despite the minimum age requirement of 13. However, the most important statistic from this study concerns media literacy, a topic I will return to throughout this article. In the study, young people aged 12 to 17 were shown a mock-up social media profile and asked to identify whether the profile was real. 69% of them correctly identified it as fake, but 16% thought it was genuine and 15% were not sure.3 This shows that even though the online space is ever evolving and increasingly accessed by young people, their media literacy is not necessarily improving. Furthermore, an online consultation with 3,975 young people identified via Childline and NetAware found that 30% of respondents reported having seen violent or hateful content online.4 To further support the point that online moderation is necessary for young people today, a mixed-methods study cited in a rapid evidence assessment for the Online Safety Bill shows that social media can act as a catalyst and trigger for serious incidents of face-to-face violence between young people.5 Another Ofcom report states that ‘[they were] concerned about how online content can be used to encourage violence using weapons’ and that online spaces allow young people to normalise and glamorise carrying knives.6
To quote Internet Matters, ‘this milestone [the Online Safety Act] matters because the risks children face online remain high. Our latest [survey] shows that 3 in 4 children aged 9 to 17 experience harm online, from exposure to violent content to unwanted contact from strangers’.7 Overall, these statistics point to an ever-growing online landscape that plays a vital role in young people’s lives, underlining the necessity of effective online moderation.
What is the Online Safety Act?
The Online Safety Act is arguably one of the biggest online safety initiatives introduced so far in the UK. It appoints Ofcom as the enforcer of the Act and sets out many goals for the initiative, including many new laws governing the online space. One stated priority content concern is material that depicts or encourages serious violence or injury, which clearly targets offences involving weapons such as knives.8 Furthermore, for all content deemed inappropriate for those under the age of 18, age assurance technology is to be used to limit what content underage users can see. The government has stated many times that protecting young people is at the heart of the Online Safety Act.9 It has also marketed the Act as a preventative measure, stating that moderation is supposed to stop young people from seeing harmful content altogether. Ofcom has been given the power to fine non-compliant companies a proportion of their revenue and, in some cases, can even bring criminal action against senior managers.10 Currently, Ofcom has an ongoing case against the online messaging board site 4chan. This includes a £20,000 fine that Ofcom expects 4chan to pay, plus a daily fine of £100 for either 60 days or until 4chan provides Ofcom with the requested information.11 However, a key question is whether large companies with substantial revenue will really be affected by these fines, or whether they will simply absorb the cost and continue to operate as before.
How Ofcom will react to such nonchalant commercial defiance remains to be seen.
The final important point in this broad overview of the Online Safety Act is that services will be required to publish an annual report on online safety-related information, such as their algorithms and the content these cause users to see.12 This is particularly important for ensuring the protection of young people online.
How does the Online Safety Act currently set out to protect young people online?
The Online Safety Act sets out moderation guidelines for companies that aim to make their platforms safer by design. One of the main things this safer-by-design approach requires is that all report buttons are easy to find, use and understand.13 Furthermore, following a report, platforms should direct the user to any relevant helplines or charities.14 Any illegal harms reported to a service should be passed to law enforcement, along with the details.15 The guidelines further state that all users should be considered when creating platform safety features, especially those with low media literacy.16 As noted above, young people often lack media literacy, so by design this should make platforms consider young people in their approach to content moderation and reporting. Ofcom is also carrying out its own illegal content and harm checks, ensuring that automatic content checks occur before content is uploaded. This is important because we do not know how many young people (especially those already involved in harmful spaces) will use a platform’s report features.
These content checks and automatic moderation systems target many types of content, one of the most vital being content related to the sale of knives and other offensive weapons, with the aim of reducing the ease of obtaining these items.17 Finally, one very important provision of the Act is that parents and carers are able to request information from services following a young person’s death.18 Such a request allows coroners to access data that can help clarify how online activity may have contributed to the young person’s death.
Despite these advances, there are still many gaps within the Act, and key criticisms suggest that its protections do not sufficiently consider young people.
Faults and gaps within the Online Safety Act
One glaring fault within the Online Safety Act is that Ofcom is said to have only a small taskforce. Considering the vastness of the online space, a small taskforce is unlikely to be able to moderate many spaces.19 Ofcom states that it wants to keep up with evidence on the impact, prevalence, and types of content that affect young people of different ages, especially harmful trends (such as knife carrying), which places additional pressure on this small taskforce. Large-scale content moderation and surveillance of this kind would not only require considerable oversight by Ofcom but also strong collaboration between online platforms, Ofcom, and law enforcement. On content moderation, Ofcom states that it will continue to monitor progress and remain evidence-driven.20 This offers hope for the future as it assesses what is effective at stopping young people from accessing harmful content. However, a couple of things could make content moderation and blocking even harder: end-to-end encryption and Virtual Private Networks (VPNs). End-to-end encryption allows people to exchange private messages without outside moderation and is commonly used by messaging apps such as WhatsApp. This means that harmful spaces, many of which include young people, could continue to operate unchecked. VPNs also pose a problem for the age verification proposal, because they allow a person to access a website as if operating from a different region, bypassing any age verification.21 VPNs are easy to access and navigate, and many are free or obtainable for a small cost, providing an easy route to moderated content.
However, it is important to highlight that although VPNs are legal within the UK, since the Act passed, advertisements for and content promoting VPNs have been restricted.22 The article ‘Access denied: the UK online safety act misses its mark’ identifies such workarounds as one of the biggest threats to the effectiveness of the Online Safety Act.23 It suggests that the more you restrict internet access, the more determined and inventive young people become in finding ways around it. This is especially relevant when trying to moderate trends amongst young people, as trends spread not only online but also through peer influence. Once some young people find a way to access harmful content (such as spaces that sell offensive weapons or promote knife crime), they can show their peers how to access it too. Overall, this section highlights how difficult it can be to monitor young people’s online activities. It is also important to note that online spaces are only one channel through which harmful trends spread amongst young people. Funding initiatives for online safety can take a holistic approach, incorporating positive messages and reducing involvement in crime overall.
Future initiatives and developments to keep young people safe online
Although these changes are already underway, many questions remain about how effective they are at creating a safe online space for young people. This section will propose ideas for more effective online safety policies. The first to discuss is the proposed update to the National Curriculum.24 The National Curriculum has not been updated in a decade, despite significant developments in the online space and the increasing engagement of young people, making an update long overdue. It is important to mention that 16- to 19-year-olds are not covered by the National Curriculum, but their education has been reviewed as well. The proposed changes include boosting critical skills, meaning an increased focus on media and digital literacy, and improved computing education, including a Computing GCSE and a possible Level 3 qualification in Data Science and Artificial Intelligence (AI). However, this curriculum change is still at an early stage, so we do not yet know how significant these changes will be.
‘Policy and Rights Challenges in Children’s Online Behaviour and Safety’.25
The overarching focus of Andy Phippen’s book is the idea that young people want help, support, and education, not saviours.26 This highlights young people’s need for support from adults in their community, rather than from large companies and regulators far removed from their lives. They want help that is specific to their lives and experiences, which such bodies cannot provide.27 This support should come from within communities, from people such as teachers, who understand the area a young person lives in and the experiences they face. Furthermore, many of their peers will share those experiences, making the classroom a great place to provide this education and support. Effective support also cannot take a ‘one size fits all’ approach, as something that affects one young person might not affect another.28 This makes the National Curriculum change all the more significant, as it can be personalised to a young person’s experience, unlike the Online Safety Act, which takes a broader approach.
This shows that an effective moderation policy should consider the roles of family, the education system, social services, and broader societal influences on a young person; the current policy is oversimplified and assumes that the same forms of harmful content affect all young people.29 Effective moderation should also empower young people with the knowledge they need to navigate the online space effectively, avoiding the erasure of the positives that the online space can provide.30 Furthermore, the book argues that although these moderation tactics are marketed as preventative, they are actually reactionary, developed at a time when the online space is already having a huge impact on young people’s lives.31 The author also draws a comparison between online policies and the evolving drug policy landscape, arguing that a shift in online policy towards education, awareness, and support could yield more sustainable outcomes, such as those seen in drug policies and awareness schemes.32 This further shows how a change to the current curriculum could greatly improve the effectiveness of online safety policies. Finally, the most important point raised in the book is that media portrayals of safety issues focus on the most severe scenarios, creating a distorted view of the prevalence and nature of online harms.33 The result is policy that is not aimed at the everyday experience of many young people online: an overly restrictive policy that ignores the many positives the online space can provide.
Overall, this article has outlined how current online safety policies bring some benefit to young people. However, for the biggest impact we need increased investment in young people’s support systems and in education about the current digital landscape. Young people need the knowledge to navigate these spaces independently, and their safety needs much greater investment of time and money. These investments have some hope of coming to fruition, as a new curriculum is already under consideration; with the input of young people and their close support systems, it could make a huge impact. On a global scale, different governments are presenting their own ideas for keeping young people safe online, such as Australia’s recent social media ban for under-16s. This suggests that the Act will continue to evolve, as policy transfer steadily shapes the development of the online safety landscape globally. Finally, investment in online safety that takes context-specific approaches and targets outside factors could be hugely effective in preventing online spaces from drawing more young people into harmful trends.

References
[1] Department for Science, Innovation & Technology, ‘Draft Statement of Strategic Priorities for Online Safety’ (gov.uk, 2nd July 2025) https://www.gov.uk/government/publications/statement-of-strategic-priorities-for-online-safety/statement-of-strategic-priorities-for-online-safety
[2] Ofcom, ‘Children and parents: media use and attitudes report 2024’ (Ofcom.org, 19th April 2024) https://www.ofcom.org.uk/siteassets/resources/documents/research-and-data/media-literacy-research/children/children-media-use-and-attitudes-2024/childrens-media-literacy-report-2024.pdf?v=368229
[3] Ofcom, ‘Children and parents: media use and attitudes report 2024’ (Ofcom.org, 19th April 2024) https://www.ofcom.org.uk/siteassets/resources/documents/research-and-data/media-literacy-research/children/children-media-use-and-attitudes-2024/childrens-media-literacy-report-2024.pdf?v=368229
[4] C.A. Myers & N. Hudson, ‘Content and activity that is harmful to children within scope of the Online Safety Bill. A Rapid Evidence Assessment.’ (natcen.ac.uk, 27th May 2022) https://natcen.ac.uk/publications/content-and-activity-harmful-children-within-scope-online-safety-bill#:~:text=About%20the%20study&text=The%20review%20focused%20on%20harmful,give%20rise%20to%20eating%20disorders.
[5] C.A. Myers & N. Hudson, ‘Content and activity that is harmful to children within scope of the Online Safety Bill. A Rapid Evidence Assessment.’ (natcen.ac.uk, 27th May 2022) https://natcen.ac.uk/publications/content-and-activity-harmful-children-within-scope-online-safety-bill#:~:text=About%20the%20study&text=The%20review%20focused%20on%20harmful,give%20rise%20to%20eating%20disorders.
[6] Ofcom, ‘How the Online Safety Act will help to tackle knife crime’ (Ofcom.org, 20th May 2025) https://www.ofcom.org.uk/online-safety/protecting-children/how-the-online-safety-act-will-help-to-tackle-knife-crime
[7] Department for Science, Innovation and Technology, ‘Keeping children safe online: changes to the Online Safety Act explained’ (gov.uk, 1st August 2025) https://www.gov.uk/government/news/keeping-children-safe-online-changes-to-the-online-safety-act-explained
[8] Department for Science, Innovation & Technology, ‘Online Safety Act: explainer’ (gov.uk, 24th April 2025) https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer
[9] Department for Science, Innovation & Technology, ‘Online Safety Act: explainer’ (gov.uk, 24th April 2025) https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer
[10] Department for Science, Innovation & Technology, ‘Online Safety Act: explainer’ (gov.uk, 24th April 2025) https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer
[11] Ofcom, ‘Ofcom issues update on Online Safety Act investigations’ (Ofcom.org, 13th October 2025) https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/ofcom-issues-update-on-online-safety-act-investigations
[12] Department for Science, Innovation & Technology, ‘Online Safety Act: explainer’ (gov.uk, 24th April 2025) https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer
[13] Department for Science, Innovation and Technology ‘Live streaming: improve the safety of your online platform’ (gov.uk, 29th June 2021) https://www.gov.uk/guidance/live-streaming-improve-the-safety-of-your-online-platform
[14] Department for Science, Innovation and Technology ‘Live streaming: improve the safety of your online platform’ (gov.uk, 29th June 2021) https://www.gov.uk/guidance/live-streaming-improve-the-safety-of-your-online-platform
[15] Department for Science, Innovation and Technology ‘Understanding and reporting online harms on your online platform’ (gov.uk, 29th June 2021) https://www.gov.uk/guidance/understanding-and-reporting-online-harms-on-your-online-platform
[16] Department for Science, Innovation and Technology ‘Principles of safer online platform design’ (gov.uk, 29th June 2021) https://www.gov.uk/guidance/principles-of-safer-online-platform-design
[17] Ofcom, ‘How the Online Safety Act will help to tackle knife crime’ (Ofcom.org, 20th May 2025) https://www.ofcom.org.uk/online-safety/protecting-children/how-the-online-safety-act-will-help-to-tackle-knife-crime
[18] Department for Science, Innovation & Technology, ‘Draft Statement of Strategic Priorities for Online Safety’ (gov.uk, 2nd July 2025) https://www.gov.uk/government/publications/statement-of-strategic-priorities-for-online-safety/statement-of-strategic-priorities-for-online-safety
[19] Department for Science, Innovation & Technology, ‘Online Safety Act: explainer’ (gov.uk, 24th April 2025) https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer
[20] Department for Science, Innovation & Technology, ‘Draft Statement of Strategic Priorities for Online Safety’ (gov.uk, 2nd July 2025) https://www.gov.uk/government/publications/statement-of-strategic-priorities-for-online-safety/statement-of-strategic-priorities-for-online-safety
[21] Elly Rostom, ‘Access Denied: The UK Online Safety Act Misses Its Mark’ (cepa.org, 18th August 2025) https://cepa.org/article/access-denied-the-uk-online-safety-act-misses-its-mark/
[22] Department for Science, Innovation and Technology, ‘Keeping children safe online: changes to the Online Safety Act explained’ (gov.uk, 1st August 2025) https://www.gov.uk/government/news/keeping-children-safe-online-changes-to-the-online-safety-act-explained
[23] Elly Rostom, ‘Access Denied: The UK Online Safety Act Misses Its Mark’ (cepa.org, 18th August 2025) https://cepa.org/article/access-denied-the-uk-online-safety-act-misses-its-mark/
[24] Department for Education, ‘What you need to know about the changes to the National Curriculum’ (educationhub.blog.gov, 5th November 2025) https://educationhub.blog.gov.uk/2025/11/what-you-need-to-know-about-the-changes-to-the-national-curriculum/
[25] Andy Phippen, Policy and Rights Challenges in Children’s Online Behaviour and Safety (2017–2023), pp. 149–163 https://link.springer.com/book/10.1007/978-3-031-80286-7#:~:text=About%20this%20book,claim%20to%20wish%20to%20protect.
[26] Andy Phippen, Policy and Rights Challenges in Children’s Online Behaviour and Safety (2017–2023), pp. 149–163 https://link.springer.com/book/10.1007/978-3-031-80286-7#:~:text=About%20this%20book,claim%20to%20wish%20to%20protect.
[27] Andy Phippen, Policy and Rights Challenges in Children’s Online Behaviour and Safety (2017–2023), pp. 149–163 https://link.springer.com/book/10.1007/978-3-031-80286-7#:~:text=About%20this%20book,claim%20to%20wish%20to%20protect.
[28] Andy Phippen, Policy and Rights Challenges in Children’s Online Behaviour and Safety (2017–2023), pp. 149–163 https://link.springer.com/book/10.1007/978-3-031-80286-7#:~:text=About%20this%20book,claim%20to%20wish%20to%20protect.
[29] Andy Phippen, Policy and Rights Challenges in Children’s Online Behaviour and Safety (2017–2023), pp. 149–163 https://link.springer.com/book/10.1007/978-3-031-80286-7#:~:text=About%20this%20book,claim%20to%20wish%20to%20protect.
[30] Andy Phippen, Policy and Rights Challenges in Children’s Online Behaviour and Safety (2017–2023), pp. 149–163 https://link.springer.com/book/10.1007/978-3-031-80286-7#:~:text=About%20this%20book,claim%20to%20wish%20to%20protect.
[31] Andy Phippen, Policy and Rights Challenges in Children’s Online Behaviour and Safety (2017–2023), pp. 149–163 https://link.springer.com/book/10.1007/978-3-031-80286-7#:~:text=About%20this%20book,claim%20to%20wish%20to%20protect.
[32] Andy Phippen, Policy and Rights Challenges in Children’s Online Behaviour and Safety (2017–2023), pp. 149–163 https://link.springer.com/book/10.1007/978-3-031-80286-7#:~:text=About%20this%20book,claim%20to%20wish%20to%20protect.