The short version: Online Safety Act for games
I’ve read thousands of pages of legislation and statements from Ofcom to produce this super-condensed guide to the Online Safety Act and Ofcom’s regulatory framework as they apply to games.
This blog post is a massive oversimplification to provide the shortest version possible. I have also written a full in-depth guide which I encourage you to read. Link here.
The 10 things you need to know
1 - What is the OSA?
In 2023, the UK Government passed the Online Safety Act into law. It’s designed to make the internet safer for users in the United Kingdom, particularly children.
It aims to regulate two types of content: illegal content and harmful content. The goal is to protect children from both across online services. We cover the specific types of content in more detail a few sections below.
Ofcom were chosen by the UK Gov as the independent regulator responsible for implementing and enforcing the OSA. Ofcom have released their final guidance on the OSA (April 24th 2025), and they’ve set the deadline for all services in scope to be compliant by July 25th 2025.
2 - This impacts games globally.
The Online Safety Act impacts any and all service providers globally that have links to the United Kingdom. Those “links” are defined as:
You are a regulated “user-to-user service” if your game has links with the United Kingdom (4.2.a), which means that users from the United Kingdom are a target market for your game, children from the United Kingdom are likely to play your game, your game is capable of being used in the United Kingdom by children (4.6.a), or your game is likely to appeal to children within the UK.
So it doesn’t matter where you’re headquartered or where your game is developed - if the UK is a target market for your game, you have children (under-18s) in the UK who play your game, or your game is appealing to children, then you need to be compliant.
3 - Key dates and the work you need to complete
Children’s Access Assessments - Needed to be done January 2025
Illegal Content Safety Duties - Needed to be done March 2025
Children’s Risk Assessment - Needed to be done April 2025
Comply with Children’s Safety Duties - Due July 25th 2025 at the latest
Also, here are links to helpful and important info that you need to be aware of:
The Online Safety Act 2023 (the legislation from the UK Gov),
Ofcom’s statements on protecting children from harms online (with links to all 6 volumes and regulatory documents & guidance).
4 - Types of content that are regulated
You have to ensure children are protected from the following types of content:
Illegal content and illegal harms:
Offences against children including CSEA (Child Sexual Exploitation and Abuse), CSAM (Child Sexual Abuse Material), and Grooming.
Threats (including hate).
Abuse & insults (including hate).
Harmful content - Primary Priority Content (PPC)
Pornographic content,
Suicide content (which encourages, promotes, or provides instructions for suicide),
Self-injury content (which encourages, promotes, or provides instructions for an act of deliberate self-injury).
Harmful content - Priority Content (PC)
Content which is abusive or incites hatred based on characteristics,
Violent content (including fictional creatures),
Bullying content.
There are many more categories, but these are the most relevant to the gaming industry (as directly highlighted by Ofcom themselves). I strongly recommend you read our full guide on the Online Safety Act where we go into full detail on each type of content.
5 - Risk factors
Service type - gaming,
User base - age is the key factor here: not just whether children are on the service, but the distribution of ages and age bands matters a great deal too,
Functionalities - in-game user-to-user communication is the primary one for games: direct messaging and voice chat in particular, but also group chat functionality.
6 - Ofcom’s codes of practice and recommended measures
There’s loads of stuff you have to do here. Read our full guide for all the details.
The short version is that Ofcom’s codes of practice set out a long list of recommended measures you’re expected to adopt. There are many measures per area, and the areas include:
Governance and accountability for your service,
Age assurance (highly effective age assurance),
Content moderation,
Reporting and complaints,
Settings, functionalities, and user support,
User controls,
Terms of service.
As I say above - read our full guide on what you have to do here.
7 - Gaming specific codes
Almost all games will fall under PCU B5 (link here).
This means that you will have to add highly effective age assurance (HEAA) to your game to gate users away from functionalities that risk exposing them to regulated content (illegal content, PPC, PC, or NDC - non-designated content).
In almost all cases, games will have to block access to user-to-user communication until the user has completed the HEAA process (a rough sketch of what that gate looks like is below).
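To make that concrete, here’s a minimal sketch of the “block user-to-user communication until HEAA is complete” pattern. Everything in it (the types, statuses, and function names) is an assumption for illustration, not anything defined by the OSA or Ofcom’s codes, and a real implementation would enforce the check server-side.

```typescript
// Illustrative only - the names and statuses below are assumptions, not OSA/Ofcom terms.
type AgeAssuranceStatus = "not_started" | "pending" | "confirmed_adult" | "confirmed_child";

interface PlayerSession {
  playerId: string;
  ageAssurance: AgeAssuranceStatus;
}

// User-to-user communication (DMs, voice, group chat) stays off until the
// player has completed HEAA and been confirmed as an adult.
function canUseUserToUserComms(session: PlayerSession): boolean {
  return session.ageAssurance === "confirmed_adult";
}

function sendDirectMessage(session: PlayerSession, recipientId: string, text: string): void {
  if (!canUseUserToUserComms(session)) {
    // Prompt the player to complete age assurance instead of delivering the message.
    console.log(`DM blocked for ${session.playerId}: complete age assurance first.`);
    return;
  }
  // ...hand the message off to your chat backend here.
  console.log(`Delivering DM from ${session.playerId} to ${recipientId}: ${text}`);
}
```

The same check would sit in front of voice chat, group chat, and any other user-to-user functionality you offer.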
8 - Highly effective age assurance (HEAA)
This is a big topic - read our full guide for the real details.
The short version is that you have to use one of: KYC-style ID verification, age estimation, or a reusable digital identity service to gate functionality that can put children at risk.
Read our full guide for a much better answer, but this post is supposed to be a super-short version.
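That said, purely as an illustration of those three routes, here’s a hedged sketch of dispatching a player to one of them. The method names, result shape, and placeholder provider functions are all assumptions - in practice each branch would wrap whichever third-party age assurance vendor you integrate.

```typescript
// Illustrative only - these are placeholders, not real vendor APIs.
type HeaaMethod = "photo_id_kyc" | "facial_age_estimation" | "reusable_digital_id";

interface HeaaResult {
  method: HeaaMethod;
  isOver18: boolean;
  checkedAt: Date;
}

// Placeholder provider calls - each would wrap a real third-party service.
async function verifyWithPhotoId(playerId: string): Promise<boolean> {
  console.log(`KYC / photo ID check for ${playerId}`);
  return false; // real result would come from the vendor
}

async function estimateAgeFromSelfie(playerId: string): Promise<boolean> {
  console.log(`Facial age estimation for ${playerId}`);
  return false;
}

async function checkReusableDigitalId(playerId: string): Promise<boolean> {
  console.log(`Reusable digital ID check for ${playerId}`);
  return false;
}

async function runHeaaCheck(playerId: string, method: HeaaMethod): Promise<HeaaResult> {
  const checkedAt = new Date();
  switch (method) {
    case "photo_id_kyc":
      return { method, checkedAt, isOver18: await verifyWithPhotoId(playerId) };
    case "facial_age_estimation":
      return { method, checkedAt, isOver18: await estimateAgeFromSelfie(playerId) };
    case "reusable_digital_id":
      return { method, checkedAt, isOver18: await checkReusableDigitalId(playerId) };
    default:
      // Should be unreachable, but guards against unexpected values at runtime.
      throw new Error(`Unknown HEAA method: ${method}`);
  }
}
```

Whichever route you pick, the outcome feeds the same gate described in section 7: no confirmed adult result, no access to risky functionality.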
9 - Using HEAA to gate functionality that can expose children to harmful or illegal content
As touched on above, you have a legal responsibility to protect children from illegal and harmful content, and both the Online Safety Act and Ofcom have deemed that HEAA is the way to do this.
So you have to implement HEAA and use it to gate any in-scope functionality or content (user-to-user communication in the case of games).
10 - More changes are coming; you need to keep up to date.
Ofcom have stated that they’re bringing further guidance this summer (2025), which includes more regulation on grooming, abuse directed at women and girls, and other topics too.
Additionally, similar regulation is coming to Australia later this year, and to EU member states in the coming months and years.
This isn’t a one-off tick box exercise; this is something you will have to stay up to date on and iterate as the rules change and new regions/territories introduce their own versions.
The maximum fine for non-compliance is £18m or 10% of global turnover, whichever is greater. So don’t get it wrong.
Summary:
Games are regulated and you need to be compliant,
Under-18s need to be protected from illegal content, illegal harms, and harmful content. To do this, you have to use highly effective age assurance to keep children away from functionalities that may put them at risk.
In games, this almost exclusively comes down to user-to-user communication (messaging, voice chat, etc),
Also in games, almost all of the impacted content types come from toxic players.
This was a mega-condensed short version on the topic.
I’ve also written a resource which goes into way more detail, but still specifically focused on games. This includes all key quotes, links, and context for what games need to know to stay compliant. I strongly recommend you read that. It’s longer than this, but much shorter than reading thousands of pages from Ofcom and the Online Safety Act directly.