The Online Safety Act 2023 is a new set of laws to protect children and adults online. It's supposed to impose new safeguarding responsibilities on social media companies and search engines. Meta, Google and the rest will be required to prevent children from accessing harmful and age-inappropriate content, and to give adults more control over the types of content they see.
Ofcom is the regulator for online safety and intends to compile a Code of Practice for providers. Parents, though, are saying that Ofcom has put too much emphasis on the service providers' rights, and asking how age verification can be made any more reliable than simply asking someone whether they're over 18.
There's a reason. The Act became law on 26 October 2023. Nearly a full year later, Ofcom published its plans for enforcing it. Now, after another six months, it is thinking about bringing the new protections into effect. The government also needs to make some secondary legislation, and naturally that takes a lot longer than 18 months to sort out, unless someone thinks it's a vote catcher.
It's not all delay. As of 17 March 2025, service providers must have completed their assessments of the risk of illegal content appearing on their services. Whether they have or not is moot. As of 17 January 2025, porn platforms have to have robust age checks in place. You can judge for yourself whether or not that's happened. Ofcom also specified that, as of 16 April 2025, internet providers must have determined whether children are likely to access their services, and as of spring this year they must also risk-assess harm to children.
Once the regulations to set the thresholds have been laid and approved by Parliament, Ofcom will publish a register setting out which services fall into which categories and will publish further codes of practice for consultation. Ofcom expects to publish the register of categorised services in Summer 2025 and consult on the codes of practice and, where relevant, guidance for the additional duties on categorised services by early 2026.
New Online Safety Act offences
The criminal offences introduced by the Act came into effect on 31 January 2024. They cover:
- encouraging or assisting serious self-harm
- cyberflashing
- sending false information intended to cause non-trivial harm
- threatening communications
- intimate image abuse
- epilepsy trolling
These new offences apply directly to the individuals sending the content. They don't apply to the owners of the internet services, so Mark Zuckerberg and Elon Musk needn't lose any sleep, tonight or any other night.
The illegal content duties cover the removal and prevention of illegal content. Platforms need to think about how they design their sites to reduce the likelihood of them being used for criminal activity in the first place.
The kinds of illegal content and activity that platforms need to protect users from rightly include:
- child sexual abuse
- controlling or coercive behaviour
- extreme sexual violence
- extreme pornography
- fraud
- racially or religiously aggravated public order offences
- inciting violence
- illegal immigration and people smuggling
- promoting or facilitating suicide
- intimate image abuse
- selling illegal drugs or weapons
- sexual exploitation
- terrorism
Harmful to children
Depending on their age, some perfectly legal content could be harmful or age-inappropriate for children and platforms need to protect children from it.
Companies with websites likely to be accessed by children now have to block porn and content that encourages self-harm, eating disorders or suicide, as well as bullying, pictures of serious injury, violence, dangerous stunts and challenges, and drug-taking. Now. For the past 20 years, apparently, it's been absolutely fine.
What's this got to do with Helping Cards? There's a clue in the pictures. One of the biggest criticisms of the Act, apart from it being too little, too late, and the total impossibility of winning a court case against Meta on grounds of cost alone, is the role of the responsible adult. You need to know what your children are looking at online. And you can only do that by either looking at the same things, which isn't always practical unless you want to be up at 3am doom-scrolling with them, or by talking to them.
Grunts are not conversation. Homes don't look after themselves. Helping Cards help build the communication skills that can keep kids safe, along with their confidence and their capabilities.