On Monday, Ofcom released its first online safety codes, four months ahead of its deadline. The early release gives companies extra time to prepare for the Online Safety Act’s requirements.
The consultation drew more than 200 responses, including input from charities, law enforcement, and civil society groups. These contributions shaped safety measures targeting online harms such as terrorism, fraud, hate speech, and child exploitation.
The codes mark an urgent step towards regulating online platforms and prioritising user safety in digital spaces.
What Changes Will Tech Companies Need To Make?
Platforms must complete risk assessments for illegal content by 16 March 2025. From 17 March, they are expected to apply measures such as improved content moderation, better reporting tools, and advanced detection systems.
Larger platforms face additional obligations, such as appointing a senior individual responsible for compliance and using automated tools to detect illegal material, including child abuse content.
These platforms must also introduce stricter safety measures to protect children, such as restricting their interactions with unknown users.
Smaller companies may find this transition more difficult due to limited resources. Ofcom has promised guidance to help them meet the rules, but concerns remain about how smaller platforms will adapt.
Dame Melanie Dawes, Ofcom’s Chief Executive, commented: “For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today.
“The safety spotlight is now firmly on tech firms and it’s time for them to act. We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.
“Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.”
How Will These Rules Improve Online Safety?
The new measures target online risks such as grooming, harassment, and scams, marking a major step towards improving overall safety on the internet. For children, the protections aim to limit contact with strangers, make their profiles less visible and discoverable, and block unsolicited messages.
Women and girls, who are frequently targets of abuse online, will have better tools to block and report offenders. Platforms will also be required to remove intimate images shared without consent and respond more effectively to reports of harassment.
Fraud prevention is another priority, with platforms expected to work with experts to identify and tackle scams quickly, creating a safer experience for all users.
What Powers Does Ofcom Hold To Enforce The Rules?
Ofcom has the authority to fine companies up to £18 million or 10% of their global revenue, whichever is greater, for failing to comply with the new safety rules. In severe cases, sites could be blocked in the UK.
The regulator has already begun working with tech companies to guide them through the requirements. Some platforms have taken action early, while others may face penalties if they fail to meet the standards on time.