Sunday, December 22, 2024

Tech giants face large fines if they ignore new safety rules – Irish watchdog

Social media companies must comply with a new online safety code or face fines of up to 20 million euros, or 10% of turnover if greater, Ireland’s media regulator has said.

The independent body Coimisiun na Mean published an Online Safety Code for 10 video-sharing platforms that are based in Ireland, including Facebook, Instagram, YouTube, TikTok and X.

The code requires the tech giants to protect children against content that could impair their physical, mental or moral development, such as cyberbullying, content promoting eating disorders, or posts that incite hatred or violence against people with protected characteristics.

Online Safety Commissioner Niamh Hodnett said that it marked the end of the era of self-regulation of social media.

She said: “We really want to see a step change in how the platforms have been behaving.

“We want to see clear evidence of behavioural change.

“For too long, people have felt that the online world is the Wild West or there are no effective measures in place, and today marks the end of the era of self-regulation, and there’s now an effective statutory regime in place.

“We’ll be overseeing the platforms to ensure that they’re complying with those obligations, and we can hold them to account where they don’t.”

Ms Hodnett said that if platforms do not comply with the code, most of which comes into force from November, they will face fines of up to 10% of turnover or 20 million euros, whichever is the greater.

In relation to age verification, she said self-declaration alone was “not appropriate” but said the regulator was not “mandating” what method should be used instead.

The platforms will also need to provide more “granular” details to the regulator under Part B of the code, but they will have until July 2025 to put these technical solutions in place.

John Evans, the Digital Services Commissioner from Coimisiun na Mean, speaking at the launch of the Online Safety Code (Niall Carson/PA)

Digital Services Commissioner John Evans said the regulator has tools to track whether platforms are complying with the code, and can act on complaints as well as take its own actions.

But he added: “Coimisiun na Mean is not a content regulator… we’re not a censor.

“Our role is to make sure that the platforms operate the mechanisms that I was talking about, so for example, flagging complaints mechanisms.”

When asked what a successful outcome of the code would be, Mr Evans said more people reporting concerns would be “an important part”.

“Say, in 18 months to two years, what would be really great is if we can see people are reporting, there are data points arising from dispute settlement and so on.

“All of that gives us key information on how things are working, systemic information, so we could spot where the risks are and where the failings are.

“Using reporting mechanisms, yes, is an important part of it.”

Asked what he would say to social media users who have no confidence in reporting posts on social media, Mr Evans asked people to keep reporting problems and to get in touch with Coimisiun na Mean if they are unhappy with the outcome.

“This is a problem.

“We do hear a lot about reporting fatigue, and we ask people to try and get over that.

“There is value in reporting.

“It creates data points for us to then go back and look at whether the system is working well.

“If you’re unhappy, when you flag something, with the response you get, you can come to Coimisiun na Mean and raise a complaint.

“We will then look at the pattern of these complaints and decide whether or not there are issues there we might address through supervision or escalate through enforcement.”

Ms Hodnett said the code was created with engagement from all of the platforms, but added that “engagement is a very different matter to compliance” and that they would be overseeing how platforms act on the code.