Ireland adopts online safety rules for video sharing platforms, including TikTok

Ireland’s media and internet watchdog, Coimisiún na Meán, has adopted and published an Online Safety Code that will apply to video sharing platforms headquartered in the country from next month – including the likes of ByteDance’s TikTok, Google-owned YouTube, and Meta’s Instagram and Facebook Reels.

Under the Code, in-scope platforms are required to have terms and conditions that prohibit the uploading or sharing of a range of harmful content types – including cyberbullying, the promotion of self-harm or suicide, and the promotion of eating or feeding disorders – as well as content that incites hatred or violence, terrorism, child sexual abuse material (CSAM), racism and xenophobia.

Coimisiún na Meán spokesperson Adam Hurley confirmed that the aim of the code is to address types of content that do not directly fall within the scope of the EU Digital Services Act (DSA).

The latter, a pan-EU law, has applied broadly since mid-February and focuses on governing illegal online content (such as CSAM), rather than the wider set of harms the Coimisiún na Meán’s Code is aimed at.

“One of the ideas behind the Online Safety Code is that it deals with the most harmful, rather than illegal, content,” Hurley told us, adding: “What we did was broaden the scope to include harmful content that they should block from being uploaded, and then act on reports of violations of these terms and conditions.”

“It’s a ban on uploading in their terms and conditions. So they have to block the upload of those types of content under their own terms and conditions, and then they’ll have to enforce those terms and conditions,” he added.

The rules will apply directly only to video services provided to users in Ireland, including those of several major social media platforms that fall under the regulator’s jurisdiction because their regional headquarters are in the country. However, tech companies may choose to apply the same measures across the rest of the region to simplify compliance and avoid awkward questions about inconsistencies in content standards.

Notice and takedown

Another element worth noting here is that EU law prohibits imposing a general monitoring obligation on platforms, so Ireland’s Online Safety Code will not require platforms to deploy upload filters, according to Hurley. Instead, he emphasized, it is essentially an expansion of the existing notice and takedown approach – also allowing users to report harmful content and expect platforms to remove it.

As with the DSA, the Code requires platforms to provide ways for people to report the types of harmful content mentioned above, so that platforms can act on those reports in line with their terms and conditions.

Age assurance for porn

Among other requirements, the Code stipulates that video sites which allow pornographic content or gratuitous violence under their terms and conditions must implement “appropriate” age assurance (or age verification) in a bid to ensure minors do not access inappropriate content.

Hurley said there are no pre-approved age assurance techniques per se; rather, the regulator will assess what is appropriate on a case-by-case basis.

The Code also requires video sharing platforms that carry such content to provide user-friendly content classification systems.

Platforms must also provide parental controls for any content that may “impair the physical, mental or moral development of children under the age of 16,” as stated in the press release from Coimisiún na Meán.

Recommendation systems

In terms of recommendation systems, the Irish regulator previously considered requiring video-sharing platforms to turn off profile-based content recommendations by default as a safety measure – which could have led to a scenario where TikTok was forced to turn off its algorithm by default.

However, following a consultation last year, the Coimisiún na Meán spokesperson confirmed that the measure did not make it into the final Code. “It was considered as a potential addition (to the Code) but we’ve come to the position that the best way of dealing with recommender systems – the potential harms of recommender systems – is through the (EU) Digital Services Act,” he told TechCrunch.

We had asked the regulator how the rules would mitigate harms driven by algorithmic amplification, another of its stated goals.

The final Code forms part of Ireland’s broader online safety framework, which aims to make digital services accountable for protecting users from online harms, and which sits under the country’s Online Safety and Media Regulation Act.

The EU’s DSA applies throughout the bloc, so it is also in force in Ireland, where Coimisiún na Meán is responsible for enforcing the regulation’s general rules on any in-scope companies based locally – as well as for overseeing the new Online Safety Code.

Commenting, Ireland’s Online Safety Commissioner, Niamh Hodnett, said: “The adoption of the Online Safety Code brings an end to the era of social media self-regulation. The Code sets out binding rules that video sharing platforms must follow in order to reduce the harm they can cause to users. We will work to make sure people know their rights when they go online, and we will hold platforms to account and take action when platforms do not meet their obligations.”

In another supporting statement, Coimisiún na Meán executive chairperson Jeremy Godfrey added: “With the adoption of the Online Safety Code, all elements of our online safety framework are now in place. Our focus now is on fully implementing the framework and bringing about positive changes in people’s lives online.

“Our message to people is clear: if you come across something that you believe is illegal or against a platform’s rules about what it allows, you should report it directly to the platform. Our Contact Centre is available to provide advice and guidance if people need help.”

Concerns about children’s safety have driven a growing number of online safety initiatives on both sides of the Atlantic in recent years. These include the UK’s Online Safety Act (which became law just over a year ago) and the Age Appropriate Design Code (which came into force in the UK in the fall of 2021). In the United States, a child-safety-focused bill, the Kids Online Safety Act (KOSA), is also in the works; it was first proposed back in 2022.
