The investigations will probe whether the platforms make it easy to report illegal content.
Ireland’s media regulator, Coimisiún na Meán (CnaM), will be investigating LinkedIn and TikTok under the EU Digital Services Act (DSA) over suspicions that the platforms’ content reporting mechanisms are not up to code.
This is not TikTok’s first brush with the DSA. The platform was found to have broken the law in a preliminary finding earlier this year.
The new investigations, announced yesterday (2 December) evening, will look into whether illegal content reporting mechanisms implemented by TikTok and LinkedIn are easy to access and user friendly, and whether the mechanisms allow users to anonymously report suspected child sexual abuse material (CSAM).
The investigations will also probe whether the content reporting mechanisms “deceive” users attempting to report potentially illegal material.
The investigations materialised after CnaM began reviewing a number of online platforms last year to check their compliance with Article 16 of the DSA.
Article 16 concerns the ‘Notice and Action’ mechanisms which service providers are required to have in place to allow people to report content that they suspect to be illegal. A part of the article also regulates reporting CSAM.
During its review, concerns arose about potential “dark patterns” – or deceptive interface designs – in illegal content reporting mechanisms, the media watchdog said. Specifically, the concern is that these mechanisms could deceive users into believing that they were reporting potentially illegal content, when they were in fact only reporting content that violated the platforms’ terms and conditions.
“At the core of the DSA is the right of people to report content that they suspect to be illegal, and the requirement on providers to have reporting mechanisms, that are easy to access and user-friendly, to report content considered to be illegal,” said John Evans, the digital services commissioner at Coimisiún na Meán.
“Providers are also obliged to not design, organise or operate their interfaces in a way which could deceive or manipulate people, or which materially distorts or impairs the ability of people to make informed decisions.”
SiliconRepublic.com has reached out to both platforms for comment.
In a statement to the press, a LinkedIn spokesperson said the company is committed to keeping its platform safe and has effective reporting mechanisms in place. TikTok, in a similar comment, also said it is committed to keeping its platform safe.
“We have received a notice of investigation today,” a TikTok spokesperson said, adding, “We will review it in full and engage with Coimisiún na Meán.”
A number of other service providers have made “significant changes” to their reporting mechanisms, the commissioner said. The watchdog is currently assessing their effectiveness and has not ruled out further regulatory actions, he added.
Just weeks ago, CnaM launched a DSA investigation into X over whether the Elon Musk-owned platform allows users to appeal its decisions not to remove content that they report as breaking the platform’s own policies.
Meanwhile, the EU is considering expanding on the DSA to potentially hold social media sites liable for financial frauds on their platforms.