Berry Picks in IT Law #26

Hello and welcome to the last quarter of the year that seems to have gone on forever. The spotlight this week is particularly important, so let’s keep the pleasantries short and sweet, shall we?

🔦 Spotlight

29 civil society organisations, including Access Now and ARTICLE 19, urge EU Commissioner Thierry Breton to respect due process when enforcing the DSA. The open letter reads: “Precise interpretation of the DSA matters especially when people’s lives are at risk in Gaza and Israel.” As last week’s readers may recall, the EC had formally issued requests for information to Meta, X and YouTube under the DSA over concerns about the spread of illegal content. Lawyers and activists have, quite rightly, opposed such an intervention, and civil society organisations have now come together to address these concerns in an open letter to the Commissioner:

1. Disinformation vs. Illegal Content: The letters appear to conflate “disinformation” with strictly illegal content. It is crucial to note that not all disinformation falls under the category of illegality. The DSA distinguishes between the two, offering targeted measures for illegal content while taking a more nuanced approach to broader issues like disinformation, emphasizing transparency and due diligence for VLOPs. The European Commission should adhere strictly to the DSA’s provisions and ensure these distinct categories remain separate.
2. Timeliness, Not Deadlines: The DSA does not prescribe specific timelines, such as a 24-hour window, for addressing content concerns. Instead, it calls for a timely, objective, and non-arbitrary response from providers. Public assertions of a 24-hour response time by the Commissioner or their team not only misrepresent the DSA but might also affect the credibility of the DSA Enforcement Team.
3. Service Provider Autonomy: The DSA emphasizes that service providers should operate diligently, objectively, and proportionately. It does not mandate that they “consistently and diligently enforce [their] own policies”. It is important to recognize that enforcing policies too rigorously, especially under state pressure, can lead to undue censorship and potentially sideline legitimate content.
4. Reporting Mechanisms: While the DSA obligates service providers to notify authorities about potential criminal offences, it does not dictate a specific time frame, like the aforementioned 24 hours. The letters’ push for immediate communication with law enforcement and Europol appears to lack a clear legal basis, specifically in terms of defining the nature and gravity of the crimes in question.

“Freedom of expression and the free flow of information must be vigorously defended during armed conflicts. Disproportionate restrictions of fundamental rights may distort information that is vital for the needs of civilians caught up in the hostilities and for recording documentation of ongoing human rights abuses and atrocities that could form the basis for evidence in future judicial proceedings. Experience shows that shortsighted solutions that hint at the criminal nature of “false information” or “fake news” — without further qualification — will disproportionately affect historically oppressed groups and human rights defenders fighting against aggressors perpetrating gross human rights abuses.

We would like to reiterate our support for a robust enforcement of the DSA. But that enforcement must always follow due process as prescribed by law.”

🤖 Artificial Intelligence

The World Health Organisation published considerations for the regulation of artificial intelligence for health. The report offers an overview of regulatory considerations on AI for health, focusing on six areas: documentation and transparency, the product lifecycle and risk management, intended use and validation, data quality, privacy and data protection, and stakeholder engagement and collaboration. The working group behind the report stresses the rapid evolution of the AI landscape and the need for continuous engagement, shared understanding, and mutual learning among stakeholders. So, keeping the health-guru AI party going sounds reasonable.

US Senators proposed the NO FAKES Act. The quirky acronym, the type of thing we like to see here on this blog, stands for the “Nurture Originals, Foster Art, and Keep Entertainment Safe Act”, which sounds like a slogan at a suburban mothers’ rally, no judgement. The act would define and protect digital replicas – a newly created, computer-generated, electronic representation of the image, voice, or visual likeness of an individual that is nearly indistinguishable from the actual image, voice, or visual likeness of that individual, and that is fixed in a sound recording or audiovisual work in which that individual did not actually perform or appear. So essentially, deepfakes. It would also create a digital replication right – the right of the individual in question to authorise the use of their image, voice or visual likeness. Thinking of venturing into creating a computer-simulated version of someone’s voice or appearance? Tread carefully. The legislation includes liability for individuals or companies that produce digital replicas without authorisation. Platforms also get their share of liability for hosting, unless they have no knowledge of the unauthorised production. So, standard platform liability? Creative.

🔐 Cybersecurity

ENISA published its Threat Landscape 2023. The report identifies the top threats, major trends observed with respect to threats, threat actors and attack techniques, as well as impact and motivation analysis. Think of it as a “who’s who” of digital nasties. Over 2022 and into 2023, there was a noticeable spike in both the diversity and number of cyberattacks, exacerbated by events such as the conflict in Ukraine. Prominent threats identified include old friends: ransomware, malware, social engineering, threats to data, and various challenges related to system availability. Current observations indicate that DDoS and ransomware are the prevailing cybersecurity concerns. There is a noticeable rise in malicious cyber activity, particularly through advanced ‘as-a-Service’ offerings. Public institutions have emerged as the principal targets, and, unsurprisingly, the incorporation of Artificial Intelligence in social engineering tactics is a burgeoning trend in these adversarial endeavours. Juicy.

🔏 Data Protection & Privacy

The EDPB and the EDPS published a joint opinion on the Digital Euro Regulation Proposal. Embarking on the third year since folks at the European Central Bank first flirted with the idea of a digital euro, the EU is now diving into the fine print of its establishment. The opinion raises concerns about clear guidelines on distribution, data protection in connection with unique identifiers, and clarity on personal data processing. The two bodies also recommend including references to the cybersecurity legal framework and advocate for selective privacy for low-value online transactions.

🛒 E-Commerce & Digital Consumer

The European Commission adopted the Delegated Regulation on independent audits to assess compliance of VLOPs and VLOSEs under the Digital Services Act. Under the DSA, VLOPs and VLOSEs are required to undergo annual independent audits to assess their compliance with DSA obligations. These rules provide guidelines for verifying the independence and competence of auditors and highlight the principles auditors should adhere to during audits. It is the DSA’s way of saying, “Here’s how to pick your auditor (no, not your best friend) and what they should be looking out for.” And to keep things neat and tidy, there are fancy templates for the audits and reports, so everyone’s singing from the same song sheet. Services designated in April 2023 will have until August 2024 to complete their first audit. Meanwhile, the rules are with the European Parliament and the Council, and if they don’t send them back with red marks within three months, it’s a go.

📄 Recommended Readings

Here are a couple of recent publications, in no particular order, that piqued my interest this week. Remember to grab a cuppa and settle in for some riveting reading.

When the Digital Services Act Goes Global by Anupam Chander

EU Copyright Directive: A ‘Nightmare’ for Generative AI Researchers and Developers? by Theodoros Karathanasis

Disclaimer: I am in no way affiliated with the authors or publishers in sharing these, and do not necessarily agree with the views contained within. I try to include mostly open access publications due to, well, you know, accessibility of knowledge and science.

If you have any thoughts or suggestions on how to make this digest more informative and enjoyable, feel free to drop a line. Your feedback is always welcome!

Featured image generated using DALL·E 3.

Sena Kontoğlu Taştan

IT law enthusiast and researcher.
