Berry Picks in IT Law #51

On the perils of defining, or rather classifying loosely, the different subheadings of IT Law. This week saw one reluctantly going from one heading to another whilst listing the updates. It’s another lesson on why not to judge a book by its cover, or heading, quite literally. They’re all in the realm of IT Law, after all.

🤖 Artificial Intelligence

The Bank of England is reported to be actively testing AI-related risks to the financial system through scenario analysis and simulations, with a particular focus on whether AI agents could produce herding behaviour in markets and intensify sell-offs during stress. This matters because it pushes the debate beyond abstract concern and into financial stability territory. The Bank is also clearly resisting the suggestion that it has taken a passive approach, insisting it is already analysing how AI investment and adoption are reshaping the system. At the same time, the wider institutional picture still looks uneven. Apparently, MPs remain frustrated that HM Treasury has not moved faster to bring major AI and cloud providers into the Critical Third Parties regime, which is meant to bring systemically important tech and cloud providers under direct financial oversight, leaving a familiar gap between recognising systemic dependence and actually regulating it. The regulatory response may still lag behind, but the Bank is already doing what it’s known for: stress-testing the system before it breaks.

🔏 Data Protection & Privacy

The EDPB adopted new Guidelines on processing personal data for scientific research, with the aim of making GDPR compliance more workable for researchers without relaxing the fundamentals. The Guidelines try to pin down the concept of “scientific research” through six indicative factors, confirm that further processing for scientific research is presumed compatible with the original purpose, and clarify issues such as broad consent, role allocation, and possible limits on rights like erasure and objection. Alongside this, the Board has set up a sprint team to finalise its anonymisation guidance and adopted two Europrivacy opinions, including one recognising certification as a transfer tool. The Guidelines are now open for public consultation until 25th June 2026, so the position is not final just yet. The overall direction, however, is quite clear. The EDPB wants research governance to look more usable in practice, while still keeping data protection firmly in the frame.

🛒 E-Commerce & Digital Consumer

The European Commission sent Meta a Supplementary Statement of Objections indicating that it may impose interim measures requiring the company to restore third-party AI assistants’ access to WhatsApp under the same conditions as before its October 2025 policy change. The Commission’s preliminary view is that Meta’s shift from an outright ban to a paid access model does not meaningfully alter the competitive outcome, as it may still deter or prevent rivals from entering or expanding in the emerging AI assistant market, particularly where Meta itself offers a competing service. What is notable here is the use of interim measures under Article 8 of Regulation 1/2003, which allows the Commission to intervene before reaching a final infringement decision where there is a risk of serious and irreparable harm to competition. In effect, the case is less about formal access rules and more about whether control over a key interface like WhatsApp can be used to shape who participates in the next layer of AI-driven services.

The European Commission proposed compliance measures for Google under the Digital Markets Act, focusing on one of the most strategically important inputs in digital markets: search data. In essence, the Commission wants Google to give rival search engines access to ranking, query, click and view data on fair, reasonable and non-discriminatory terms, and it is explicitly considering AI chatbots with search functions among the potential beneficiaries. That last point is doing a great deal of work. This is no longer just about classic search competition, but about who gets access to the raw material needed to build the next generation of search and AI services. The Commission is therefore not simply policing conduct after the fact; it is trying to specify the conditions under which contestability can exist in practice. The consultation is open for contributions until 1st May, so go nuts.

An inquiry into the Southport mass stabbing in the UK is now reportedly turning to the role of tech platforms, not because they caused the attack, but because of how they shaped the surrounding environment. The report points to the perpetrator’s exposure to violent content online, the ease with which age restrictions could be bypassed, and the failure of safeguards to meaningfully limit access to harmful material. It also highlights how, in the aftermath, misinformation spread rapidly across platforms, contributing to unrest and real-world harm. What matters here is the underlying logic. The inquiry is not primarily concerned with isolated pieces of illegal content, but with the systems that make such content accessible, recommendable, and amplifiable. Weak age verification, ineffective filtering, and algorithmic promotion all become part of the analysis. In that sense, the focus on platforms reflects a broader regulatory shift toward systemic risk, where the question is not just what content exists, but how platform design choices shape exposure and scale harm.

👩🏼‍🎨 Intellectual Property

The CJEU confirmed that offline streaming copies do not fall within the private copying exception where the streaming provider, rather than the user, makes the copy on the user’s device and the rightholder retains technical control over it. The Court’s logic is fairly straightforward. This is not really a case about a user freely making and keeping a private copy, but about a provider making a controlled offline version available within a closed service environment. The user cannot move it, transfer it, reproduce it outside the service, and access can be blocked or withdrawn by the rightholder. In those circumstances, the copy looks less like private copying and more like another form of making available to the public under Article 3 InfoSoc. The Court also emphasised that where such copies remain subject to technological protection measures and the rightholder retains control over authorisation, the kind of harm that justifies fair compensation does not arise in the usual way. This matters because it blocks attempts to treat these files as levy-relevant private copies and helps avoid a situation in which rightholders are remunerated both through platform licensing and through private copying levies on devices. The judgment is therefore quite helpful in drawing a firmer line between genuinely private reproductions and tightly controlled platform-based access models. Not every copy on a user’s device is a “private copy,” especially where the platform never really lets go.

💻 Tech in the Wild

The European Commission’s new age verification app can apparently be compromised in a matter of minutes, with researchers pointing to fairly basic weaknesses in how user credentials such as PINs are stored and managed. In practice, this could allow an attacker to take over a user profile without much effort. The issue is therefore not some exotic technical flaw, but rather ordinary implementation weakness in a tool that is meant to serve as regulatory infrastructure. That is what makes the story more interesting than a standard security mishap. Once age assurance is built into the architecture of compliance, the verification tool itself becomes part of the enforcement model. Any weakness is no longer just a product flaw, it becomes a regulatory vulnerability. It is a fairly neat reminder that regulation by design shifts not only control upstream, but risk as well.

📄 Recommended Readings

Here’s a couple of recent publications, in no particular order, that piqued my interest this week. Remember to grab a cuppa and settle in for some riveting reading.

Unveiling transparency in data protection enforcement across the EU: Assessing the level and quality of disclosure of GDPR fines by data protection authorities by Pablo Marcello Baquero, Aluna Wang & David Restrepo Amariles

Antitrust rules and remedies against platforms’ treacherous turns by Friso Bostoen & Nicolas Petit

*Disclaimer: I am in no way affiliated with the authors or publishers in sharing these, and do not necessarily agree with the views contained within.

If you’d like to see more recommended readings, visit the collection here.

So there you have it, folks, another week in the fascinating realm of IT Law. Remember to pop back next week for your latest dose of legal updates.

Featured image generated using DALL·E 3.

Sena Kontoğlu Taştan

IT law enthusiast and researcher.
