It’s officially been a year of this (grand gesture). Though, really, it’s been longer, and that is the philosophical debate of the day. In a surprising turn of events, this week’s data protection picks have upped their game; bittersweet, to say the least. So, hi, again.

🤖 Artificial Intelligence
AI legislation is reportedly well underway across the United States, just not in the form of one grand federal AI Act. The action is spreading through state legislatures in a very familiar pattern: chatbot safety, children’s protection, deepfakes, health insurance decisions, workplace surveillance, AI personhood, and disclosure duties. California is, predictably, doing California things with a small mountain of bills, while states like Tennessee, Nebraska, Maryland, Idaho, Washington and Utah are already moving concrete measures through or into law. What is interesting is how sectoral the whole picture has become. Legislatures are not really regulating “AI” in the abstract; they are regulating AI where it touches recognisable legal anxieties: children, healthcare, employment, elections, intimate images and consumer pricing. In other words, the US may not have one AI Act, but it is quietly building many smaller ones.
🪁 Children’s Rights in Cyberspace
Ofcom launched investigations under the Online Safety Act 2023 into Telegram, as well as teen chat platforms Teen Chat and Chat Avenue, over concerns relating to child sexual abuse material (CSAM) and online grooming risks. The Telegram probe follows evidence suggesting the presence and sharing of CSAM on the platform, while the investigations into the chat services focus on whether they have adequately assessed and mitigated risks to children, particularly in environments with open chatrooms and private messaging. At the same time, Ofcom highlighted some compliance movement among file-sharing services, including the adoption of hash-matching technologies and, in some cases, withdrawal from the UK market altogether. The investigations form part of Ofcom’s broader enforcement approach, which can ultimately lead to significant fines or even service disruption measures where providers fail to meet their statutory duties. Not quite a crackdown yet, perhaps, but certainly no longer a gentle nudge.
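The hash-matching technologies mentioned above work by comparing a digest of each uploaded file against a database of digests of known illegal material. As a minimal sketch only: real deployments rely on perceptual hashes (such as PhotoDNA) drawn from vetted databases so that near-duplicates also match, whereas this illustration uses an exact cryptographic hash, and the sample blocklist is entirely hypothetical.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known prohibited files.
# In practice, providers use perceptual hashes from vetted industry
# databases; exact hashing is only the simplest illustration of the
# matching step itself.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-sample").hexdigest(),
}

def matches_known_material(file_bytes: bytes) -> bool:
    """Return True if the file's digest appears in the hash database."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES
```

The point of the design is that the platform never needs to “look at” the content interpretively; it only checks whether a fingerprint matches previously verified material.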
🔏 Data Protection & Privacy
The German Federal Court of Justice is reportedly considering whether secretly filming family members in one’s own kitchen falls outside the GDPR under the household exemption. The facts are wonderfully domestic and legally messy: a daughter and son-in-law filmed their downstairs kitchen, which the mother was allowed to enter, and later shared footage with the police and another daughter in connection with suspected theft. The lower court treated this as private household activity, but the Court appears less convinced, especially because the purpose may have gone beyond ordinary domestic monitoring and into evidence-gathering for a criminal complaint. A referral to the CJEU is apparently being seriously considered. So the question is not just whether you can install cameras at home, but when “home” stops being purely private for data protection purposes. Regulation, it seems, follows you all the way into the kitchen.

The French data protection authority (CNIL) has updated its recommendation on electronic remote voting systems following a public consultation, aiming to provide a clearer and more operational framework while maintaining high standards of security, confidentiality and integrity of the vote. The revised guidance keeps its risk-based structure but refines how risk levels are assessed, integrates a self-assessment tool directly into the recommendation, and places greater emphasis on transparency, including publishing technical specifications and, for high-risk elections, parts of the voting software. It also adopts a more technologically neutral approach, focusing on outcomes rather than specific tools, and adjusts requirements around independent audits depending on the sensitivity of the election. The new framework will apply to future elections, with a transitional period allowing ongoing 2026 processes to rely on the previous version. Flexibility, in other words, but only the kind that comes with a fairly thick compliance manual.
🛒 E-Commerce & Digital Consumer
Starting 19 June 2026, Germany will reportedly require a visible and continuously accessible “Widerrufsbutton” – an online withdrawal button for online B2C distance contracts – following the implementation of Directive (EU) 2023/2673 into § 356a BGB. The rule effectively operationalises the right of withdrawal by embedding it directly into the user interface: consumers must be able to withdraw from contracts via a clearly labelled, prominent function, without friction such as mandatory login or additional steps. The provision also standardises the process with input fields, a confirmation step and immediate receipt confirmation on a durable medium, and clarifies that withdrawal is timely once submitted through the function within the withdrawal period. Importantly, the obligation extends beyond a trader’s own website to third-party platforms where the trader remains the contractual counterparty, requiring contractual alignment with platform operators. From a data protection perspective, the collection of personal data through the withdrawal function is anchored in legal obligation, but still triggers familiar compliance duties such as data minimisation, transparency and security. Talk about regulation by design.
🐆 AI in the Wild
Meta is reportedly rolling out an internal tool (MCI) to capture employee screen activity, including keystrokes and mouse movements across platforms like Google, LinkedIn, and Wikipedia, to train its AI models. The rationale is straightforward: if AI agents are meant to replicate real user workflows, they need granular behavioural data on how people actually use computers. The project, however, has raised internal concerns about privacy, proportionality, and the potential capture of sensitive information, even if Meta insists safeguards are in place. What looks like a technical training exercise therefore also starts to raise familiar questions about workplace surveillance, data minimisation, and the limits of “functional necessity” in AI development. At some point, the distinction between using a system and being used by it begins to blur.
📄 Recommended Readings
Here’s a couple of recent publications, in no particular order, that piqued my interest this week. Remember to grab a cuppa and settle in for some riveting reading.
Data scraping for scientific research purposes: legal bases under the GDPR by Roxanne Meilak Borg & Mireille M. Sant
Anti-piracy enforcements and innovation quality by Dyuti S. Banerjee & Sougata Poddar
*Disclaimer: I am in no way affiliated with the authors or publishers in sharing these, and do not necessarily agree with the views contained within.
If you’d like to see more recommended readings, visit the collection here.
So there you have it, folks, another week in the fascinating realm of IT Law. Remember to pop back next week for your latest dose of legal updates.

Featured image generated using DALL·E 3.