Privacy & Cybersecurity #30
UK Finalizes Data Act | Italy Aligns ISO 27001 with NIST CSF | EDPS & AEPD on Federated Learning | CNIL Guidance on Pixels & Workplace Diversity | Canadian OPC Annual Report Released
🇬🇧 UK Passes Data Use and Access Bill
On 4 June 2025, the UK Parliament passed the Data (Use and Access) Bill (Bill 3825), a legislative instrument that creates a new statutory basis for data access in the UK. The Bill now awaits Royal Assent and is expected to come into force later this year.
The Bill establishes a framework to facilitate the controlled access, use, and sharing of data in secure environments, particularly for research and public interest purposes. It supports the UK’s National Data Strategy and is a cornerstone of the government’s plan to promote data-driven innovation while maintaining public trust.
Summary of Key Reforms
The Bill introduces targeted amendments to the UK GDPR and the Privacy and Electronic Communications Regulations (PECR), along with a new framework for lawful public-interest data use. Changes include:
Recognized Legitimate Interests: The Bill creates a statutory list of recognized legitimate interests, such as disclosure to a person carrying out a public interest task, protecting public security or defense, safeguarding national security, safeguarding vulnerable people, and detecting, investigating, or preventing crime. This removes the need for organizations to conduct a Legitimate Interests Assessment (LIA) when processing for these purposes.
Purpose Limitation Exceptions: New rules allow certain data repurposing, such as for public-interest archiving or scientific research, without going back to the data subject, provided specified safeguards are met.
Automated Decision-Making (ADM) Adjustments: The Bill narrows the scope of ADM restrictions while keeping limits on the use of sensitive data. Human involvement must be meaningful, and significant decisions affecting individuals require special justification.
Subject Access Requests (SARs) Streamlined: Controllers may now pause the response period while awaiting identity verification and are required to conduct only reasonable and proportionate searches.
Cross-Border Data Transfers: A new “data protection test” replaces the existing transfer regime under Chapter V of the UK GDPR, enabling a risk-based approach while aiming to preserve the UK’s EU adequacy decision, currently extended until 27 December 2025.
PECR: The Bill introduces consent exemptions for certain cookie uses (e.g., functionality, statistics, personalization) and extends the soft opt-in to charities.
AI and Copyright Transparency Provisions
The Bill does not contain direct obligations requiring AI model developers to disclose training data or the use of copyrighted content. These provisions, originally introduced in the Lords, were removed during the final stages of debate.
Instead, the Bill now requires the Secretary of State to introduce draft legislation within a specified timeframe. This forthcoming draft must include proposals to enhance transparency for copyright owners whose works may be used as training data inputs for AI models. The government must also report on:
the scale and impact of potential copyright infringement in AI model development;
enforcement mechanisms; and
AI systems trained outside the UK.
The government is expected to issue secondary legislation implementing the technical details of the framework, including criteria for accrediting secure data environments (SDEs) and rules governing access procedures.
🇮🇹 Italy Aligns ISO 27001 with NIST CSF in New Cybersecurity Framework
Italy’s standardization body UNI, with the support of the National Cybersecurity Agency (ACN), published UNI/PdR 174:2025, a new operational guideline. The framework defines a cybersecurity and information security management system that aligns ISO/IEC 27001 with the NIST Cybersecurity Framework (CSF) 2.0.
The publication provides a methodological bridge for organizations already certified under ISO/IEC 27001 to incorporate the controls and security measures required under the NIST CSF. This alignment also supports compliance with Italy's implementation of the NIS 2 Directive, particularly Articles 23 and 24 of the Italian NIS Decree, as reflected in ACN Determination No. 164179 (14 April 2025), which sets out Italy's national baseline security measures.
The full text of UNI/PdR 174:2025 is available for free download from the UNI Store.
🇪🇺 🇪🇸 EDPS and AEPD Issue Joint TechDispatch on Federated Learning
On 10 June 2025, the European Data Protection Supervisor (EDPS) and the Spanish Data Protection Authority (AEPD) jointly published a new issue of TechDispatch dedicated to Federated Learning (FL). The publication provides a structured overview of this privacy-enhancing machine learning technique, highlighting both its promise and its risks in relation to data protection under the EU legal framework.
What is Federated Learning?
Federated Learning (FL) is a decentralized approach to machine learning in which local devices (such as phones, wearables, or institutional servers) train models on-site using their own datasets. Instead of sharing raw data with a central server, only model parameters or updates (e.g., gradients, weights) are exchanged. This method aims to enhance privacy by keeping data local, reducing the need for large-scale data aggregation.
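To make the mechanics concrete, here is a minimal federated averaging sketch in Python; the linear model, synthetic client data, and hyperparameters are all illustrative, and real FL frameworks add secure transport, client selection, and aggregation protocols on top.

```python
# Minimal federated averaging (FedAvg) sketch. Illustrative only:
# a linear model trained with gradient descent on synthetic client data.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Train locally; the raw data (X, y) never leaves the client."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
global_w = np.zeros(3)
# Four simulated clients, each holding its own private dataset.
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]

for _ in range(10):  # federated rounds
    # Clients send back only updated parameters, never their data.
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    global_w = np.mean(local_ws, axis=0)  # the server averages the updates
```

The only artifacts crossing the network are the weight vectors, which is exactly what both the benefits and the risks discussed below turn on.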
Data Protection Benefits
The report outlines several potential advantages of FL over centralized ML from a personal data protection perspective:
Data Minimization: As raw personal data remains on local devices, the risk of large-scale breaches and unlawful data re-use is reduced.
Enhanced Accountability: Controllers may maintain better oversight of data processing activities on each device.
Sensitive Data Protection: Especially relevant in healthcare and other sectors where special categories of data are processed, FL can support compliant use without centralizing sensitive datasets.
Consent Management and Transparency: FL enables improved user control and auditability, supporting informed consent and local transparency.
Security: Avoiding centralized data storage lowers the risk of a single point of failure and can foster secure data collaboration among entities (e.g., hospitals, banks).
Challenges and Legal Considerations
The TechDispatch flags several areas where legal and technical risks remain:
Inference and Reconstruction Risks: Attackers may infer sensitive information from gradients or the resulting model, raising concerns about membership inference and model inversion attacks.
Uncertainty Around Anonymization: The shared model updates may still constitute personal data. A case-by-case analysis is required to assess whether information is anonymized.
Data Quality and Bias: The decentralized nature complicates efforts to detect and mitigate poor data quality or statistical bias.
Security of the Whole Ecosystem: Each node (device or organization) can become an entry point for poisoning or integrity attacks unless safeguarded through encryption, secure aggregation protocols, and trusted hardware environments (a sketch of secure aggregation follows this list).
Lack of Uniform Protection: In cross-device settings, individual users may lack the technical capability to secure devices adequately, unlike institutional actors in cross-silo FL.
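To illustrate the secure aggregation safeguard mentioned above, here is a simplified pairwise-masking sketch in Python; production protocols add key agreement, finite-field arithmetic, and dropout handling, so this is a toy model of the idea rather than a deployable scheme.

```python
# Sketch of pairwise-masking secure aggregation (simplified, for
# illustration only). Each pair of clients shares a random mask that
# one adds and the other subtracts, so masks cancel in the sum while
# hiding every individual update from the server.
import numpy as np

rng = np.random.default_rng(42)
n_clients, dim = 3, 4
updates = [rng.normal(size=dim) for _ in range(n_clients)]

# Pairwise shared masks (in practice derived via key agreement).
masks = {(i, j): rng.normal(size=dim)
         for i in range(n_clients) for j in range(i + 1, n_clients)}

masked = []
for i in range(n_clients):
    m = updates[i].copy()
    for (a, b), mask in masks.items():
        if a == i:
            m += mask   # lower-indexed client adds the shared mask
        elif b == i:
            m -= mask   # higher-indexed client subtracts it
    masked.append(m)

# The server sums the masked updates; the masks cancel exactly.
assert np.allclose(np.sum(masked, axis=0), np.sum(updates, axis=0))
```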
🇫🇷 CNIL Opens Public Consultation on Draft Recommendation for Tracking Pixels in Email
On 12 June 2025, the French data protection authority (CNIL) published a draft recommendation on the use of tracking pixels in email communications and launched a public consultation open until 10 August 2025. The document clarifies how Article 82 of the French Data Protection Act (transposing Article 5(3) of the ePrivacy Directive) applies to email tracking technologies and sets out operational guidance for ensuring compliance.
The draft recommendation targets tracking pixels, the small and often invisible images embedded in emails that reveal when and how messages are opened; a minimal example of the mechanism follows the list below. It applies to any entity using or enabling such trackers, including:
Email senders (data controllers), even when delivery is outsourced.
Emailing service providers (typically processors), unless they use data for their own purposes, in which case joint controllership may arise.
List rental services and pixel technology providers, whose role (processor or joint controller) depends on how the data is used.
Email clients or webmail providers are generally not considered controllers, unless they process the pixel data themselves.
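For readers unfamiliar with the underlying mechanism, the following Python sketch shows how a per-recipient tracking pixel is typically embedded in an email body; the tracking domain, token scheme, and function name are invented for illustration.

```python
# Hypothetical illustration of an email tracking pixel.
# The tracking domain and token scheme are invented for this example.
import uuid

def html_body_with_pixel(recipient_id: str) -> str:
    token = uuid.uuid4().hex  # unique per recipient: enables individual open tracking
    pixel_url = f"https://track.example.com/open?rid={recipient_id}&t={token}"
    return (
        "<html><body>"
        "<p>Hello, here is our June newsletter.</p>"
        # A 1x1 invisible image: when the mail client fetches it, the
        # sender learns the message was opened, when, and from where.
        f'<img src="{pixel_url}" width="1" height="1" style="display:none" alt="">'
        "</body></html>"
    )

print(html_body_with_pixel("subscriber-123"))
```

Because the unique token ties each image fetch to a named recipient, a pixel like this enables exactly the kind of individual open-rate tracking for which, as set out below, CNIL requires prior consent.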
CNIL distinguishes between cases that require prior consent and those that are exempt:
Consent Required:
Individual open rate analysis for campaign optimization.
Behavioral segmentation and personalization.
Profiling for retargeting across other media (web, apps, etc.).
Fraud detection based on anomalous email behavior.
Consent Not Required:
Security-related use for verifying user authentication (e.g., password reset emails).
Aggregate, non-individual open rate metrics, strictly for ensuring email deliverability—provided the pixels used are uniform and anonymous.
CNIL clarifies that the ePrivacy consent regime is separate from that governing the sending of emails themselves. Thus, even service or transactional emails (e.g., order confirmations) may require consent for pixels if the tracking is not strictly necessary.
Operational Recommendations:
Purposes must be clearly stated, with plain language labels and layered explanations. Consent should be purpose-specific or by coherent groups of purposes (e.g., all marketing-related).
Ideally, consent should be obtained at the time of collecting the email address. If not, a separate email without pixels must be sent to solicit consent.
The absence of user action must be interpreted as refusal. Consent must be as easy to refuse as to give.
A link enabling withdrawal of consent must be included in each email using tracking pixels.
This link should take the user to a web page where consent can be withdrawn without having to input their email address.
Controllers must maintain individualized evidence of consent and define contractually the responsibilities of third parties collecting it on their behalf.
Recommendations for Businesses
Review the use of tracking pixels in your email campaigns and identify whether individual metrics or profiling are involved.
If consent is required, ensure collection mechanisms comply with CNIL’s layered and granular guidance.
Update data processing agreements to reflect roles (controller, processor, or joint controller), and secure access to proof of consent from partners.
Monitor the final recommendation once adopted and assess implications for wider EU compliance under the ePrivacy Directive and GDPR.
The public consultation runs until 10 August 2025, after which CNIL will finalize and adopt the recommendation.
🇫🇷 CNIL Issues Guidance on Diversity Surveys in the Workplace
On 10 April 2025, the French data protection authority (CNIL) adopted a new recommendation on processing personal data in workplace diversity surveys (Délibération n° 2025-028). This guidance is aimed at private and public employers conducting self-administered surveys to measure workforce diversity in line with equality and non-discrimination efforts.
The document emphasizes strict safeguards to ensure such surveys comply with the GDPR and French constitutional principles, particularly the prohibition on collecting ethnic or racial data.
Key Requirements
Anonymization First. Wherever possible, surveys should be designed to collect fully anonymous data. This means removing not only direct identifiers (e.g., name, email) but also indirect combinations of attributes that could re-identify individuals; a sketch of such a check follows this list. Once data are anonymized, they no longer fall under GDPR, but participation must still be voluntary.
When Data Are Not Anonymous. If full anonymization is not feasible, GDPR applies. The following must be in place:
Legal Basis: The CNIL recommends using legitimate interest (GDPR Art. 6(1)(f)), provided the survey supports anti-discrimination goals and is proportionate.
Purpose Limitation: Data may only be used to identify collective trends and inform organizational policies, not to make individual decisions.
Voluntary Participation: Surveys must be opt-in without coercion or reward. Refusal to participate must not have adverse effects.
Clear Information: Participants must receive transparent notices explaining the controller's identity, purpose, legal basis, data recipients, retention periods, and rights under GDPR.
Data Minimization: Only data strictly necessary to assess diversity should be collected. Free-text fields should be limited due to the risk of excessive disclosure.
Sensitive Data: For processing sensitive data (e.g., perceived ethnicity, disability), explicit consent is required per GDPR Art. 9. Consent must be specific, informed, and freely given.
Avoiding Ethnic Classifications. French constitutional law prohibits collecting or categorizing data based on ethnic or racial origin. However, questions about geographic origin, nationality at birth, or perceived discrimination are permitted if framed in a non-racial and subjective way.
Use of a Trusted Third Party. Engaging a third party to administer the survey and aggregate results adds a layer of protection. The third party should not share identifiable data with the employer and must operate under a contract defining roles and responsibilities (as controller or processor).
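As a rough illustration of the re-identification risk flagged under "Anonymization First," the following Python sketch checks whether any combination of indirect attributes is shared by too few respondents to be safely released; the column names and the k threshold are illustrative, not prescribed by CNIL.

```python
# Rough quasi-identifier check before releasing survey results:
# any combination of indirect attributes shared by fewer than k
# respondents is a re-identification risk. Names and threshold
# are illustrative only.
from collections import Counter

def risky_groups(rows, keys=("department", "age_band", "seniority"), k=5):
    """Return attribute combinations appearing fewer than k times."""
    counts = Counter(tuple(row[key] for key in keys) for row in rows)
    return {combo: n for combo, n in counts.items() if n < k}

rows = [
    {"department": "legal", "age_band": "30-39", "seniority": "senior"},
    {"department": "legal", "age_band": "30-39", "seniority": "senior"},
    {"department": "IT", "age_band": "50-59", "seniority": "junior"},
]
print(risky_groups(rows))  # the unique IT/50-59/junior row is re-identifiable
```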
Retention and Impact Assessments
Raw survey data should be retained no longer than necessary—typically no more than six months after the survey closes.
Statistical outputs, once anonymized, may be retained for longitudinal analysis.
A Data Protection Impact Assessment (DPIA) is strongly recommended due to the potential risks to data subjects’ rights.
Recommendations for Businesses
Default to anonymous survey design where possible.
Consult your DPO and carry out a DPIA before launching the survey.
Ensure lawful bases, clear purposes, and effective information notices are in place.
Use structured, multiple-choice formats to avoid overcollection.
Avoid any attempt to infer or record ethnic identity.
Where sensitive data are collected, use explicit consent and ensure easy withdrawal.
Consider a trusted third party to handle collection and aggregation, especially in smaller organizations.
🇨🇦 Canada’s Privacy Commissioner Stresses Legislative Reform and AI Governance in 2024–2025 Annual Report
The Office of the Privacy Commissioner of Canada (OPC) published its 2024–2025 Annual Report to Parliament on 6 June 2025. The report reiterates the Commissioner's longstanding calls for legislative modernization and highlights key regulatory actions, particularly concerning artificial intelligence and children's privacy.
Commissioner Philippe Dufresne used the report to again urge Parliament to adopt Bill C-27, the Digital Charter Implementation Act, which would introduce the Consumer Privacy Protection Act (CPPA) and establish the new Artificial Intelligence and Data Act (AIDA). The Commissioner emphasized that current federal privacy laws are outdated and insufficient for addressing modern data risks, particularly those posed by AI systems and digital platforms.
The OPC proposes four strategic privacy priorities:
Protecting children’s privacy.
Enabling responsible AI innovation.
Supporting reconciliation with Indigenous peoples through respectful data governance.
Promoting privacy in the workplace.
Focus on AI Oversight and Algorithmic Transparency
The OPC advanced work on AI governance by:
Publishing draft guidance on algorithmic explainability, which sets expectations for transparency and fairness in automated decision-making.
Collaborating with international partners through the Global Privacy Assembly and G7 on AI oversight strategies.
Conducting consultations on the appropriate regulatory role for the OPC under the proposed AIDA.
In particular, the OPC highlighted the need for ex ante compliance mechanisms, not just enforcement after harms occur, to ensure AI systems uphold privacy rights by design.
Enforcement and Advice Activity
The OPC handled over 1,000 complaints and issued several decisions on social media practices, online advertising, and biometric data use. It also reviewed nearly 700 privacy breach reports. Notable files included:
A follow-up on facial recognition deployments by police.
An investigation into location tracking via mobile apps.
Continued guidance for federal institutions on privacy impact assessments (PIAs).
The OPC also published new or updated guidance on:
De-identification practices.
Children's privacy rights.
Consent and meaningful choice in digital services.
***
Direct your questions to groundcontrol@kepler.consulting.
Until the next transmission, stay secure and steady on course. Ground Control, out.