Privacy & Cybersecurity #34
Minnesota Privacy Act | EU Code for GPAI | GDPR Simplification | CNIL TIA Guidance | ICO Updates on Ads & Cookies | UK Chronic Risks Review | Polish UODO on Data Anonymization | AEPD Fine Over Recruitment Data
🇺🇸 Minnesota Consumer Data Privacy Act Effective July 31
The Minnesota Consumer Data Privacy Act (MCDPA) will take effect on July 31, 2025. The Act establishes a framework for consumer privacy rights and imposes new obligations on businesses operating in the state. Postsecondary institutions and nonprofit corporations have a deferred compliance date of July 31, 2029.
The MCDPA applies to entities conducting business in Minnesota or targeting products or services to Minnesota residents, if they meet either of the following thresholds (sketched in code after this list):
Control or process personal data of at least 100,000 consumers annually (excluding data processed solely for payment transactions); or
Derive over 25% of gross revenue from selling personal data and process or control personal data of at least 25,000 consumers.
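To illustrate how these thresholds combine, here is a minimal sketch in TypeScript; the structure and field names are ours, not the statute's, and real applicability analysis belongs with counsel.

```typescript
// Illustrative applicability check for the MCDPA thresholds described above.
interface MinnesotaFootprint {
  consumersProcessed: number;        // annual count, excluding payment-only data
  revenueShareFromDataSales: number; // fraction of gross revenue, 0.0 to 1.0
}

function mcdpaApplies(f: MinnesotaFootprint): boolean {
  // Threshold 1: personal data of 100,000+ consumers controlled or processed.
  if (f.consumersProcessed >= 100_000) return true;
  // Threshold 2: over 25% of gross revenue from selling personal data,
  // combined with personal data of 25,000+ consumers.
  return f.revenueShareFromDataSales > 0.25 && f.consumersProcessed >= 25_000;
}
```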
Small businesses as defined by the U.S. Small Business Administration are exempt from most provisions, but they must not sell sensitive data without consumer consent.
The law includes exemptions for entities regulated under HIPAA, the GLBA, the Fair Credit Reporting Act, and other sector-specific laws.
Consumers are granted extensive rights under the MCDPA, including:
Right to access personal data processed about them.
Right to correct inaccuracies in their personal data.
Right to delete personal data concerning them.
Right to data portability to obtain their personal data in a usable format.
Right to opt out of:
Targeted advertising
Sale of personal data
Profiling in furtherance of decisions producing legal or similarly significant effects.
Controllers must provide mechanisms to facilitate these rights, including recognizing universal opt-out signals sent by browsers or other platforms, so that consumers can exercise opt-out rights easily.
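One widely deployed universal opt-out signal is the Global Privacy Control (GPC), sent by supporting browsers as the HTTP header Sec-GPC: 1 and exposed to scripts via navigator.globalPrivacyControl. The MCDPA does not prescribe a specific mechanism, so the following is an illustrative sketch of honoring such a signal rather than a statement of what the statute requires:

```typescript
// Sketch: detecting the Global Privacy Control (GPC) signal.
// Server side: GPC-enabled browsers send the HTTP header "Sec-GPC: 1".
function gpcOptOut(headers: Record<string, string | undefined>): boolean {
  return headers["sec-gpc"] === "1";
}

// Client side: the same preference is exposed on the Navigator object.
const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
if (nav.globalPrivacyControl === true) {
  // Treat this visitor as opted out of targeted advertising and the
  // sale of personal data.
}
```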
Consumers have the right to appeal refusals to act on their requests, and controllers must respond to appeals within 45 days, extendable by an additional 60 days if necessary.
Businesses subject to the MCDPA will face operational and compliance duties, including:
Data Minimization and Purpose Limitation. Controllers must collect only data adequate, relevant, and reasonably necessary for disclosed purposes.
Consent for Sensitive Data. Explicit consent is required for processing sensitive data, including data revealing race, religion, health conditions, sexual orientation, and precise geolocation.
Privacy Notices. Controllers must provide accessible, clear privacy notices detailing:
Categories of personal data collected.
Purposes of processing.
Consumer rights and how to exercise them.
Categories of third parties with whom data is shared.
Data retention policies.
Privacy notices must be updated to reflect material changes and provided in each language in which the business offers products or services.
Contracts with Processors. The Act mandates specific contractual provisions with processors, ensuring confidentiality, security measures, and audit rights.
Data Security. Controllers must implement reasonable administrative, technical, and physical safeguards, proportionate to the volume and nature of data.
Data Privacy Assessments. Controllers must document data privacy and protection assessments for activities presenting heightened risk, including:
Targeted advertising
Sale of personal data
Processing of sensitive data
Profiling with significant effects
These assessments must balance benefits against risks to consumers and be available to the Attorney General upon request.
The MCDPA does not provide a private right of action. Enforcement lies exclusively with the Minnesota Attorney General, who may issue warning letters and impose civil penalties of up to $7,500 per violation.
A 30-day cure period applies until January 31, 2026, after which immediate enforcement may proceed.
Recommendations for Businesses
Conduct data mapping to identify personal and sensitive data processed in Minnesota.
Update or draft privacy notices to ensure they comply with MCDPA’s detailed requirements.
Review and revise contracts with processors to incorporate required terms.
Implement mechanisms for responding to consumer rights requests, including recognizing universal opt-out signals.
Develop data privacy and protection assessment procedures for high-risk processing activities.
Evaluate data security measures to ensure they are appropriate and documented.
Prepare staff and systems for the July 31, 2025 compliance deadline—or 2029, if operating as a nonprofit or postsecondary institution.
🇪🇺 European Commission Receives Final Code of Practice for General-Purpose AI
The European Commission has announced receipt of the final version of the voluntary Code of Practice for General-Purpose AI (GPAI), a milestone in the EU’s policy framework ahead of the AI Act’s application. The Code, developed through the Codes of Practice process established by the AI Act (Article 56), is intended as a tool for GPAI providers and deployers to implement key obligations of the AI Act in advance of legal requirements coming into force.
The Code targets providers and deployers of GPAI models and systems, especially those likely to be classified as GPAI models with systemic risk under the AI Act due to their significant impact. It was drafted collaboratively by GPAI developers, downstream deployers, civil society organizations, and academic experts, in a process facilitated by the Commission.
The final Code of Practice sets out voluntary commitments and practical measures for GPAI actors, including:
Risk Management and Safety: Processes to assess systemic risks associated with GPAI models and mitigation measures.
Cybersecurity: Practices to prevent and address cybersecurity vulnerabilities in GPAI systems.
Responsible Deployment: Guidelines for transparency towards downstream deployers and users.
Content Provenance: Commitments to adopt standards for watermarking and other methods of identifying AI-generated content.
Incident Reporting: Voluntary reporting channels for incidents linked to GPAI model misuse or unexpected behaviors.
Research Cooperation: Sharing information with researchers and authorities about GPAI system capabilities, limitations, and evaluation methods.
The Code is not legally binding but is designed to serve as an interim governance tool until the AI Act is fully enforceable. Notably, the AI Act foresees approved codes of practice serving as a recognized means for GPAI providers to demonstrate compliance with their obligations.
This development reflects the EU’s regulatory focus on GPAI models, given their potential for widespread deployment and cross-sectoral impact. The Commission’s Digital Strategy notes that these models can “impact the functioning of the internal market” and pose unique governance challenges. The Code aligns with the AI Act’s risk-based framework and anticipates obligations around transparency, risk management, and cooperation with authorities.
The European Commission plans further engagement and updates regarding the practical use of the GPAI Code of Practice as part of the broader AI regulatory ecosystem.
🇪🇺 EU Proposes Simplified Record-Keeping for SMEs and Small Mid-Caps under GDPR
The European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) have issued a Joint Opinion on the European Commission’s proposed regulation to simplify certain GDPR obligations for small and medium-sized enterprises (SMEs) and newly-defined small mid-cap enterprises (SMCs).
The proposal, published on 21 May 2025, would revise Article 30(5) GDPR to raise the threshold for the record-keeping exemption from fewer than 250 employees to fewer than 750 employees, provided the processing is not likely to result in a high risk to individuals’ rights and freedoms under Article 35 GDPR. Key elements of the proposal include:
New Threshold: Enterprises and organizations employing fewer than 750 persons would no longer be automatically required to maintain records of processing activities unless their processing is likely to result in high risks. This is a significant shift from the current 250-employee threshold.
High-Risk Processing Exception: Despite the broader exemption, all organizations—regardless of size—must still maintain records if their processing activities are likely to result in a high risk. This aligns the exemption closely with obligations around conducting Data Protection Impact Assessments (DPIAs); the combined decision logic is sketched in code after this list.
Extension to SMCs: The proposal introduces the definition of SMCs in Article 4 GDPR and explicitly extends Articles 40(1) and 42(1) GDPR (covering codes of conduct and certification) to SMCs, reflecting their specific compliance needs.
Removal of Categorical Exclusions: The current rule under Article 30(5) that excludes certain processing from the exemption—such as processing special categories of personal data (Article 9) or criminal data (Article 10)—would be deleted. However, processing such data may still be high risk, triggering the record-keeping obligation through the risk-based approach.
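Taken together, the proposal reduces the record-keeping question to a simple rule. A minimal sketch follows, with hypothetical names and the caveat that the high-risk classification itself requires a full Article 35 analysis:

```typescript
// Sketch of the proposed Article 30(5) exemption. "likelyHighRisk"
// stands in for a full Article 35 GDPR risk assessment.
interface ProcessingActivity {
  name: string;
  likelyHighRisk: boolean;
}

function activitiesRequiringRecords(
  employees: number,
  activities: ProcessingActivity[],
): ProcessingActivity[] {
  // At 750 employees or more, records are required for all activities.
  if (employees >= 750) return activities;
  // Below the threshold, records are required only for the activities
  // likely to result in a high risk, not for all processing.
  return activities.filter((a) => a.likelyHighRisk);
}
```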
While the EDPB and EDPS support reducing administrative burdens for SMEs and SMCs, they stress that the simplification must not weaken data protection standards. They note several areas requiring clarification, including:
Why the new 750-employee threshold was chosen, given that earlier drafts mentioned 500.
The need to ensure public authorities and bodies remain subject to record-keeping obligations, to preserve accountability.
Precise language in the recitals clarifying that records are required only for the processing activities likely to result in a high risk, to avoid the misreading that a single high-risk activity triggers record-keeping obligations across all of an organization’s processing.
The authorities also underline the operational value of records of processing, not only for demonstrating compliance but for supporting broader GDPR obligations such as transparency, risk assessment, and responding to data subjects’ rights.
🇫🇷 CNIL Publishes Final Version of Its Guide on Transfer Impact Assessments
The French Data Protection Authority (CNIL) has published the final version of its guide dedicated to conducting “Analyse d’Impact des Transferts de Données” (AITD) — or Transfer Impact Assessments (TIAs).
The guide, initially released in draft for consultation in February 2023, aims to help data exporters determine whether the laws and practices of third countries ensure an essentially equivalent level of protection for personal data as guaranteed under the GDPR. The final version reflects feedback from stakeholders and incorporates clarifications and examples to assist organizations in implementing TIAs in practice.
Key elements of the final guide include:
Risk-Based Approach: The CNIL confirms that risk assessments should be specific, contextual, and focus on the likelihood and severity of potential interferences with data subjects’ rights in the destination country.
Legal Analysis: The guide emphasizes that exporters must examine both the legal framework and practices of public authorities in the third country, including laws enabling access to data for surveillance purposes, as well as safeguards and remedies available to individuals.
Practical Scenarios: The CNIL includes examples of transfer scenarios (e.g., intra-group transfers, outsourcing) and walks through how a TIA might be documented in each case.
Supplementary Measures: Where legal analysis identifies risks, the guide underscores the obligation to identify and implement supplementary measures — technical, contractual, or organizational — to ensure essentially equivalent protection.
Documentation and Accountability: The CNIL stresses the importance of keeping detailed records of the assessment process and decisions taken, as part of the accountability obligations under the GDPR; one possible record structure is sketched below.
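For teams building or reviewing TIA tooling, a typed record can keep assessments consistent and inspection-ready. The following structure is hypothetical, loosely tracking the steps described in the guide; the field names are ours, not the CNIL’s:

```typescript
// Hypothetical TIA record. The CNIL guide prescribes steps, not a schema;
// this structure only illustrates what a documented assessment might hold.
interface TransferImpactAssessment {
  transfer: {
    dataCategories: string[];      // e.g. HR data, customer contact details
    purpose: string;
    importer: string;
    destinationCountry: string;
  };
  transferTool: "SCCs" | "BCRs" | "other Article 46 safeguard";
  legalAnalysis: string;           // surveillance laws, access powers, redress
  identifiedRisks: string[];       // likelihood and severity of interference
  supplementaryMeasures: string[]; // technical, contractual, organizational
  conclusion: "proceed" | "proceed with measures" | "suspend transfer";
  assessedOn: string;              // ISO date of assessment
  nextReview: string;              // reassess as legal landscape evolves
}
```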
Recommendations for Businesses
Organizations transferring personal data outside the EEA should review the CNIL’s guide to ensure their existing TIA methodologies meet regulatory expectations.
Data exporters should maintain thorough records of TIAs as part of their GDPR accountability framework, ready for inspection by supervisory authorities.
Businesses should reassess the effectiveness of any supplementary measures in light of evolving legal and geopolitical risks in recipient countries.
🇬🇧 ICO Consults on New Approach to Regulating Online Advertising and Updates to Cookie Guidance
The UK Information Commissioner’s Office (ICO) has launched two consultations relevant to organizations involved in online advertising and the use of storage and access technologies, such as cookies and similar tracking tools.
1. Call for Views on Regulating Online Advertising
On July 7, 2025, the ICO issued a call for views to inform its strategic approach to regulating online advertising practices. The regulator notes that, despite long-standing guidance and requirements under UK data protection and e-privacy law, concerns persist about intrusive advertising models and widespread non-compliance, particularly regarding consent practices and data sharing across adtech ecosystems. Key aspects of the ICO’s consultation include:
Understanding stakeholder perspectives on harms and risks to individuals from current online advertising models.
Exploring the role of privacy-enhancing technologies and alternative advertising models.
Examining how regulatory interventions could improve compliance without disproportionate impact on businesses.
Exploring how publishers might deliver privacy-preserving advertising to users who have not given consent, where the risks are demonstrably low.
The ICO also sets out its planned regulatory trajectory. In early 2026, it intends to publish a statement identifying advertising activities that are unlikely to trigger enforcement action under the Privacy and Electronic Communications Regulations (PECR). The ICO will outline safeguards it expects to see in place to reduce risks to users.
This is intended to facilitate new, privacy-conscious approaches to online advertising and to support the UK government’s development of planned secondary legislation. The government is considering amendments to PECR, including the creation of a new exception to the consent requirements for specific low-risk advertising purposes.
2. Consultation on New Chapter in Guidance on Storage and Access Technologies
Separately, the ICO has opened consultation on a proposed new chapter within its updated Guidance on the Use of Storage and Access Technologies, which addresses technologies like cookies, SDKs, device fingerprinting, and similar tools under Regulation 6 of PECR.
The July 2025 update reflects changes introduced by the Data (Use and Access) Act and includes a new chapter, “What are the exceptions?”, explaining the five exceptions to the requirement for consent before storing or accessing information on users’ devices.
The ICO notes that, apart from these updates, the guidance remains in draft form following its significant revision in December 2024 and will be finalized after this second consultation on the new chapter explaining the exceptions under PECR.
The consultation period for the new guidance chapter runs until 9 September 2025.
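To make the compliance point concrete, here is a minimal sketch of gating non-essential device storage on consent. It assumes a hypothetical consent-management store and an illustrative purpose taxonomy; neither reflects the ICO’s definitive list of exceptions:

```typescript
// Sketch: store or read information on a device only where an exception
// applies or the user has consented. Purposes and storage key are made up.
type StoragePurpose = "strictly-necessary" | "analytics" | "advertising";

function hasConsent(purpose: StoragePurpose): boolean {
  // Assumes a consent-management platform records granular choices;
  // storing the consent record itself is generally treated as exempt.
  const consents = JSON.parse(localStorage.getItem("cmp-consents") ?? "{}");
  return consents[purpose] === true;
}

function mayStore(purpose: StoragePurpose): boolean {
  if (purpose === "strictly-necessary") return true; // exception applies
  return hasConsent(purpose); // all other purposes need prior consent
}

if (mayStore("analytics")) {
  document.cookie = "site_analytics=1; Secure; SameSite=Lax; Max-Age=2592000";
}
```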
Recommendations for Businesses
Review current online advertising practices in light of the ICO’s call for views and consider contributing feedback, particularly if engaged in programmatic advertising, profiling, or data sharing.
Monitor developments regarding the ICO’s planned regulatory statement in 2026.
Assess compliance with PECR and the UK GDPR for all technologies used to store or access information on user devices, not just cookies.
Ensure consent mechanisms are granular, transparent, and easy to use, avoiding designs that pressure users into agreeing.
Follow the ICO’s consultation process to anticipate potential legislative changes and shifts in enforcement priorities.
🇬🇧 UK Government Publishes Chronic Risks Analysis, Highlighting Long-Term Cyber and Privacy Threats
On July 8, 2025, the UK Government published the Chronic Risks Analysis, a companion document to the National Risk Register 2025. The report provides an in-depth examination of long-term, systemic risks that could erode national security, economic stability, and societal well-being. Unlike the National Risk Register, which focuses on acute crises requiring immediate emergency response, this document identifies persistent, interlinked threats with potential to shape the UK’s risk landscape over decades.
Several sections in the Chronic Risks Analysis are directly relevant to privacy and cybersecurity practitioners:
Changing Nature of Cybersecurity Threats: The report underscores the increasing complexity and severity of cyber attacks, driven by ransomware, the expansion of cloud services, and AI-enabled offensive capabilities. The National Cyber Security Centre (NCSC) reports that 43% of UK businesses experienced a cyber incident within the past year. Growing use of IoT devices and remote working infrastructure has expanded the attack surface, while a shortage of cybersecurity skills continues to hamper resilience efforts.
Dominance of Global Technology Companies: The report raises concerns over market concentration among a handful of technology giants. These firms’ significant control over cloud services, data flows, and digital infrastructure can exacerbate systemic risks, restrict market competition, and increase the stakes of large-scale cybersecurity incidents or data breaches. Two providers hold approximately 70–90% of the UK’s cloud computing market, creating potential single points of failure.
Artificial Intelligence Risks: The analysis identifies advanced AI systems as a source of both opportunity and risk. While AI has transformative potential across sectors, it also introduces privacy risks through mass data collection and potential biases in automated decision-making. The possibility of AI-enabled disinformation campaigns, cyber attacks, and even bioweapon design highlights AI’s dual-use nature.
Reliance on Digital Platforms: The increasing dependence on digital services for commerce, government operations, and personal life creates cascading vulnerabilities. A growing digital divide leaves vulnerable populations at greater risk of exclusion, while widespread digital reliance raises the stakes for systemic disruptions due to cyber attacks or service outages.
Recommendations for Businesses
Review enterprise risk registers to include chronic cyber threats and emerging privacy risks identified in the report.
Assess dependencies on major technology providers and explore diversification or contingency plans where feasible.
Evaluate AI use cases within the organization for privacy, security, and bias risks, and prepare for potential regulatory developments in AI governance.
Enhance incident response plans to account for prolonged or systemic disruptions, such as large-scale cloud outages or cascading cyber attacks.
🇵🇱 Polish DPA Questions Supreme Court’s Anonymization Practices
On 9 July 2025, the Polish Data Protection Authority (UODO) announced that its President, Mirosław Wróblewski, has formally approached the First President of the Supreme Court of Poland, Dr. hab. Małgorzata Manowska, over concerns regarding documents published in a manner that could still allow individuals to be identified.
The DPA’s concerns relate to images of handwritten electoral protest letters that were posted on the Supreme Court’s website and social media channels. While names and other direct identifiers were redacted, the handwriting itself could enable identification of the individuals who submitted the protests. The UODO President emphasised that handwriting, as a feature that can be linked to a specific individual, constitutes personal data under the GDPR.
As a result, the President of the UODO has requested the Supreme Court to:
explain why such documents were published in this form;
provide details of the procedures used to anonymize documents prior to publication online and on social media, including methods for removing features that might enable identification (e.g., handwriting);
clarify whether the Data Protection Officer was involved in designing these procedures;
share the rules governing publication of content online and on social media that may contain personal data.
From a data protection perspective, this case serves as a reminder that anonymization does not end with redacting names and basic identifiers. Under Article 4 of the GDPR, personal data includes any information relating to an identified or identifiable natural person—including features such as handwriting that may indirectly enable identification. While the law does not set out precise technical instructions for anonymization, Poland’s Open Data and Re-use of Public Sector Information Act defines it as a process that ensures individuals cannot be identified, directly or indirectly, from the information published.
🇪🇸 AEPD Fines Logistics Company for Collecting Criminal Records and Family Data During Recruitment
The Spanish Data Protection Authority (AEPD) has fined Plataforma Cabanillas S.A. a total of €100,000 for breaching the GDPR’s data minimization principle during hiring processes.
In July 2023, a job candidate complained to the AEPD that Plataforma Cabanillas had required a criminal record certificate (certificado de antecedentes penales) as a condition for a job interview and potential employment, and had also collected personal details about civil status and number of children during the selection process, ostensibly for future completion of Spain’s tax withholding form (Modelo 145).
Plataforma Cabanillas, a logistics company involved in air cargo operations, defended the practice, arguing that air security regulations demand pre-employment background checks for staff potentially accessing air cargo zones, and that gathering tax-related personal data early in the process improves administrative efficiency for onboarding.
The AEPD rejected these arguments, concluding:
Although EU aviation security regulations (Regulation (EU) 2015/1998) require background checks for certain roles, the AEPD clarified:
The regulation applies only to individuals already selected for posts involving access to air cargo areas—not to all applicants indiscriminately during the interview stage.
Companies need only inform candidates upfront that employment will be conditional on providing the required certificate if selected. Proactively demanding the document from all interviewees violates the principle of data minimization.
The AEPD found no legal basis for collecting data on civil status and number of children before establishing an employment relationship:
Modelo 145 obligations arise only once employment exists. Collecting such personal data during recruitment exceeds what is necessary and infringes the GDPR’s purpose limitation and minimization requirements.
The AEPD considered the violations serious:
Plataforma Cabanillas is a medium-sized enterprise handling personal data systematically as part of recruitment.
The company should have exercised greater diligence given its professional operations and the sensitive nature of criminal data.
The Authority stressed that subsequent process improvements by the company, while positive, do not eliminate liability for past unlawful practices.
The AEPD imposed a €75,000 fine for requiring criminal record certificates prematurely and a €25,000 fine for collecting personal data for Modelo 145 before establishing employment.
Recommendations for Businesses
Review Pre-employment Screening: Only request criminal background checks where legally required and only once a candidate is selected, not during the initial application phase.
Limit Data Collection in Recruitment: Avoid gathering personal information unrelated to evaluating job suitability or required only after employment begins.
Transparency in Job Notices: Where certain documents will be mandatory if hired, inform candidates clearly in job postings to avoid creating misleading expectations.
Document Legal Basis: Maintain written records linking any data collected during recruitment to specific legal obligations.
***
Direct your questions to groundcontrol@kepler.consulting.
Until the next transmission, stay secure and steady on course. Ground Control, out.