Privacy & Cybersecurity #26
EU AI Literacy | UK Software Security Code | UK Data Bill Progress | Estonia’s E-commerce Security Guidance | Belgian DPA Ruling on FATCA
🇪🇺 EU Clarifies AI Literacy Obligations Under Article 4 of the AI Act
The European Commission has released a detailed Q&A document addressing the AI literacy requirements stipulated in Article 4 of the AI Act, which came into application on 2 February 2025. The guidance aims to assist AI system providers and deployers in understanding and implementing measures to ensure that individuals involved with AI systems possess adequate knowledge and skills.
What is the AI Act?
The AI Act is a comprehensive regulation adopted by the European Union in 2024, aimed at ensuring that artificial intelligence (AI) systems placed on the EU market are safe, transparent, and respect fundamental rights. It is the world’s first major horizontal legislation on AI and part of the EU’s Digital Strategy. The AI Act applies extraterritorially, meaning it covers not only EU-based companies but also non-EU providers and deployers of AI systems if their systems are used in the EU. It affects providers of AI systems (developers and manufacturers), deployers (users of AI systems in professional contexts), importers and distributors, and providers of general-purpose AI models (e.g., foundation models such as GPT).
The Act follows a risk-based approach, classifying AI systems into four categories:
Unacceptable risk: Prohibited systems (e.g., real-time biometric surveillance in public spaces, social scoring).
High risk: Strict obligations for AI used in critical sectors like healthcare, transport, employment, education, law enforcement, or migration.
Limited risk: Transparency obligations.
Minimal risk: Most AI applications are largely unregulated under the Act.
The AI Act entered into force on 1 August 2024. Banned practices became prohibited six months later, on 2 February 2025. General-purpose AI model obligations apply after 12 months, from 2 August 2025. And most high-risk system requirements (e.g. conformity assessments, documentation, risk management) become enforceable after 24 months, from 2 August 2026.
The AI Act is directly applicable in all EU Member States without the need for national implementing legislation, and it has binding legal effect across the EU according to its phased timeline. Member States are expected to designate or establish national supervisory authorities to enforce the Act, and these authorities will need to coordinate with the newly created European AI Office.
Article 4 of the AI Act mandates that providers and deployers of AI systems must ensure a sufficient level of AI literacy among their staff and any other individuals operating or using AI systems on their behalf. This encompasses:
Definition of AI Literacy: As per Article 3(56) of the AI Act, 'AI literacy' refers to the skills, knowledge, and understanding that enable individuals to make informed decisions about deploying AI systems and to be aware of the associated opportunities and risks.
Target Audience: The requirement extends beyond direct employees to include contractors, service providers, and clients who interact with AI systems.
Assessment of AI Literacy: While there is no explicit obligation to measure employees' AI knowledge, organizations are expected to consider the technical knowledge, experience, education, and training of their personnel when implementing AI literacy measures.
The Commission emphasizes a flexible, context-specific approach to AI literacy:
Organizations should tailor their AI literacy initiatives based on their role (provider or deployer) and the risk level of the AI systems they handle.
There is no prescribed format for AI literacy programs. Organizations may choose various methods, such as training sessions, workshops, or informational materials, to enhance AI literacy.
While no specific industries are singled out, the context in which AI systems are used, including sector and purpose, should inform the development of AI literacy initiatives.
To support organizations in meeting AI literacy requirements, the Commission offers the following resources:
Living Repository: A collection of AI literacy practices from various organizations is available, providing examples and inspiration for implementing AI literacy programs.
Webinars and Events: The Commission organizes events, such as the AI Pact webinars, to discuss AI literacy and share best practices.
Support for SMEs: European Digital Innovation Hubs (EDIHs) offer services, including training and workshops, to assist SMEs and public sector organizations in enhancing AI literacy.
Further Information: AI Talent, Skills, and Literacy
🇬🇧 UK Government Publishes Software Security Code of Practice
On 7 May 2025, the UK Department for Science, Innovation and Technology (DSIT), in collaboration with the National Cyber Security Centre (NCSC), released a voluntary Software Security Code of Practice aimed at improving the security and resilience of software supply chains. This initiative is part of a wider effort to address systemic risks posed by insecure software development practices and to encourage consistent security baselines across the commercial software sector.
The Code consists of 14 principles grouped into four thematic areas:
Secure Design and Development
Build Environment Security
Secure Deployment and Maintenance
Customer Communication
Each principle outlines expected controls and process outcomes that software vendors should implement. The Code applies primarily to:
Software developers and distributors (full scope),
Software resellers (limited scope: principles 3–4),
In-house software developers (principles 1–3 where applicable).
While not legally binding, the Code sets clear baseline expectations and aligns with international initiatives such as:
The US NIST Secure Software Development Framework (SSDF),
The EU Cyber Resilience Act.
Key Principles
1. Secure Design and Development
Implement an established secure development framework.
Assess risks from third-party components.
Apply "secure by design" and "secure by default" principles across the software lifecycle.
2. Build Environment Security
Protect build environments from unauthorized access.
Log and control changes in the build environment (a minimal logging sketch follows this list).
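The Code does not prescribe tooling for this principle. As one possible illustration, changes to build configuration could be recorded in an append-only, hash-chained log so that later tampering with earlier entries is detectable; a minimal sketch, with hypothetical field names and log path:

```python
import hashlib
import json
from datetime import datetime, timezone

def append_audit_record(log_path: str, actor: str, change: str) -> None:
    """Append a tamper-evident record of a build-environment change.
    Each record embeds a hash of the log as it stood before the append,
    so any later modification of earlier entries breaks the chain."""
    try:
        with open(log_path, "rb") as f:
            prev_hash = hashlib.sha256(f.read()).hexdigest()
    except FileNotFoundError:
        prev_hash = "0" * 64  # first entry in a fresh log
    record = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "actor": actor,      # hypothetical: who made the change
        "change": change,    # hypothetical: what was changed
        "prev": prev_hash,
    }
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```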
3. Secure Deployment and Maintenance
Distribute software securely (a checksum-verification sketch follows this list).
Maintain a clear vulnerability disclosure process.
Provide timely patches and security updates.
Notify relevant parties of vulnerabilities where appropriate.
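The Code itself stays at the level of outcomes. One common supporting practice, not mandated by the Code, is publishing cryptographic digests alongside release artifacts so recipients can verify integrity before installation; a minimal sketch, with hypothetical file name and digest:

```python
import hashlib
from pathlib import Path

def verify_sha256(artifact: Path, expected_hex: str) -> bool:
    """Compare a downloaded artifact's SHA-256 digest against the
    digest published by the vendor (e.g. in signed release notes)."""
    digest = hashlib.sha256()
    with artifact.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex.lower()

# Hypothetical usage; the file name and digest are placeholders.
# if not verify_sha256(Path("vendor-tool-1.2.3.tar.gz"), "3f5a..."):
#     raise SystemExit("Checksum mismatch: do not install.")
```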
4. Customer Communication
Clearly define support timelines and end-of-life notifications (at least 1 year in advance).
Disclose significant security incidents that may impact customers.
A self-assessment template is available to help vendors evaluate their compliance with the Code. The assurance framework is based on the Principles-Based Assurance (PBA) model developed by the NCSC. Vendors may be asked by customers to share their completed self-assessments as part of procurement processes.
A certification scheme based on the Code is planned but still under development.
The Code expects vendor organisations to designate a Senior Responsible Owner (SRO) at the executive level to oversee implementation. It also emphasizes the need for qualified and trained personnel to execute secure development practices, and references relevant support initiatives such as NCSC-certified degree programs and UK Cyber Security Council professional pathways.
🇬🇧 UK Data (Use and Access) Bill Enters Final Parliamentary Phase
The UK’s Data (Use and Access) Bill is nearing the final stage of the legislative process. Following completion of the Commons Report Stage on 7 May 2025, the Bill is now in the "consideration of amendments" phase as the House of Lords prepares for debate on 19 May 2025.
The Bill aims to modernize data use across the UK. Its key components include:
Smart data expansion beyond finance (open banking) to other sectors.
Digital ID regulation through a new trust framework and register.
Statutory basis for the National Underground Asset Register (NUAR).
Digitization of civil registration (births and deaths).
Changes to UK data protection law, including processing for research.
Abolition of the ICO, replaced by a new “Information Commission”.
Sectoral rules for data use in health, energy, public services, and online safety.
During Commons committee and report stages, several Lords-originated provisions were removed:
Clauses requiring transparency around the use of copyrighted works to train AI models were deleted, with the government citing financial privilege and its ongoing consultation on AI and copyright.
Requirements to verify the accuracy of personal data in public systems were removed as redundant or privacy-invasive.
A proposed “public interest” test for processing personal data for research was rejected to avoid chilling effects on curiosity-driven research.
A proposal to raise the age of consent from 13 to 16 was rejected.
After the government blocked a Lords amendment requiring AI firms to declare the use of copyrighted materials in training datasets, a revised version has been submitted by Crossbench peer Baroness Kidron. This softer version would permit, rather than oblige, the government to introduce transparency requirements ("may" rather than "must").
The House of Lords is scheduled to debate the new proposal on 19 May 2025.
The government argues that a broader, holistic legislative approach is preferable to addressing AI-related copyright via the Data Bill.
The Bill is in its final legislative phase, known as "ping pong," where the Lords and Commons reconcile differences. If the Lords insist on key amendments—particularly on AI and copyright—this could prolong passage or prompt new government concessions.
🇪🇪 Estonia’s Data Protection Authority Publishes Security Guidance for E-Commerce Operators
On 17 April 2025, the Estonian Data Protection Inspectorate (Andmekaitse Inspektsioon, AKI) released updated data security recommendations for e-commerce operators. The guidance document, titled Andmeturbe soovitused e-poodidele v.4.0 (Data Security Recommendations for Online Stores, v4.0), outlines procedural expectations for ensuring lawful processing and protection of personal data under the General Data Protection Regulation (GDPR) and Estonia’s Electronic Communications Act (ESS).
Legal Basis for Processing
AKI clarifies that personal data processing in online stores must rest on a legal basis under Article 6(1) GDPR, typically the performance of a contract. Additional purposes, such as sending newsletters or analytics via cookies, require separate, freely given consent. The privacy policy must clearly state:
The data controller’s identity and contact details;
The categories and purposes of data processing;
Legal bases for each processing operation;
Data recipients and retention periods;
Rights of data subjects under GDPR Articles 12–22.
Minimization and Purpose Limitation
Data collection should be limited to what is necessary and proportionate to the stated purpose. For instance, requiring a residential address when delivering to a pickup locker may exceed necessity.
Direct Marketing and Consent Requirements
For direct marketing via electronic contact details (e.g., email, SMS), consent is required under ESS §103. Consent must not be presumed, must be demonstrable, and withdrawal must be as easy as granting it.
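Since consent must be demonstrable and withdrawal as easy as granting, operators typically keep an auditable consent record per contact and channel. A minimal sketch of such a record; the field names are illustrative, not taken from the guidance:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class MarketingConsent:
    """One marketing-consent record, kept so the operator can
    demonstrate when and how consent was given (ESS §103, GDPR Art. 7)."""
    contact: str                      # e.g. an e-mail address
    channel: str                      # "email" or "sms"
    granted_at: datetime
    source: str                       # where/how consent was collected
    withdrawn_at: datetime | None = None

    def is_active(self) -> bool:
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        # Withdrawal must be as easy as granting: one step, no conditions.
        self.withdrawn_at = datetime.now(timezone.utc)
```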
Account Management and Authentication
Operators should:
Avoid requiring account creation unless essential.
Offer multi-factor authentication (2FA/MFA) for registered users.
Ensure secure password practices, including prohibiting default passwords and requiring periodic change (a minimal policy check is sketched below).
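The guidance does not mandate a particular mechanism. As one possible reading of the password recommendations, a small server-side check could reject default or too-short passwords; the deny-list and minimum length below are illustrative assumptions:

```python
# Illustrative deny-list; in practice this would be a much larger set
# of known default and breached passwords.
DEFAULT_PASSWORDS = {"admin", "password", "123456", "changeme"}

def password_acceptable(candidate: str, min_length: int = 12) -> bool:
    """Reject default/common passwords and enforce a minimum length,
    in line with the account-management recommendations."""
    if len(candidate) < min_length:
        return False
    return candidate.lower() not in DEFAULT_PASSWORDS
```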
Outsourcing and Processors
Third-party service providers processing personal data (e.g., payment processors, analytics services) must be bound by a data processing agreement (GDPR Art. 28). Operators retain responsibility for compliance of processors and should conduct periodic audits.
Use of Artificial Intelligence
Where AI systems (e.g., chatbots) are supplied by third parties, those providers may qualify as processors. The guidance reiterates that such systems remain subject to data protection principles, including transparency, accountability, and data minimization.
Platform Security Controls
Recommended measures include:
Applying encryption for data in transit and at rest (an encryption-at-rest sketch follows this list);
Regular software updates and patch management;
Monitoring and logging access to personal data;
Using secure design to avoid deceptive patterns that may hinder the exercise of data subject rights.
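For the encryption-at-rest point, one common approach, which AKI does not specifically prescribe, is authenticated symmetric encryption via a maintained library such as Python’s cryptography package; a minimal sketch, with key management deliberately out of scope:

```python
from cryptography.fernet import Fernet

# In production the key would come from a secrets manager or KMS,
# never from source code or the application database.
key = Fernet.generate_key()
fernet = Fernet(key)

ciphertext = fernet.encrypt(b"customer@example.com")  # data at rest
assert fernet.decrypt(ciphertext) == b"customer@example.com"
```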
Cookies and Tracking Pixels
Non-essential cookies and tracking pixels require prior informed consent. The user must be informed about third-party access, duration, and purposes of tracking. Withdrawal of consent must be straightforward.
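As one illustration of gating non-essential cookies on prior consent (the cookie names and consent purposes below are hypothetical, not from the guidance):

```python
def set_cookies(response_headers: list[tuple[str, str]],
                consented_purposes: set[str]) -> None:
    """Attach cookies to an HTTP response, setting each
    non-essential cookie only if its purpose was consented to."""
    # Strictly necessary session cookie: no consent required.
    response_headers.append(("Set-Cookie", "session_id=abc123; HttpOnly; Secure"))

    # Hypothetical analytics cookie: requires prior, informed consent.
    if "analytics" in consented_purposes:
        response_headers.append(("Set-Cookie", "analytics_id=xyz789; Secure"))
```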
Incident Response and Notifications
In case of a personal data breach, the controller must:
Act promptly to contain the breach;
Assess the risk to individuals’ rights and freedoms;
Notify the Data Protection Inspectorate within 72 hours where required (GDPR Art. 33; a deadline sketch follows this list);
Notify affected individuals if the breach is likely to result in high risk (Art. 34);
Maintain documentation of all breaches, regardless of notification obligations.
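The Article 33 clock runs from the moment the controller becomes aware of the breach, so the notification deadline is straightforward to compute and track; a minimal sketch, with an illustrative timestamp:

```python
from datetime import datetime, timedelta, timezone

ART_33_WINDOW = timedelta(hours=72)

def notification_deadline(became_aware_at: datetime) -> datetime:
    """GDPR Art. 33: notify the supervisory authority without undue
    delay and, where feasible, within 72 hours of becoming aware."""
    return became_aware_at + ART_33_WINDOW

aware = datetime(2025, 4, 17, 9, 30, tzinfo=timezone.utc)  # illustrative
print(notification_deadline(aware))  # 2025-04-20 09:30:00+00:00
```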
Recommendations for Businesses
Establish a clear legal basis for all processing activities.
Design and publish a standalone Privacy Policy rather than embedding privacy terms inside general Terms & Conditions. Include, at a minimum: the data controller’s identity and contact details; the purposes, legal bases, and retention periods; the categories of personal data processed; data subjects’ rights and how to exercise them; and information about third-party recipients or international transfers.
Apply data minimization and purpose limitation.
Implement multi-factor authentication (MFA).
Enforce secure password practices.
Manage third-party vendors and processors: sign Data Processing Agreements (DPAs), vet providers for security maturity and compliance, conduct regular audits, and limit data access to what is necessary.
Ensure cookie and tracker compliance and implement a compliant cookie banner: no pre-checked boxes, clear information on duration, purpose, and third parties, and easy withdrawal of consent.
Secure all data transfers and storage.
Develop an incident response plan.
Avoid deceptive design ("Dark Patterns").
Update and monitor web platforms proactively.
Perform regular privacy and security audits.
🇧🇪 Belgian DPA Rules on FATCA
On 19 March 2025, the Belgian Data Protection Authority (APD) issued Decision No. 79/2025, addressing the compatibility of Belgium’s implementation of the FATCA (Foreign Account Tax Compliance Act) intergovernmental agreement with the General Data Protection Regulation (GDPR). The proceedings were initiated upon complaint by a Belgian citizen with dual U.S. nationality.
FATCA is a U.S. federal law enacted in 2010 requiring financial institutions outside the United States to report financial account information of U.S. citizens to the IRS. Belgium implemented FATCA through an Intergovernmental Agreement (IGA) with the U.S., formalized by a 2016 law.
The case examined whether this legal arrangement and the resulting data transfers complied with GDPR.
The APD's decision followed a multi-phase investigation:
An initial investigation by the Inspection Service closed with no finding of GDPR violations;
A subsequent decision of the APD’s Litigation Chamber reopened the inquiry, mandating additional analysis;
A supplementary investigation was conducted, leading to the March 2025 ruling.
The decision also involved an assessment of the role and independence of the APD as a supervisory authority under Article 51 GDPR and relevant CJEU case law on data transfers.
The APD analyzed the situation under several GDPR provisions. Key areas of analysis included:
Purpose Limitation and Necessity (Articles 5(1)(b) and (c)). The decision discussed whether the purposes of FATCA-related processing were specific, explicit, and legitimate, and whether the scope of data collected and transferred was limited to what was necessary for those purposes.
Data Transfers to Third Countries (Articles 44–46). Particular attention was given to whether the FATCA IGA provided appropriate safeguards for personal data transferred to the United States. The decision examined:
The absence or presence of enforceable data subject rights in the recipient jurisdiction;
The availability of judicial or administrative redress mechanisms for EU data subjects;
Compatibility of the U.S. legal framework with the GDPR's standards on international transfers.
Accountability and Risk Assessments (Articles 24, 25, 35). The authority evaluated the extent to which the Belgian tax authority had implemented appropriate technical and organizational measures, including whether a Data Protection Impact Assessment (DPIA) had been conducted prior to the data processing activities.
Transparency Obligations (Articles 12 and 14). The investigation also addressed whether data subjects had been properly informed about the processing, its legal basis, and the implications of cross-border transfers.
The APD issued a reprimand to the Belgian Federal Public Service Finance and set a 12-month deadline for remedial actions. These include:
Conducting a GDPR-compliant DPIA;
Taking steps to ensure appropriate safeguards for international transfers;
Improving transparency and data subject notification mechanisms.
The decision referred to several key legal sources:
Articles 5, 12–14, 24, 25, 35, and 44–46 of the GDPR;
The Schrems II judgment (CJEU, Case C-311/18) on third-country transfers;
The Charter of Fundamental Rights of the EU, especially Articles 7 and 8;
The APD’s independence under Article 51 GDPR;
Belgian national law implementing FATCA (2016).
The APD also addressed the interaction between EU data protection law and the Belgian Constitutional Court’s prior decision on the legality of FATCA implementation.
This decision offers a detailed case study on the intersection between international tax cooperation mechanisms and EU data protection law. Key procedural and substantive takeaways include:
The importance of conducting and documenting DPIAs when systemic data transfers are involved;
The need to assess the ongoing adequacy of legal mechanisms for cross-border transfers, even if they stem from international treaties;
Supervisory authorities' role in reviewing not only private-sector processing but also state-to-state arrangements with data protection implications.
***
Direct your questions to groundcontrol@kepler.consulting.
Until the next transmission, stay secure and steady on course. Ground Control, out.