KEPLER Privacy & Cybersecurity #4
Canada Cybersecurity; Bill C-26; Germany Employee Data Protection Act; FTC; Data Brokers; Consumer Financial Protection Bureau; Noyb; Mozilla; Firefox; Stripe Privacy Policy
🇨🇦 Canadian Cybersecurity Bill C-26 Returned with Amendments from Senate to House of Commons
Bill C-26, titled "An Act respecting cyber security, amending the Telecommunications Act and making consequential amendments to other Acts," introduced in June 2022, aims to bolster cybersecurity for critical infrastructure.
Passed by the House of Commons on June 19, 2024, it underwent Senate review starting September 19, 2024.
The Standing Senate Committee on National Security, Defence and Veterans Affairs reviewed the bill and presented a report with an amendment on December 3, 2024. The Senate adopted the committee's report on December 4, 2024, and passed the bill with amendments on December 5, 2024 for House of Commons consideration.
As of December 12, 2024, Bill C-26 is awaiting consideration of the Senate's amendments by the House of Commons. If the House accepts these amendments, the bill will proceed to receive Royal Assent, becoming law. If the House rejects or further amends the Senate's changes, the bill will return to the Senate for additional deliberation.
Senate's Amendments
The Senate made the following amendments to address concerns raised during committee review:
Corrected a technical overlap between Bill C-26 and Bill C-70, ensuring no cybersecurity provisions were unintentionally removed.
Added stricter government reporting requirements for cybersecurity directives to enhance transparency.
Strengthened privacy safeguards by mandating compliance with the Privacy Act and related frameworks.
Industries targeted by the Bill:
Telecommunications: Major providers and their supply chains.
Finance: Banks, payment processors, insurance companies, and fintech.
Energy: Utilities, pipelines, renewable energy operators, and oil and gas companies.
Transportation and Logistics: Airlines, rail operators, port authorities, and logistics providers.
Healthcare: Hospitals and digital health platforms.
Water and Waste Management: Regional water utilities and waste treatment operators.
Manufacturing: Companies producing critical components for national security.
Implications of Bill C-26:
Designated operators managing critical cyber systems must implement cybersecurity programs, address supply chain risks, report cybersecurity incidents, and comply with government directives.
Mandatory record-keeping practices require operators to maintain data within Canada as specified by regulators.
The Governor in Council has the authority to designate vital services or systems and classify operators accordingly.
Non-compliance could result in penalties: fines up to $15 million for organizations and $1 million for directors or officers.
Smaller Companies and Third Parties:
Small and medium-sized enterprises (SMEs) that supply goods or services to critical infrastructure operators may be affected indirectly. These companies may face increased scrutiny and compliance requirements in their supply chain relationships.
Recommendations for Businesses:
Conduct a cybersecurity risk assessment to identify vulnerabilities.
Update cybersecurity policies to align with anticipated regulations.
Ensure readiness for regulatory audits and incident reporting.
🇩🇪 German Draft Employee Data Protection Act
Germany’s Draft Employee Data Protection Act (Beschäftigtendatengesetz, BeschDG) introduces comprehensive measures to modernize and regulate employee data processing in Germany, ensuring transparency, accountability, and the protection of fundamental rights in the workplace.
The Federal Ministry of Labour and Social Affairs (BMAS) and the Federal Ministry of the Interior (BMI) published the draft as a reference draft on October 8, 2024. The draft is undergoing consultations with stakeholders, including industry representatives, trade unions, and legal experts. The next step will be submission to the Bundestag (German Parliament) for debate and potential amendments.
Key provisions:
Data processing is allowed for specific purposes, such as hiring, fulfilling legal obligations, ensuring workplace safety, or protecting vital interests.
Profiling and AI-based decisions require transparency, safeguards, and human oversight.
Employees must be informed about data processing, particularly when AI or profiling is used.
New data subjects’ rights include explanations of profiling decisions, feedback on results, and requests for reviews.
Long-term surveillance is restricted to critical purposes, such as life safety or protecting high-value assets.
The law prohibits the use of profiling or surveillance for performance monitoring.
Emotion recognition and analysis of social relationships through communication data are explicitly banned.
Employers must adopt robust technical and organizational measures to minimize risks, including data pseudonymization and transparency about AI algorithms.
Violations can result in:
Prohibition of using unlawfully processed data as evidence in legal proceedings.
Enforcement actions and fines aligned with GDPR standards.
Recommendations for Businesses:
Ensure the person responsible for data protection in your company is involved early in planning any employee data processing measures, particularly profiling, AI use, or long-term surveillance.
Define and document the purposes for data processing, including the personal aspects being evaluated, the use of results, and the duration of processing.
Conduct data protection impact assessments (DPIAs) for high-risk processing activities, such as profiling and AI applications.
Clearly communicate the purpose of data processing, the categories of data collected, and any AI or profiling systems in use, including their logic, criteria, and potential outcomes.
Be prepared to provide explanations of profiling decisions, including human oversight and the role of data in decision-making.
Do not use profiling for emotion recognition or analyzing social relationships based on communication data.
Do not repurpose data collected for one purpose to evaluate employee performance.
Use transparent and explainable mathematical or statistical models.
Implement safeguards against discriminatory outcomes.
Ensure decisions based on AI or profiling include meaningful human review.
Limit long-term monitoring to critical purposes.
Explicitly avoid using monitoring data for performance evaluation.
Involve data protection officers in designing and implementing any surveillance measures.
Use pseudonymization, encryption, and secure storage to protect employee data.
Aggregate data where possible to minimize individual risk.
Train managers and HR teams on the new obligations.
Allocate resources for compliance.
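The technical measures above (pseudonymization, aggregation) can be sketched in a few lines. This is a minimal illustration of the pattern, not a statement of what the BeschDG requires; the key-handling details and field names are assumptions.

```python
import hmac
import hashlib
from collections import Counter

# Secret pseudonymization key -- in practice this would live in a key
# management system, separate from the data, and be rotated periodically.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(employee_id: str) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256).

    Unlike a plain hash, an HMAC cannot be reversed by brute-forcing the
    small space of employee IDs without access to the key.
    """
    return hmac.new(SECRET_KEY, employee_id.encode(), hashlib.sha256).hexdigest()

def aggregate_by_department(records: list[dict]) -> dict[str, int]:
    """Report counts per department rather than per-person results,
    minimizing individual risk where person-level data is not needed."""
    return dict(Counter(r["department"] for r in records))

records = [
    {"employee_id": "E-1001", "department": "HR"},
    {"employee_id": "E-1002", "department": "IT"},
    {"employee_id": "E-1003", "department": "IT"},
]

# Pseudonymize before any downstream analysis touches the data.
pseudonymized = [
    {"employee_ref": pseudonymize(r["employee_id"]), "department": r["department"]}
    for r in records
]
print(aggregate_by_department(records))  # {'HR': 1, 'IT': 2}
```

Pseudonymization is reversible by whoever holds the key, so under the GDPR the output remains personal data; it reduces risk but does not remove the processing from the law's scope.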
🇺🇸 FTC vs. Data Brokers Gravy Analytics and Venntel for Unlawfully Selling Consumer Geolocation Data
The Federal Trade Commission (FTC) has taken action against data brokers Gravy Analytics and its subsidiary Venntel for violating consumer privacy laws by unlawfully collecting, selling, and using precise geolocation data without adequate consumer consent. The FTC alleged that these companies sold sensitive consumer location data to third parties, enabling tracking of movements to religious institutions, healthcare facilities, and political gatherings. These actions were deemed harmful, violating Section 5 of the FTC Act, and the case sets a compliance benchmark for the data broker industry.
Allegations:
Gravy Analytics and Venntel collected over 17 billion daily location signals from about a billion mobile devices.
Data was sold to private entities and government clients, often revealing highly sensitive patterns of consumer behavior, including visits to places of worship and health-related events.
The companies obtained data through third-party apps but failed to verify that consumers had consented to such use.
Consent notices were often misleading or absent, leaving consumers unaware of how their data was being used.
Gravy Analytics derived insights from location data to create profiles based on sensitive attributes such as religion, political affiliation, and medical conditions. These profiles were sold to advertisers and other entities.
The FTC issued a Decision and Order prohibiting Gravy Analytics and Venntel from:
Selling or using location data tied to sensitive sites without express consent.
Continuing data collection practices without rigorous supplier assessments ensuring proper consumer consent.
Retaining or using sensitive data collected before the enforcement order unless it is deidentified.
The order mandates a robust "Sensitive Location Data Program," regular supplier assessments, and transparent mechanisms for consumers to request deletion of their data from company databases.
Recommendations for Businesses:
Clearly communicate to data subjects what data is collected, why, and how it will be used. Avoid vague or buried disclosures in privacy policies.
Use affirmative mechanisms for consent (e.g., opt-in checkboxes) and avoid consent via default or passive methods like pre-checked boxes.
Only collect data necessary for your operations or services. Avoid collecting sensitive data unless explicitly required.
Implement additional safeguards for sensitive data such as geolocation, health, or financial information.
Conduct thorough assessments of third-party data providers or partners to ensure they comply with privacy laws and ethical data practices.
Include clauses requiring partners to obtain consumer consent and prohibit the misuse or resale of data.
Ensure data is deidentified whenever possible and prevent reidentification through technical safeguards.
Limit access to sensitive data to only those employees or partners who need it.
Train employees on privacy best practices and data security protocols.
Appoint a Data Protection Officer (DPO) or Privacy Officer to oversee compliance and respond to breaches.
Provide mechanisms for consumers to view, delete, or modify their data.
Notify users promptly if their data is shared or used for a new purpose beyond what they initially consented to.
Avoid creating consumer profiles based on sensitive attributes such as religion, health, or political affiliation.
Implement policies to ensure marketing practices based on consumer data respect privacy and avoid causing emotional or reputational harm.
Develop and regularly test a data breach response plan.
🇺🇸 FTC vs. Data Broker Mobilewalla
The Federal Trade Commission (FTC) has initiated enforcement actions against Mobilewalla, a data broker, for its collection, sale, and misuse of consumer location data without proper consent. Mobilewalla's practices enabled tracking of individuals to sensitive locations, profiling based on personal characteristics, and the creation of audience segments that could lead to stigmatization and discrimination.
Key Allegations:
Mobilewalla collected location data from real-time bidding (RTB) exchanges and third-party data brokers.
Data included precise geolocation linked to mobile advertising identifiers (MAIDs) without proper consumer consent.
Mobilewalla created audience segments, including sensitive categories such as pregnant women, LGBTQ+ individuals, and religious attendees.
It also used geofencing to track specific groups and events, such as union organizers and attendees of political rallies.
Mobilewalla retained sensitive location data indefinitely, enabling clients to track individuals and infer private details such as medical conditions, political affiliations, and sexual orientation.
The company failed to verify whether data suppliers had obtained consumer consent for data collection.
Mobilewalla retained data from RTB exchanges even when it did not win advertising bids, violating exchange policies and FTC standards.
The FTC issued a Decision and Order requiring Mobilewalla to:
Cease and desist from collecting, using, or selling sensitive location data without clear, affirmative consumer consent.
Develop a comprehensive list of sensitive locations.
Prevent the use or disclosure of data associated with such locations.
Refrain from using location data to determine private residence addresses or for purposes to which consumers have not consented.
Verify supplier compliance with consent requirements.
Provide clear mechanisms for consumers to withdraw consent or request deletion of their data.
Establish strict timeframes for data deletion and prohibit indefinite retention.
Notable FTC Findings:
Mobilewalla’s practices likely caused substantial harm to consumers, including emotional distress, reputational harm, and increased risks of discrimination and physical violence.
The company’s lack of safeguards and indefinite data retention amplified privacy risks.
Recommendations for Businesses:
Clearly disclose data collection purposes and obtain explicit consumer consent.
Avoid using default or passive consent mechanisms.
Collect only essential data and avoid sensitive categories unless explicitly necessary.
Regularly audit data collection processes to ensure compliance.
Conduct regular assessments of data suppliers.
Establish clear contractual obligations for compliance with privacy laws.
Provide users with mechanisms to view, delete, or withdraw consent for data usage.
Notify consumers if their data is shared or used for new purposes.
Define clear retention periods for all collected data.
Avoid indefinite storage to mitigate long-term privacy risks.
Implement robust safeguards to prevent misuse or reidentification of sensitive data.
Prohibit targeting based on sensitive characteristics.
🇺🇸 CFPB Targets Data Brokers
The Consumer Financial Protection Bureau (CFPB) has proposed a rule aimed at stopping data brokers from selling sensitive personal information without consumer consent. The rule targets practices that expose individuals to risks like scams, stalking, and invasive surveillance. This move seeks to extend protections under the Fair Credit Reporting Act (FCRA) to data brokers, holding them accountable for activities that compromise consumer privacy and safety.
The Fair Credit Reporting Act (FCRA) is a U.S. law designed to ensure the accuracy, fairness, and privacy of consumer information collected by credit reporting agencies (CRAs). It governs how consumer data can be collected, used, and shared, with a strong emphasis on consumer consent and accuracy. CRAs must have a permissible purpose (e.g., credit evaluation, employment screening) before collecting or sharing an individual’s data. Data collected must be accurate, and consumers have the right to dispute inaccuracies in their records. Individuals can access and monitor their data through credit reports.
Data brokers typically operate outside the current scope of the FCRA. They aggregate and sell personal information—such as geolocation data, online activity, and consumer behavior—without the stringent oversight applied to CRAs. The CFPB's proposed rule aims to change this by applying FCRA-like requirements to data brokers, including obtaining explicit consumer consent before collecting or selling sensitive data, and collecting data only for legitimate, pre-approved purposes such as fraud prevention or verified business needs.
By extending FCRA-like standards, the CFPB’s rule would hold data brokers accountable for similar misconduct and protect consumers from privacy violations and potential harm.
Key points of the proposed rule:
Data brokers would need to obtain explicit consumer authorization before collecting or selling personal information.
Restrictions will apply to data revealing personal attributes such as financial behavior, geolocation, and browsing histories.
The rule aims to prevent misuse of data for stalking, fraud, or discriminatory profiling.
Brokers must ensure the accuracy of data and will face stricter compliance obligations to protect consumers.
Connection to Mobilewalla and Gravy Analytics cases:
The CFPB’s initiative aligns with enforcement actions by the FTC against Mobilewalla and Gravy Analytics, which exposed risks associated with data broker practices.
Recommendations for Businesses:
Conduct assessments of your data providers to ensure they comply with privacy laws and ethical data practices.
Include clauses requiring partners to obtain consumer consent and prohibit the misuse or resale of data.
Businesses in the data ecosystem should prepare for stricter compliance obligations under both CFPB and FTC scrutiny.
🇪🇺 Mozilla Faces Privacy Complaint Over Firefox Tracking Feature
The privacy advocacy group NOYB ("None of Your Business"), a non-profit founded by Max Schrems, filed a complaint with the Austrian Data Protection Authority, claiming that Mozilla (the maker of the Firefox browser) violated the General Data Protection Regulation (GDPR) by enabling a feature called "Privacy-Preserving Attribution" (PPA) without proper user consent or sufficient transparency.
PPA, introduced in Firefox version 128, tracks user interactions with online ads, such as views or clicks, and sends anonymized reports to advertisers. NOYB alleges the following key issues:
The PPA feature was turned on by default, requiring users to opt out instead of opting in.
Mozilla did not adequately inform users about the feature or include it in their privacy policy.
NOYB argues that Mozilla lacks a GDPR-compliant justification for processing personal data under PPA.
Why It Matters:
Mozilla has long been seen as a privacy-first company. This complaint raises questions about whether "privacy-preserving" technologies like PPA truly align with GDPR standards.
While PPA aims to reduce traditional tracking by processing data within the browser and adding privacy safeguards, it still involves data sharing, potentially contradicting GDPR's strict consent and transparency requirements.
GDPR prioritizes consent through opt-in mechanisms for personal data processing. By defaulting PPA to "on," Mozilla may have undermined user autonomy.
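The opt-in principle at the heart of the complaint can be shown with a minimal consent-gated feature flag. The names here (ConsentStore, "attribution") are hypothetical and have nothing to do with Mozilla's actual implementation; the point is only the default.

```python
# Minimal illustration of opt-in vs. opt-out defaults.

class ConsentStore:
    def __init__(self):
        self._consents: dict[str, bool] = {}

    def grant(self, feature: str) -> None:
        # Record an explicit, affirmative user choice.
        self._consents[feature] = True

    def has_consent(self, feature: str) -> bool:
        # Opt-in default: absent an explicit grant, the answer is "no".
        # An opt-out design would return True here by default -- the
        # pattern NOYB argues is incompatible with GDPR consent.
        return self._consents.get(feature, False)

store = ConsentStore()
assert not store.has_consent("attribution")  # off until the user opts in
store.grant("attribution")
assert store.has_consent("attribution")
```

The one-line difference between the two defaults is exactly what the complaint turns on: under the GDPR, silence or inaction cannot constitute consent.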
Stripe Updates Privacy Policy
Stripe's updated Privacy Policy, effective January 16, 2025, introduces clarifications regarding data usage and incorporates jurisdiction-specific provisions to align with evolving global regulations.
What was updated:
The policy provides more detailed explanations of how Stripe uses personal data to analyze, update, and improve its products, including those powered by machine learning. This includes using personal data to train models employed for fraud and loss prevention, as well as to assess the performance of Stripe's products.
Modifications made to comply with new regulations and guidance across different regions.
***
Direct your questions to groundcontrol@kepler.consulting.
Until the next transmission, stay secure and steady on course. Ground Control, out.