
2022 Data Privacy Fines, Lawsuits and Rulings Recap

2022 has been a pivotal year in the privacy and data governance world. With new regulations being introduced and existing ones strictly enforced, privacy professionals everywhere have had to work actively to keep their companies’ processes well within legal requirements.

Given that 2023 will see the introduction of new privacy laws, especially within the United States of America, there are important takeaways to consider from the nuances of legal decisions and rulings of past cases.

In 2022 we saw breaches of regulations in various forms, from storage limitation to lawful processing. Companies were investigated and fined by a number of different Data Protection Authorities, from the CNIL (the French DPA) to the Garante (the Italian DPA).

The following are some of the noteworthy lawsuits, fines, cases and rulings seen this calendar year.

1. Enel Energia | Garante | January 2022

Enel Energia was fined €26.5 million by the Italian DPA, the Garante, for numerous violations of the GDPR and the Personal Data Protection Code.

Summary: Following numerous complaints from recipients of unwanted promotional phone calls, and about missing or inadequate facilities for data subject access requests, among other issues, the Garante launched an investigation into Enel Energia spanning two years.

Outcome: The Garante found Enel Energia to have violated Articles 5(1)(a), 5(1)(d), 5(2), 6(1), 12, 13, 21, 24, 25(1), 30, and 31 of the GDPR and Articles 130(1), 130(2), and 130(4) of the Personal Data Protection Code. Among other failings, Enel Energia engaged in aggressive telemarketing techniques, lacked accountability for promotional phone calls, failed to respond to users’ requests, and did not cooperate with the supervisory authority.

Key Takeaways: Practices should be in place to protect and respect user privacy at every level: obtaining informed consent, adopting transparent, accountable and accurate practices, responding to user requests in a timely and appropriate manner, and cooperating with the supervisory authority.

2. Google | November 2022 Google to pay a $391 million settlement to 40 US states for misleading users about the tracking and collection of their location data

Summary: Following a 2018 report by the Associated Press, an investigation led by Oregon and Nebraska found that Google continued to track users’ locations even after they turned off ‘Location History’ in their settings. Data was still being collected by other services, such as ‘Maps’ and ‘Search’, and by other apps connected to the internet.

Outcome: Further to the fine settlement, Google is required to be more transparent and forthcoming with their processes by providing detailed information about location tracking on a dedicated web page. Additional information is to be provided on enabling and disabling location settings.

Key Takeaways: Users should be informed of the processes relating to the collection and processing of their personal data. Users’ privacy decisions should be respected, and companies must have measures in place to protect them.

3. Sephora | California OAG | August 2022 The California AG’s first formal complaint of non-compliance with the CCPA was against French retailer Sephora, which paid $1.2 million to resolve the claims.

Summary: The complaint was filed because Sephora allowed third parties to collect personal information about its users via cookies, which the OAG argued amounted to a ‘sale of information’ in breach of the CCPA. The company did not disclose its online and offline practices regarding the collection, use, sale, sharing and retention of personal information. It failed to post the ‘Do Not Sell My Personal Information’ link on its website and mobile apps for opt-out requests. Further, it did not process user requests to opt out of sale made via user-enabled global privacy controls.

Outcome: As part of the settlement, Sephora must 1. Disclose its privacy policy and clarify its sale of personal information 2. Provide mechanisms to opt out of the sale of PI, including via GPC 3. Ensure its service provider agreements conform to the CCPA 4. Conduct regular compliance assessments and report to the AG on its compliance measures.

Key Takeaways: Companies should ensure clear and defined contracts detailing the role of third parties wherever the selling of personal information is involved. The CCPA’s definition of ‘sale’, in Section 1798.140(t), reads: “‘Sell’, ‘selling’, ‘sale’ or ‘sold’ means selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing or by electronic or other means, a consumer’s personal information by the business to another business or third party for monetary or other valuable consideration.”

Note: Data sharing in exchange for analytics or ad serving is considered a “sale” under the CCPA. Measures should be in place to honour users’ opt-out requests, and the law requires companies to allow opt-out via a user-enabled GPC.
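The GPC signal referenced above is delivered to websites as a `Sec-GPC: 1` request header (and exposed to scripts as `navigator.globalPrivacyControl`). As a minimal sketch of what honouring it server-side could look like, the snippet below treats the header as an opt-out-of-sale request; the function and preference names are illustrative, not drawn from the settlement itself.

```python
# Sketch: honouring the Global Privacy Control (GPC) signal server-side.
# The GPC proposal specifies a "Sec-GPC: 1" request header; the function
# and preference names below are illustrative assumptions.

def gpc_opt_out_requested(headers: dict) -> bool:
    """Return True when the request carries a GPC opt-out signal."""
    return headers.get("Sec-GPC", "").strip() == "1"

def handle_request(headers: dict, user_prefs: dict) -> dict:
    # Treat the GPC header as a valid opt-out-of-sale request even if
    # the user never clicked a "Do Not Sell" link.
    if gpc_opt_out_requested(headers):
        user_prefs["sale_opt_out"] = True
    return user_prefs

prefs = handle_request({"Sec-GPC": "1"}, {"sale_opt_out": False})
print(prefs["sale_opt_out"])  # True
```

The key design point, reflected in the Sephora matter, is that the signal is honoured automatically, without requiring any additional action from the user.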

4. Kochava Inc. | Federal Trade Commission | August 2022 The FTC sues Kochava Inc. for selling users’ geolocation data, tracking their movements to and from sensitive locations

Summary: The FTC alleges that the Idaho-based company’s handling of consumer data allows purchasers to ‘identify and track specific mobile device users’ and that it lacks adequate measures to protect the data from public exposure. Users’ locations were tracked to and from sensitive locations such as reproductive health clinics, places of worship, homeless and domestic violence shelters, and addiction recovery facilities.

Outcome: The outcome of the lawsuit will be updated here as proceedings take place.

Key Takeaways: Sensitive consumer data, including geolocation and health data, should be handled with the special care specified by privacy regulations.

5. Clearview AI Inc | ICO | 2022 The British independent data protection authority, the Information Commissioner’s Office, fined the American facial recognition company Clearview AI Inc for failing to comply with the nation’s data protection laws.

Summary: The facial recognition company collected images of people’s faces as well as publicly available information from the internet and social media platforms to create their online database. This was done without the knowledge or consent of the individuals.  

Outcome: The ICO found that Clearview AI - Did not use data in ways that were fair or transparent - Could not provide a lawful basis for collecting people’s data - Did not have processes in place to stop the information being retained indefinitely - Failed to meet the data protection standards required by the GDPR for biometric data. The company was fined £7.5 million and ordered to stop collecting and using publicly available personal data and to delete the data of UK residents from its systems.

Key Takeaways: Personal data cannot be collected unlawfully, and it may be retained only for the period permitted by regulation. Further, special category data such as biometric data must be handled and protected in accordance with the regulations.

Note: Clearview AI has also been fined by the CNIL and the Garante for non-compliance with data protection regulations.

6. Meta Platforms Ireland Limited | Ireland’s DPC | March 2022 Ireland’s DPC imposed a fine of €17 million on Meta Platforms Ireland Limited (previously Facebook Ireland Limited) for failing to implement measures to protect EU users’ data.

Summary: The Data Protection Commission launched an investigation into Meta Platforms following complaints of data breaches. Since the issue involved people in various EU jurisdictions, the matter was treated as cross-border processing of personal data.

Outcome: The investigation concluded that Meta Platforms failed to implement appropriate technical and organizational measures, and could not demonstrate the security measures in place to protect EU users’ data in connection with the personal data breaches. The company was found to have infringed Articles 5(2) and 24(1) of the GDPR.

Key Takeaways: Personal data should be collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes. Further, measures must be implemented to protect personal data from any and all threats throughout its lifecycle.

7. REWE International | Austrian DPA | January 2022 REWE International was fined €8 million by the Austrian DPA for the careless handling of customer data by the company’s customer loyalty and rewards program, jö Bonus Club.

Summary: REWE International’s subsidiary, jö Bonus Club breached the GDPR by collecting user data without their consent and using it for marketing purposes.

Outcome: REWE plans to challenge the decision, arguing that jö Bonus Club operates independently as a separate subsidiary, Unser Ö-Bonus Club, and that it, not REWE International, should therefore have been fined. Further, REWE states that jö Bonus Club has not passed any data to the parent company, so the parent cannot be held responsible for the mishandling of customer data. Legal experts, however, doubt the appeal’s chances for a number of reasons, namely the gravity of the violation, the lack of security measures, and the definition of a data controller in Article 4 of the GDPR.

Key Takeaways: The decision to fine REWE International highlights that parent companies are responsible for the data privacy of their subsidiaries’ customers, even when a subsidiary operates entirely on its own.

8. Google LLC | Spanish DPA | May 2022

Following an investigation, the Spanish DPA, the AEPD, imposed a fine of €10 million on Google for violating the GDPR by transferring data to a third party without a legitimate basis and hindering consumers’ right to erasure.

Summary: Google transferred content removal requests from its various platforms, such as Google Search and YouTube, to a third party, the Lumen Project. These included the requester’s identity, email address, the reasons stated, and the URL claimed. Further, the forms used to submit content removal requests did not provide any facility to exercise the right to erase personal data or to object to its transfer.

Outcome: The investigation found that 1. Requesters were not informed of the transfer of their information to the Lumen Project, and the AEPD rejected Google’s ‘legitimate interest’ claim, especially given Google’s privacy policy, which states that the company does not share information outside Google without consent 2. The company did not provide facilities to exercise the right to erase personal data 3. The company did not provide facilities to object to the transfer of personal data. Google was fined and ordered to comply with the privacy regulations and to delete the personal data users had requested be removed.

Key Takeaways: Data transfers to third parties cannot occur without a legitimate basis and proper informed consent. Adequate facilities must be provided to object to the transfer of personal data to a third party and to exercise the right to erase information from third-party databases.

9. INFOGREFFE | CNIL | 2022 The French data protection authority, the Commission nationale de l'informatique et des libertés, fined the economic interest group INFOGREFFE for non-compliance with the GDPR in relation to the retention and security of data.

Summary: The INFOGREFFE website stated that personal information collected would be kept for 36 months from the last order for a service and/or document. Upon investigation, however, this was found not to be the case for 25% of users. Manual anonymization, carried out only on individual request, covered a small number of accounts. Further, sufficient measures were not in place to guarantee the security of members’ data.

Outcome: Following a complaint and an investigation by the CNIL, the authority fined INFOGREFFE €250,000 for - Failing to comply with the obligation to keep data for a period proportionate to the purpose of the processing - Failing to comply with the obligation to ensure the security of personal data.

Key Takeaways: Data should be retained only for the period of time proportionate to the purpose of processing and appropriate measures must be followed to ensure the security of personal data. 
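A stated retention window like INFOGREFFE’s 36 months is only meaningful if it is enforced automatically rather than on individual request. As a minimal sketch, assuming a simple mapping of accounts to last-order dates (the names and the approximation of 36 months as days are illustrative):

```python
from datetime import datetime, timedelta

# Sketch: enforcing a stated retention window (INFOGREFFE advertised 36
# months from the last order). The structures and the day-based
# approximation of 36 months are illustrative assumptions.

RETENTION = timedelta(days=36 * 30)  # roughly 36 months

def is_expired(last_order: datetime, now: datetime) -> bool:
    """True when an account's data has outlived the retention period."""
    return now - last_order > RETENTION

def accounts_to_purge(accounts: dict, now: datetime) -> list:
    """Accounts whose data should be erased or anonymized automatically."""
    return [uid for uid, last in accounts.items() if is_expired(last, now)]

now = datetime(2022, 6, 1)
accounts = {
    "a1": datetime(2018, 1, 15),  # well past 36 months -> purge
    "a2": datetime(2021, 9, 1),   # within the window -> keep
}
print(accounts_to_purge(accounts, now))  # ['a1']
```

Running such a sweep on a schedule, rather than waiting for user requests, is what keeps the advertised policy and the actual data in agreement.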

10. Cosmote and OTE | Hellenic Data Protection Authority | January 2022 Cosmote failed to justify the retention of all traffic data for a period of three months.

Summary: Telecom operator Cosmote retained all customer call data for 90 days; the data was then anonymized and retained for a further year in order to improve the company’s services. The Hellenic DPA’s investigation found that the data analytics purposes could also have been fulfilled using anonymized data. While Cosmote claimed the data was anonymized, the HDPA found it was actually pseudonymized. The company failed to be transparent about the reason for retaining the data and the purpose of the processing, and lacked sufficient measures to protect the data. In addition, Cosmote and OTE were found to have undertaken security measures without any specific data processing agreement between the two.

Outcome: The HDPA ruled that Cosmote’s data processing violated the data minimization and storage limitation principles of the GDPR and the provisions of Article 6 of the Greek E-Privacy Law 3471/2006. As a result, Cosmote and its parent OTE were fined €9.25 million (the largest fine in Greece). Cosmote was also ordered to stop further illegal data processing and destroy the collected data.

Key Takeaways: Personal data should be held only for as long as necessary in relation to the purposes for which it is being processed and shall be processed in a manner that ensures appropriate security of the personal data.
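The distinction the HDPA drew between anonymization and pseudonymization is worth making concrete. A common pseudonymization technique, sketched below with an illustrative key and field name (not taken from the ruling), is keyed hashing: each phone number maps to a stable pseudonym, so records stay linkable per subscriber and anyone holding the key can re-identify them, which is why such data remains personal data under the GDPR.

```python
import hashlib
import hmac

# Sketch: keyed hashing is pseudonymization, not anonymization. The key
# and the phone numbers below are illustrative assumptions.

SECRET_KEY = b"rotate-and-guard-this-key"

def pseudonymize(msisdn: str) -> str:
    """Replace a phone number with a stable keyed hash (a pseudonym)."""
    return hmac.new(SECRET_KEY, msisdn.encode(), hashlib.sha256).hexdigest()

# The same input always yields the same pseudonym, so call records remain
# linkable per subscriber -- useful for analytics, but still personal data.
a = pseudonymize("+301234567890")
b = pseudonymize("+301234567890")
print(a == b)  # True: records are linkable, hence only pseudonymized
```

True anonymization would require breaking that linkability (for example, aggregation), at which point the analytics purpose the HDPA pointed to could be served without retaining personal data at all.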

11. CafePress | Federal Trade Commission | March 2022 CafePress failed to safeguard the sensitive personal data of its consumers and covered up a breach.

Summary: CafePress suffered a data breach in February 2019, exposing email addresses and passwords, unencrypted names, physical addresses and security questions and answers, unencrypted Social Security numbers, and partial payment card numbers with their expiration dates. After being notified of the vulnerability and the breach, the company patched the vulnerability but did not investigate the incident for some time, despite additional warnings. In April 2019 it was warned by a foreign government about the breach and asked to notify the affected customers. CafePress did not inform the affected customers until September 2019.

In addition, it used consumer email addresses for marketing without appropriate disclosure of the purposes for which they would be used.

Outcome: The FTC ruled that CafePress failed to employ the necessary security controls to protect the sensitive information of buyers and sellers stored on its network. The company also stored Social Security numbers and password reset answers in clear, readable text and retained the data for longer than necessary. It also didn’t apply readily available protections and patches to well-known vulnerabilities. These practices led to multiple data breaches.

Key takeaways: 1. Delete personal data once the purpose for which it was collected has been fulfilled 2. Notify affected customers of a data breach in a timely manner 3. Store personal data in encrypted form, following current encryption standards 4. Have an incident response plan in place 5. Address security vulnerabilities on a regular basis
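One of the FTC’s specific findings was that CafePress stored password-reset answers and other secrets in clear, readable text. A minimal sketch of the standard alternative, salted key-stretched hashing, follows; the iteration count and function names are illustrative choices, not a statement of what the order requires.

```python
import hashlib
import hmac
import os

# Sketch: storing passwords/security answers as salted, stretched hashes
# instead of clear text (one of the failures in the CafePress case).
# Parameter choices are illustrative, not a compliance recommendation.

ITERATIONS = 200_000

def hash_secret(secret: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); only these are persisted, never the secret."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, ITERATIONS)
    return salt, digest

def verify_secret(secret: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_secret("correct horse battery staple")
print(verify_secret("correct horse battery staple", salt, digest))  # True
print(verify_secret("wrong guess", salt, digest))  # False
```

Because only the salt and digest are stored, a breach of the database does not directly expose the underlying answers, which is exactly the exposure the FTC faulted.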

12. ITMedia Solutions LLC | Federal Trade Commission | January 2022 ITMedia Solutions settled with the FTC for $1.5 million for misusing and sharing consumer financial data

Summary: The California-based lead-generation company, which operates hundreds of personal loan and payday loan websites, was found in violation of the Fair Credit Reporting Act and Section 5 of the FTC Act. While claiming to connect consumers with lenders as they completed online loan applications, it collected and sold consumers’ personal information, including Social Security numbers and bank account information, without informed consent.

Outcome: As part of the settlement, the company 1. Is prohibited from making misleading statements to customers, especially about how their personal information will be used 2. Is prohibited from selling personal information except under specific circumstances 3. Is required to screen the recipients of any personal information it shares.

Key takeaways: Companies are required to be forthcoming and transparent about their practices. Misleading or deceptive statements about privacy practices are prohibited and constitute a breach of privacy regulations.

13. Controller | Belgian DPA | November 2022 The Belgian DPA ordered an employer to erase the data of a former employee after she filed a complaint.

Summary: More than 6 months after her dismissal, a former employee (data subject) made a request to her former employer (controller) to have her picture and details removed from the company website. When the company did not respond to her request, she filed a complaint with the DPA.

Outcome: The DPA found that her data remained on the website for 7 months (between the dismissal and the filing of the complaint), which it deemed excessive.

The DPA found that the processing of the data subject’s data was no longer necessary after her dismissal, so the data had to be erased, and this should have happened on the controller’s own initiative. The DPA stated that when an employee leaves a job, information such as the individual’s identity, function and photographs should be erased from the controller’s websites and social media pages.

Additionally, processes should be kept in place to address requests made by data subjects on such matters and should, at the very least, effectively follow up on claims within the required time period.

The DPA issued a warning against the controller to implement the appropriate changes in order to avoid violating the GDPR on such matters in the future.

Key Takeaways: Processes should be implemented to erase staff members’ data after they leave the controller, to address erasure requests within the required time frame, and to follow up on claims made by data subjects.

14. Microsoft Ireland | CNIL | December 2022 Microsoft’s search engine Bing was fined €60 million by CNIL for imposing advertising cookies on users

Summary: Following a complaint and several investigations by the French DPA, the CNIL, Microsoft’s Bing was found in violation of privacy regulations for imposing cookies on users. When users visited the site, cookies were deposited on their devices without their consent and used for, among other things, advertising purposes. Further, there was no button allowing users to refuse the cookies as easily as they could accept them.

Outcome: The CNIL found the company in violation for

1. Depositing cookies without prior consent 2. Failing to offer an equivalent solution to refuse or disable the cookies. A further periodic penalty payment was ordered to secure compliance for individuals residing in France.

Key Takeaways: Consent should be obtained from users before cookies are deposited. Further, no dark patterns should be employed in ad tech; it should be as easy to refuse cookies as it is to accept them.
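The two requirements from the Bing decision, prior consent and a refuse option as easy as accept, can be sketched as simple gating logic. The function names, cookie names, and the session cookie classed as strictly necessary are illustrative assumptions, not details from the ruling.

```python
# Sketch: deposit advertising cookies only after explicit prior consent,
# with "refuse" handled exactly like "accept" (one action each, refusal
# as the default when no choice is made). All names are illustrative.

def consent_banner_choice(choice: str) -> dict:
    """Both banner buttons take a single click; anything but an explicit
    'accept' is treated as a refusal."""
    return {"ads_consent": choice == "accept"}

def cookies_to_set(prefs: dict) -> list:
    cookies = ["session_id"]            # strictly necessary: always allowed
    if prefs.get("ads_consent"):
        cookies.append("ad_tracking")   # deposited only after prior consent
    return cookies

print(cookies_to_set(consent_banner_choice("refuse")))  # ['session_id']
print(cookies_to_set(consent_banner_choice("accept")))  # ['session_id', 'ad_tracking']
```

The symmetry is the point: refusing requires no extra steps, menus, or defaults to undo compared with accepting.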

15. Epic Games | FTC | December 2022 Fortnite maker Epic Games was fined $520 million for violating children’s privacy and for deceptive item shop billing practices

Summary: The Federal Trade Commission found Epic Games in violation of the Children’s Online Privacy Protection Act (COPPA) and found that it employed dark patterns to trick users into making unintentional purchases. Further, the FTC addressed Epic Games’ live text and voice communication features, which exposed children to online harassment and abuse.

Outcome: The $520 million fine was divided into two settlements: a $275 million penalty for the COPPA violations and a $245 million settlement for ‘dark patterns and billing practices’.

Key Takeaways: Processes employed should be clear, informed and transparent. Dark patterns should be avoided at all costs. Strong practices should be put in place to safeguard the rights of children.

