
Key takeaways from significant 2023 privacy enforcement cases

2023 saw several significant privacy enforcement cases. These cases are worth tracking because they offer insight into the areas privacy watchdogs focus on, the kinds of fines privacy violations can attract, and the changes and improvements regulators expect.


As we prepare for a new year of privacy compliance, let’s take a look at some landmark cases from 2023 whose takeaways can help us, as privacy and information governance professionals, make the best decisions for our companies.



Rite Aid | Federal Trade Commission | December 2023

Rite Aid has been banned from using AI facial recognition technology for five years due to its failure to implement proper safeguards for the technology.


Summary: After a complaint was filed in federal court, the FTC investigated Rite Aid, an American drugstore chain, and found that between 2012 and 2020 the company used AI-based facial recognition technology to help spot shoplifting and similar activity. However, the technology was deployed without proper safeguards: it led to customers being falsely accused and subjected to embarrassment and harassment, the company failed to inform customers that the technology was in use, and people of color were disproportionately targeted. The investigation found that:

  • Proper testing, documentation, assessment, and measurement did not occur prior to deploying the technology

  • The use of low-quality images was not prevented, increasing the chances of false-positive match alerts

  • Regular monitoring for accuracy was not conducted


Outcome: Rite Aid was banned from using facial recognition technology and was required to implement proper safeguards for biometric security and surveillance systems. Further, the company was ordered to:

  • Delete data collected by the facial recognition technology and direct third parties to do the same

  • Notify consumers when their biometric data is included in any database relating to security

  • Provide clear notice to consumers about the use of such technology

  • Introduce a data protection program for the personal data it collects


Key Takeaways: AI-based systems should be deployed with proper safeguards, testing, and documentation. Such systems should be regularly monitored to ensure their accuracy. Consumers should be notified when personal data is collected and processed.
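
To make the monitoring point concrete, here is a minimal TypeScript sketch of the kind of quality and accuracy gates such a system could enforce. Every name and threshold here is an illustrative assumption, not a detail from the FTC order.

```typescript
// Illustrative quality and accuracy gates for a face-matching pipeline.
// All names and thresholds are hypothetical, not taken from the FTC order.

interface MatchAlert {
  confidence: number;          // model's match confidence, 0..1
  imageQuality: number;        // input image quality score, 0..1
  confirmedByHuman?: boolean;  // set once staff have reviewed the alert
}

const MIN_IMAGE_QUALITY = 0.7;        // reject blurry or low-res inputs outright
const MIN_CONFIDENCE = 0.95;          // surface only high-confidence matches
const MAX_FALSE_POSITIVE_RATE = 0.05; // pause automated alerts above this

// Gate 1: never act on low-quality images, a failure mode the FTC
// specifically called out in the Rite Aid case.
function shouldSurfaceAlert(alert: MatchAlert): boolean {
  return alert.imageQuality >= MIN_IMAGE_QUALITY
      && alert.confidence >= MIN_CONFIDENCE;
}

// Gate 2: continuously measure accuracy against human review.
function falsePositiveRate(reviewed: MatchAlert[]): number {
  const decided = reviewed.filter(a => a.confirmedByHuman !== undefined);
  if (decided.length === 0) return 0;
  const wrong = decided.filter(a => a.confirmedByHuman === false).length;
  return wrong / decided.length;
}

// Halt automated alerts when measured accuracy drifts too low.
function shouldSuspendSystem(reviewed: MatchAlert[]): boolean {
  return falsePositiveRate(reviewed) > MAX_FALSE_POSITIVE_RATE;
}
```

The design choice worth noting is that the system fails closed: low-quality inputs are rejected up front, and automated alerts pause entirely once human review shows the false-positive rate drifting above the threshold.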




TikTok | Irish Data Protection Commission | September 2023

The Irish Data Protection Commission (DPC) imposed a €345 million fine on TikTok for their improper handling of children’s data.


Summary: Following requests from the Dutch and French DPAs, the DPC investigated TikTok’s processing of personal data between July 2020 and December 2020. It found that user accounts were set to ‘public’ by default, meaning that content posted by children could reach a larger audience than intended, and that TikTok failed to implement measures ensuring that only the personal information necessary for its processing was processed. The ‘Family Pairing’ feature, introduced to help parents monitor their children’s activity, lacked proper verification measures and therefore did not adequately protect children’s personal data. The DPC also found a lack of transparency on TikTok’s part, as the company failed to provide child users with information on the processing of their data, and other relevant information, in a manner they could understand.


Outcome: In addition to the fine, the DPC ordered TikTok to bring its processing activities into compliance within three months of notification of the decision. TikTok is to improve its age verification procedures, operate from a privacy-by-design perspective, and provide age-appropriate privacy policies.


Key Takeaways: Privacy by design should be implemented. Profile settings for users, especially children, should be set to ‘private’ by default. Privacy policies for technology and apps used by children should be written so that children can easily understand them.
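
As a rough illustration of ‘private by default,’ here is a short TypeScript sketch of how a platform might encode protective defaults at account creation. The type and field names are hypothetical rather than TikTok’s actual settings model.

```typescript
// Illustrative privacy-by-design defaults; the type and field names
// are hypothetical, not TikTok's actual settings model.

type Visibility = 'private' | 'friends' | 'public';

interface ProfileSettings {
  accountVisibility: Visibility;
  allowDirectMessages: boolean;
  allowDuetsAndRemixes: boolean;
}

// New accounts start with the most protective settings. Users may
// relax them later, but the platform never relaxes them silently.
function defaultSettings(ageYears: number): ProfileSettings {
  const isMinor = ageYears < 18;
  return {
    accountVisibility: 'private',   // private by default for everyone
    allowDirectMessages: !isMinor,  // stricter defaults for minors
    allowDuetsAndRemixes: !isMinor,
  };
}
```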




Microsoft Xbox | Federal Trade Commission | June 2023

Microsoft's Xbox Live was found in violation of COPPA for its collection and retention of children's data, for which it was fined $20 million.


Summary: Microsoft's online gaming network, Xbox Live, was found to have violated the Children's Online Privacy Protection Act (COPPA) through its methods of collecting children's personal information and the duration for which it was retained. The FTC found that Microsoft violated COPPA in the following ways:


  • Violation of direct notice: Microsoft collected data from users below the age of 13 as a requirement to set up an account, and only then informed their parents and asked for consent, violating COPPA's mandatory direct notice provision.

  • Violation of the online notice provision: Microsoft did not provide a privacy notice meeting the requirements of COPPA. Its notice left out important details about the kind of information collected, its purpose, and, most importantly, how parents could request that collection stop and that collected information be deleted.

  • Violation of retention requirements: The FTC also found that the company unnecessarily retained children's data for years.

Outcome: In addition to the $20 million fine, Microsoft must change its privacy practices to comply with COPPA. Further, if children's data is disclosed to video game publishers, Microsoft must indicate that the data belongs to a child so that the third party can also comply with COPPA.


Key Takeaways: Informed consent from parents must be obtained prior to the collection of any data from users under the age of 13. Comprehensive privacy notices must be made easily available, with provisions to opt out and to delete collected data. COPPA applies to online services in addition to websites and apps, and it defines personal information broadly, including avatars, biometrics, health data, and more.
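
To illustrate the first takeaway, here is a minimal TypeScript sketch of a signup flow that defers all further data collection until parental consent is obtained for under-13 users. The function and field names are hypothetical, not Microsoft's implementation.

```typescript
// Hypothetical COPPA-style signup gate: nothing beyond a birthdate and a
// parent's contact address is collected until parental consent is verified.

interface SignupRequest {
  birthdate: Date;
  parentEmail?: string; // collecting a parent's contact to seek consent is permitted
}

function ageInYears(birthdate: Date, now: Date = new Date()): number {
  return (now.getTime() - birthdate.getTime()) / (365.25 * 24 * 60 * 60 * 1000);
}

async function beginSignup(req: SignupRequest): Promise<'proceed' | 'awaitParent'> {
  if (ageInYears(req.birthdate) < 13) {
    // Direct notice goes to the parent BEFORE any further data collection;
    // the account stays in a pending state until consent is verified.
    await sendParentalConsentRequest(req);
    return 'awaitParent';
  }
  return 'proceed';
}

// Stub for illustration; a real flow would use a verifiable
// parental-consent mechanism accepted under COPPA.
async function sendParentalConsentRequest(req: SignupRequest): Promise<void> {}
```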




Meta | Irish Data Protection Commission | May 2023

The Irish Data Protection Commission (DPC) imposed a €1.2 billion fine on Meta Platforms Ireland for their unlawful transfers of personal data to the U.S.


Summary: The DPC’s inquiry into Meta, which began in 2020, found that transfers of personal data to its US counterpart, Meta Platforms, Inc., violated Article 46(1) of the GDPR. The transfers were made on the basis of the European Commission's 2021 Standard Contractual Clauses (SCCs) and were accompanied by a Transfer Impact Assessment.


The DPC found that while the transfers took place on the basis of the SCCs and additional measures, these arrangements were not sufficient to compensate for the inadequate protection provided by US law. The level of data protection in the US was found to be lacking compared to that provided by EU law, and Meta’s safeguarding measures were insufficient to bridge that gap.


Outcome: In addition to the €1.2 billion fine, Meta was ordered to change its processing operations to comply with Chapter V of the GDPR and to cease the unlawful processing and storage of personal data transferred from the EU/EEA to the US. Transfers of personal data to the US were ordered to be suspended.

Key Takeaways: When transferring data across borders, it is essential to consider the privacy laws and the level of data protection offered in both the source and the destination jurisdictions. There are often significant differences and nuances between the two, and all data transfers should comply with the privacy laws of both locations.




Amazon | FTC | May 2023

The FTC and DOJ found Amazon in violation of COPPA for retaining kids' Alexa voice recordings for longer than necessary and for not following through on parents' deletion requests.


Summary: The Department of Justice filed a complaint on behalf of the FTC, which found that voice recordings and geolocation data collected by the Alexa voice assistant were retained for years, longer than necessary, and used to improve Alexa's algorithms. Even when parents requested that this data be deleted, Amazon failed to do so effectively.


Outcome: In addition to the $25 million civil penalty, Amazon:


  • Is prohibited from using geolocation data, children's voice data, and other voice data subject to deletion requests for the creation or improvement of any product.

  • Must delete inactive Alexa accounts belonging to children.

  • Must inform users of its retention policies and practices.

  • Must refrain from misrepresenting its privacy practices in its policies.

Key Takeaways: Data should be retained only for as long as it is necessary. Users should be informed of the company's retention and deletion policies. Data subject requests, such as deletion requests, should be honored promptly and effectively.
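
As a sketch of what "retain only as long as necessary" can look like in practice, here is a minimal TypeScript retention-schedule check. The categories and retention periods are illustrative assumptions, not terms of the Amazon order.

```typescript
// Sketch of an automated retention schedule; the categories and periods
// are illustrative, not taken from the Amazon order.

interface StoredRecord {
  category: 'voiceRecording' | 'geolocation' | 'transcript';
  createdAt: Date;
  deletionRequested: boolean;
}

// Maximum retention per data category, in days.
const RETENTION_DAYS: Record<StoredRecord['category'], number> = {
  voiceRecording: 30,
  geolocation: 7,
  transcript: 90,
};

function isExpired(record: StoredRecord, now: Date = new Date()): boolean {
  if (record.deletionRequested) return true; // honor deletion requests immediately
  const ageDays = (now.getTime() - record.createdAt.getTime()) / 86_400_000;
  return ageDays > RETENTION_DAYS[record.category];
}

// Run on a schedule: purge anything past its retention period and anything
// a user asked to delete, including copies feeding training pipelines.
function recordsToPurge(records: StoredRecord[]): StoredRecord[] {
  return records.filter(r => isExpired(r));
}
```

Note that a deletion request trumps the schedule: the Amazon case turned precisely on deletion requests that were not honored across all copies of the data.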




BetterHelp | Federal Trade Commission | March 2023

For sharing sensitive mental health data with third parties for targeted advertising purposes, BetterHelp has been fined $7.8 million by the FTC.


Summary: Online mental health services company BetterHelp, based in California, was fined by the FTC for sharing the sensitive mental health data of its clients with third parties such as Facebook, Snapchat, Criteo, and Pinterest. Clients were made to fill out a questionnaire providing personal information such as their mental health history, experiences of depressive episodes, suicidal thoughts, and medications, along with their name, email, and similar details. Third parties like Facebook were instructed to use this data to identify similar individuals and advertise BetterHelp's counseling services to them. The sensitive data was collected without clients' consent, and BetterHelp's privacy practices were misrepresented to them.

  

Outcome: The proposed order by the FTC required BetterHelp to:


  • Obtain user consent prior to sharing personal information with third parties for any purpose. 

  • Implement a privacy program that protects consumer data. 

  • Have the third parties delete the personal data shared with them. 

  • Implement and follow a retention schedule. 


In addition to these requirements, BetterHelp was fined $7.8 million, which will be used to provide partial refunds to users who signed up for the company's services between August 1, 2017 and December 31, 2020.


Key Takeaways: Express consent must be obtained before sharing sensitive data with third parties. Privacy practices must not be misrepresented to users. Limits should be placed on how third parties may use shared data.
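
To show what a consent gate might look like in code, here is a minimal TypeScript sketch that blocks third-party disclosure unless an unrevoked consent record exists for the specific purpose. The purposes and record shape are illustrative assumptions, not BetterHelp's systems.

```typescript
// Hypothetical consent gate in front of any third-party disclosure;
// the purposes and record shape are illustrative.

type Purpose = 'advertising' | 'analytics' | 'serviceDelivery';

interface ConsentRecord {
  userId: string;
  purpose: Purpose;
  grantedAt: Date;
  revokedAt?: Date;
}

function hasActiveConsent(
  consents: ConsentRecord[],
  userId: string,
  purpose: Purpose,
): boolean {
  return consents.some(
    c => c.userId === userId && c.purpose === purpose && c.revokedAt === undefined,
  );
}

// Disclosure fails closed: without a matching, unrevoked consent record
// for this exact purpose, the data never leaves the company.
function shareWithThirdParty(
  consents: ConsentRecord[],
  userId: string,
  purpose: Purpose,
  send: () => void,
): boolean {
  if (!hasActiveConsent(consents, userId, purpose)) return false;
  send();
  return true;
}
```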




GoodRx | Federal Trade Commission | February 2023

The Federal Trade Commission imposed a fine of $1.5 million on GoodRx for sharing sensitive health information with companies like Facebook and Google and failing to disclose the same to consumers.


Summary: The California-based digital health platform GoodRx provides healthcare services and helps consumers find deals on prescription medication. The FTC found that GoodRx violated the FTC Act by:


  1. Sharing personal health information with Facebook and Google

  2. Using personal health information for targeted advertising purposes

  3. Failing to limit third-party use of personal health information

  4. Misrepresenting its HIPAA compliance

  5. Failing to implement policies to protect sensitive personal health information


Outcome: In addition to the $1.5 million penalty imposed on GoodRx, the FTC ordered that GoodRx:


  1. Cease the sharing of health data for advertising purposes

  2. Obtain affirmative consent before sharing health data

  3. Direct third parties to delete the health data that was shared

  4. Implement a data retention schedule

  5. Set up a privacy program


Key takeaways: Companies should refrain from sharing sensitive information, such as health data, with third parties. Health data must not be shared for advertising purposes, and consent should be obtained before sharing other categories of data.




TikTok | CNIL | January 2023

The French data protection authority (CNIL) imposed a €5 million fine on social media platform TikTok for unlawful cookie consent practices.


Summary: Following investigations carried out on TikTok’s website, the CNIL found that TikTok UK and TikTok Ireland had violated the French Data Protection Act by:


  • Not providing an easy and straightforward way to refuse cookies: while a single button allowed users to accept cookies immediately, refusing them required multiple clicks and steps

  • Not giving users sufficient information about the purposes of cookie collection

Outcome: Based on the breaches found, the number of people affected, and the CNIL’s previous communications on this issue, a fine of €5 million was imposed on TikTok.


Key Takeaways: Refusing cookies should be as easy and straightforward as accepting them. Information on the purpose of cookie collection should be provided explicitly.
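
As a closing illustration, here is a minimal TypeScript sketch of a cookie banner where refusal is exactly as easy as acceptance, the symmetry the CNIL required. The names and wording are illustrative, not TikTok’s actual implementation.

```typescript
// Minimal sketch of a symmetric cookie banner: "Refuse all" takes one
// click, exactly like "Accept all". Names and copy are illustrative.

type CookieChoice = 'acceptAll' | 'refuseAll';

function renderCookieBanner(onChoice: (choice: CookieChoice) => void): void {
  const banner = document.createElement('div');
  banner.textContent =
    'We use cookies for audience measurement and personalised ads. ';

  // Both buttons sit side by side with equal prominence, so refusing
  // takes no more effort than accepting.
  const options: [CookieChoice, string][] = [
    ['acceptAll', 'Accept all'],
    ['refuseAll', 'Refuse all'],
  ];
  for (const [choice, label] of options) {
    const btn = document.createElement('button');
    btn.textContent = label;
    btn.addEventListener('click', () => {
      onChoice(choice);
      banner.remove();
    });
    banner.appendChild(btn);
  }
  document.body.appendChild(banner);
}
```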


 

 
