
Children’s Privacy: Are we doing enough?

With the influx of technology into our daily lives, it has become difficult to keep track of what data we share and how it is being used. While much is being done for the privacy of individuals and their rights, there is comparatively less oversight and protection for children online.


A report presented at the 47th Session of the Human Rights Council states that children’s use of social media doubles between the ages of nine and 12. Yet while children’s access to social media and technology is growing rapidly, regulation and supervision of their data and privacy have not kept pace.


The children’s privacy space faces additional challenges because of a lack of awareness on the part of both children and their guardians. The same report states that about 80% of children living in developed Western countries have a digital footprint before they even turn two. Parents begin sharing their children’s information online without realising they are creating a digital footprint, which can start with something as simple as sharing the baby’s in-utero images.


Recently, Ireland's data privacy regulator imposed a record fine of $402 million on Instagram over its handling of children's data, which was not compliant with the parental consent requirement under Article 8 of the GDPR.


Instagram accounts of child users were set to “public” by default, thereby also making their social media content public. Users had to manually change it to “private” through the account privacy settings, which also violated the Data Protection by Design and by Default rule under Article 25 of the GDPR. Though the company plans to appeal the fine, this cannot be considered a one-off case.


In 2019, Google had to pay a fine of $170 million as part of a settlement with the FTC and New York’s attorney general for alleged violation of the Children's Online Privacy Protection Act (COPPA) Rule. The fine was imposed for YouTube’s illegal collection of personal information from children without proper notice or parental consent.

There is a need for stronger regulation in the children’s privacy area because of the group’s vulnerability to physical, mental and sexual abuse. Children can be exposed to all kinds of content online, which could amplify struggles such as body-image issues, self-harm and suicidal thoughts.


Recently, California lawmakers passed the California Age-Appropriate Design Code Act (AB 2273), which aims to create a safer digital space for kids under 18. The bill would apply to social apps like Instagram, TikTok, and YouTube as well as any business offering “an online service, product, or feature likely to be accessed by children.”

The act will have a wider scope and higher protection standards than COPPA, which applies when a business operates a website or online service directed to children, or has actual knowledge that it is collecting or maintaining personal information from a child. Additionally, AB 2273 will provide privacy protections to all users under 18, unlike COPPA, which applies only to children under 13.


At the federal level, an updated version of COPPA (referred to as COPPA 2.0) and the Kids Online Safety Act (KOSA) are still in the pipeline.


We need strict legislation that holds companies responsible for creating safe platforms for kids. Businesses are required to collect verified parental consent as soon as they have actual knowledge that kids are using their platform. Many websites and services claim they do not allow kids under a certain age, but kids can often find a way around the age restrictions and sign up anyway. In such cases, businesses cannot claim innocence; they should instead implement more stringent safeguards to keep underage users off their platforms.


Pending regulations like KOSA, which would require social media platforms to give minors options to protect their information, disable addictive product features, and opt out of algorithmic recommendations, while also enabling the strongest settings by default, are a step in the right direction for creating a safer online experience for kids.
