Kids’ Privacy and Cookies: Special Rules

The digital landscape for children has fundamentally transformed over the past two decades, creating an urgent need for specialized regulatory frameworks that recognize children’s unique vulnerability to online tracking and data exploitation. In 2025, the rules governing children’s online privacy, particularly as they relate to cookies and tracking technologies, underwent their most significant evolution since the original Children’s Online Privacy Protection Act took effect in 2000. New requirements effective June 23, 2025 establish stricter controls on data collection, mandatory parental consent for third-party sharing, expanded definitions of personal information that now include biometric identifiers, and heightened transparency obligations that fundamentally alter how platforms must operate when children access their services. This analysis examines the legal frameworks, technical implementations, enforcement trends, and practical challenges that define children’s online privacy protection in the context of tracking cookies and cookie control mechanisms across the United States, European Union, United Kingdom, and other jurisdictions.

Evolution and Current Framework of COPPA and American Children’s Privacy Law

The Children’s Online Privacy Protection Act represents a cornerstone of American children’s privacy law, establishing a comprehensive framework that treats children fundamentally differently from adults in the digital ecosystem. Enacted by Congress in 1998 and initially implemented through an FTC rule that took effect on April 21, 2000, COPPA created the first major federal privacy law specifically designed to protect a defined class of internet users, recognizing that children under thirteen years of age require specialized protections due to their cognitive development, limited ability to understand privacy implications, and vulnerability to manipulation by sophisticated digital technologies. The foundational principle underlying COPPA reverses the typical internet default state: rather than allowing companies to collect and use personal information from anyone unless that individual objects, COPPA presumes that children cannot consent to data collection and requires operators to obtain verifiable parental consent before any collection, use, or disclosure of personal information from children under thirteen.

The first major update to COPPA came in 2013 when the Federal Trade Commission revised the rule to address evolving internet practices and business models that had emerged over the preceding thirteen years. These 2013 amendments introduced more stringent parental notice and consent requirements, expanded the definition of personal information to include persistent identifiers like cookies and IP addresses, and established additional obligations regarding data minimization, security, and deletion. However, since the 2013 revision, technological capabilities and business practices have evolved dramatically, with children’s online experience increasingly mediated by sophisticated tracking technologies, artificial intelligence systems, behavioral profiling algorithms, and data monetization schemes that the original rule could not have contemplated. Social media platforms, streaming services, gaming applications, educational technology providers, and general audience websites have become ubiquitous in children’s lives, and many of these services deploy tracking cookies and other surveillance mechanisms that collect extensive behavioral data about children without meaningful parental oversight.

On January 16, 2025, the Federal Trade Commission unanimously voted to adopt comprehensive amendments to the COPPA rule, representing the first major overhaul in twelve years. The amendments were published in the Federal Register on April 22, 2025, and became effective on June 23, 2025, with full compliance required by April 22, 2026, giving operators a transition period of roughly one year to bring their systems and practices into alignment with the updated requirements. These amendments reflect growing regulatory concern about the monetization of children’s data, sophisticated tracking mechanisms deployed against minors, inadequate age verification practices, and the cumulative risks posed by behavioral profiling, algorithmic manipulation, and third-party data sharing. The changes are particularly significant because they represent a fundamental shift in regulatory philosophy: whereas previous COPPA enforcement primarily focused on ensuring that operators obtained parental consent before initial data collection, the 2025 amendments establish affirmative obligations on operators to limit what data they collect in the first place, restrict how they can use that data, minimize retention periods, and obtain separate consent for disclosures to third parties.

Expanded Definition of Personal Information Under Updated COPPA

One of the most consequential changes in the 2025 COPPA amendments involves a substantially expanded definition of what constitutes “personal information” subject to the rule’s protections. Under the updated rule, personal information now explicitly includes biometric identifiers that can be used for automated or semi-automated recognition, encompassing fingerprints, handprints, retina patterns, iris patterns, genetic data including DNA sequences, voiceprints, gait patterns, facial templates, and faceprints. This expansion reflects the increasing integration of biometric technologies into children’s digital experiences, from fingerprint authentication on mobile devices to facial recognition in gaming applications and virtual reality environments. The FTC did narrow its original proposal by declining to include the broader language of “data derived from voice data, gait data, or facial data,” responding to industry concerns about overreach while still addressing the core biometric identifiers that pose meaningful privacy risks to children.

Beyond biometric identifiers, the updated rule clarifies that personal information includes state identification cards, birth certificates, and passport numbers in addition to Social Security numbers, eliminating uncertainty about which government identifiers trigger COPPA compliance requirements. The rule now explicitly addresses mobile telephone numbers as “online contact information” when operators use them to send text messages to parents in connection with obtaining parental consent, reflecting the reality that text messaging has become the primary communication method for many families and represents a more accessible verification mechanism than traditional methods. Most significantly for the context of tracking cookies, the definition continues to include persistent identifiers, which remain among the most commonly used tracking mechanisms deployed across children’s online services, yet the 2025 amendments establish new limitations on how operators can utilize this exception for “support for internal operations.”

Cookie Collection and Tracking in Child-Directed Services

Tracking cookies occupy a peculiar and problematic position within the COPPA framework because they simultaneously serve legitimate operational functions—such as maintaining session state, remembering user preferences, and providing basic analytics to improve service quality—and enable sophisticated surveillance mechanisms that operators use to build behavioral profiles, enable targeted advertising, and monetize children’s attention and data. The original COPPA rule, recognizing this tension, created an exception allowing operators to collect persistent identifiers without obtaining verifiable parental consent if two conditions were met: first, the operator collected no other personal information, and second, the persistent identifier was used solely for “support for the internal operations” of the website or service. This exception made practical sense because it would be nearly impossible to operate an online service without some mechanism to track sessions and maintain basic functional state across page loads.

However, the “internal operations” exception became subject to increasingly broad interpretation and abuse by operators who claimed that nearly any use of persistent identifiers fell within this exception, extending it to cover behavioral advertising, profiling, personalization recommendations, and other uses that clearly exceeded genuine operational necessity. The FTC noted in enforcement actions that the exception was “too broad” and created opportunities for companies to categorize data collection as necessary for internal operations when this was clearly not the case. The 2025 amendments address this problem by maintaining the internal operations exception but adding new transparency requirements: operators relying on this exception must now provide users with a notice specifying the internal operations for which data is collected and disclosing policies and practices to ensure persistent identifiers are not used for unauthorized purposes such as behavioral advertising.
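
To make the boundary concrete, here is a minimal TypeScript sketch of how an operator might encode the internal-operations exception as an explicit allow-list paired with the disclosure the 2025 amendments require. The purpose names and notice fields are illustrative, not terms drawn from the rule text.

```typescript
// Illustrative allow-list of purposes the "support for internal
// operations" exception might cover, versus uses that always require
// verifiable parental consent. Purpose names are hypothetical.
const INTERNAL_OPERATIONS = new Set<string>([
  "session-management", // keep a child signed in during a single visit
  "frequency-capping",  // limit how often the same ad is shown
  "security",           // fraud and abuse prevention
  "site-analytics",     // aggregate, non-profiling usage metrics
]);

function mayUsePersistentIdentifier(purpose: string): boolean {
  // Behavioral advertising and profiling can never ride on this exception.
  return INTERNAL_OPERATIONS.has(purpose);
}

// The 2025 amendments also require a notice disclosing these purposes
// and the safeguards against unauthorized use:
const internalOperationsNotice = {
  purposes: Array.from(INTERNAL_OPERATIONS),
  safeguards:
    "Persistent identifiers are never used for behavioral advertising.",
};

console.log(mayUsePersistentIdentifier("behavioral-advertising")); // false
```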

Under COPPA’s 2025 framework, cookies used to track children across multiple websites for targeted advertising purposes require explicit parental consent and cannot be justified under the internal operations exception. When websites deploy third-party cookies from advertising networks, analytics providers, or other external services, those third parties become subject to COPPA’s requirements if they have “actual knowledge” that the data they are collecting comes from children under thirteen. The rule defines “actual knowledge” to include circumstances where a child-directed content provider directly communicates the child-directed nature of its content to the advertising network or where a representative of the ad network recognizes the child-directed nature of the content through reasonable observation. This places significant responsibility on advertising networks and other third-party service providers to implement reasonable filtering mechanisms to identify and exclude children’s data from profiling systems that are designed for adult users.

Parental Consent Mechanisms and Verification Methods

One of the most practically challenging aspects of COPPA compliance involves obtaining verifiable parental consent, a requirement that appears straightforward on its face but becomes extraordinarily complex in implementation, particularly at scale for consumer internet services serving millions of children globally. COPPA does not mandate specific methods for obtaining parental consent; instead, the rule requires operators to choose methods “reasonably designed in light of available technology” to ensure that the person giving consent is actually the child’s parent, not the child themselves or some other unauthorized person. This flexible approach was intended to allow innovation in consent mechanisms while establishing a standard against which the adequacy of any chosen method could be evaluated.

The original COPPA rule identified several acceptable methods for obtaining verifiable parental consent, including signed written consent received via postal mail or facsimile, verification through a credit card transaction, verification through access to credit card statements, verification through other payment systems, and verification through toll-free telephone numbers staffed by trained personnel. However, these methods created significant friction in the consent process, particularly the requirements for postal mail or credit card-based verification. Parents found these mechanisms cumbersome and inconvenient, and operators had limited incentive to streamline processes that effectively gatekept children’s access to services.

The 2025 amendments substantially expand acceptable parental consent methods to include knowledge-based authentication through dynamic, multiple-choice questions that are difficult for a child to answer, submission of government-issued photo identification, and text messaging coupled with additional verification steps such as follow-up texts, letters, or phone calls to confirm that the consenting individual is the parent. These new methods represent a significant modernization of the COPPA framework and reflect the FTC’s recognition that more efficient verification mechanisms can simultaneously improve parental involvement while reducing friction that might discourage operators from seeking consent in the first place. Knowledge-based authentication presents particular advantages because it requires operators to maintain background information about parents (such as previous addresses or vehicle information) that the child is unlikely to possess, and the questions can be randomized and dynamically generated to prevent children from attempting to circumvent the system. Text-plus verification methods that combine text messaging with additional confirmation steps leverage the ubiquity of mobile phones among parents while adding additional security through multi-step verification.
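
As a rough illustration of the knowledge-based authentication approach, the following sketch generates a dynamic multiple-choice question from “out-of-wallet” facts about the parent. The data shapes, field names, and question wording are hypothetical; a production system would source this data from an identity-verification vendor.

```typescript
// Hypothetical knowledge-based authentication (KBA) sketch: build a
// dynamic multiple-choice question from facts about the parent that a
// child is unlikely to know. All shapes here are illustrative.
interface ParentRecord {
  previousStreets: string[]; // e.g., sourced from an identity-data vendor
}

interface KbaQuestion {
  prompt: string;
  choices: string[];  // one correct answer shuffled among decoys
  answerIndex: number;
}

function shuffle<T>(items: T[]): T[] {
  const a = [...items];
  for (let i = a.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1)); // Fisher-Yates shuffle
    [a[i], a[j]] = [a[j], a[i]];
  }
  return a;
}

function buildQuestion(record: ParentRecord, decoyStreets: string[]): KbaQuestion {
  const correct = record.previousStreets[0];
  const choices = shuffle([correct, ...decoyStreets.slice(0, 3)]);
  return {
    prompt: "Which of these streets have you previously lived on?",
    choices,
    answerIndex: choices.indexOf(correct),
  };
}
```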

Despite these improvements, significant practical challenges remain in implementing effective parental consent mechanisms at scale. Age verification systems that must distinguish between children and adults often rely on self-reported age information, which children can easily falsify by entering a birth date that makes them appear to be adults. Once a child has successfully created an account by misrepresenting their age, platforms must determine how to remediate the situation: whether to delete all data collected while the false account was active, lock the account, allow continued use under COPPA protections, or take some other approach. Consent management platforms that handle the technical implementation of consent collection must maintain detailed logs of verification attempts, failed verifications, and successful consent grants, creating substantial data storage and management obligations. Parents who have previously given consent may revoke it at any time, requiring platforms to cease further collection and delete the child’s previously collected data upon request, and parents may not always inform the platform when a child turns thirteen or when other circumstances alter COPPA’s applicability.

Age Verification and the Detection of Child Users

Age verification and age gating represent foundational technical mechanisms for COPPA compliance, yet they remain remarkably imperfect, with child users routinely circumventing age gates through simple falsification of birth dates or through secondary account creation using parental login credentials. COPPA does not mandate that operators implement age verification, recognizing that age verification technologies are not foolproof and that creating additional barriers to service access may have secondary consequences for children’s legitimate use of beneficial online services. However, operators that fail to implement reasonable age verification mechanisms may face allegations that they knowingly allowed children to use services designed for adults without obtaining parental consent, triggering COPPA’s “actual knowledge” standard.
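
A neutral age screen is typically the first line of defense. The sketch below, with hypothetical function names, computes age from an entered birth date without hinting that answers under thirteen change the outcome, and flags the session so a child cannot simply press back and enter a new date.

```typescript
// Hypothetical neutral age screen: compute age from an entered birth
// date without revealing that answers under thirteen change anything,
// and mark the session so an immediate retry with a new date is caught.
function yearsSince(birthDate: Date, now: Date = new Date()): number {
  const raw = now.getFullYear() - birthDate.getFullYear();
  const birthdayPassed =
    now.getMonth() > birthDate.getMonth() ||
    (now.getMonth() === birthDate.getMonth() &&
      now.getDate() >= birthDate.getDate());
  return birthdayPassed ? raw : raw - 1;
}

function handleAgeGate(birthDate: Date): "standard-flow" | "coppa-flow" {
  sessionStorage.setItem("ageGateCompleted", "true"); // deter back-button retries
  return yearsSince(birthDate) < 13
    ? "coppa-flow"      // route into parental-consent onboarding
    : "standard-flow";  // general-audience experience
}
```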

The recent enforcement action against TikTok illuminates the consequences of inadequate age verification. The FTC alleged that TikTok knowingly created accounts for children and collected extensive personal data from millions of children under thirteen without verifiable parental consent, and that TikTok had deliberately constructed “back doors” that allowed users to bypass the age gate altogether by using login credentials from third-party services such as Instagram and Google. By allowing users to authenticate through these external services, TikTok effectively eliminated the age verification hurdle and allowed children to create regular accounts outside of any kid-specific mode that might have provided additional protections. The complaint alleged that this created millions of “unknown user” accounts belonging to children who then gained access to adult content and features of the general TikTok platform while TikTok collected and maintained vast amounts of personal information from these children without parental consent. This case demonstrates that merely implementing an age gate is insufficient; operators must ensure that the age verification mechanism cannot be easily circumvented through alternative authentication pathways or that sufficient alternative verification occurs through secondary means.

Data Minimization, Retention, and Deletion Requirements

The principle of data minimization—the requirement that operators collect only the personal information actually necessary to fulfill the specific purpose for which it is collected—lies at the heart of COPPA and achieves heightened importance in the 2025 amendments. COPPA explicitly prohibits operators from conditioning a child’s participation in any online activity on the collection of more personal information than is reasonably necessary to participate in that activity, a provision that applies to children’s services generally but becomes particularly important in educational contexts. An educational application, for example, may reasonably need to know a child’s grade level to calibrate the difficulty of learning activities, and it might legitimately need to know how a child performed on previous assignments to provide personalized instruction, but there is no reasonable educational necessity for collecting a child’s email address if the child does not need to receive email communications, nor is there legitimate basis for collecting family income information, parents’ email addresses, or other personal information unrelated to the educational service.

The 2025 amendments strengthen data minimization requirements by explicitly prohibiting operators from retaining children’s personal information indefinitely and requiring operators to maintain a written data retention policy that specifies the business need for retaining children’s personal information and provides a timeline for deletion. This shifts the burden substantially: retention must now be justified by a specific, documented business need and bounded in time, rather than indefinite by default as has historically been the practice across much of the technology industry. Operators must delete children’s personal information using reasonable security measures to prevent unauthorized access or use, a requirement that extends beyond simple deletion from the visible database to include removal from backup systems, archives, and other storage mechanisms.
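
One way to operationalize this is to express the written retention policy as data that a scheduled deletion job can enforce. The following sketch is illustrative only; the categories, business needs, and retention windows are hypothetical examples, not values from the rule.

```typescript
// Illustrative retention policy expressed as enforceable data: every
// category of children's data carries a documented business need and a
// finite deletion timeline. Categories and windows are hypothetical.
interface RetentionRule {
  dataCategory: string;
  businessNeed: string;
  maxRetentionDays: number;
}

const retentionPolicy: RetentionRule[] = [
  { dataCategory: "session-logs", businessNeed: "debugging and abuse prevention", maxRetentionDays: 30 },
  { dataCategory: "learning-progress", businessNeed: "personalized instruction", maxRetentionDays: 365 },
];

function isExpired(collectedAt: Date, rule: RetentionRule, now: Date = new Date()): boolean {
  const ageInDays = (now.getTime() - collectedAt.getTime()) / 86_400_000;
  return ageInDays > rule.maxRetentionDays;
}

// A scheduled job would scan every store (including backups and
// archives) and securely delete records for which isExpired() is true.
```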

For cookies specifically, data retention requirements mean that tracking cookies should be deleted when their operational purpose has been fulfilled, not maintained indefinitely on users’ devices or in operator databases. A session cookie that maintains authentication state during a child’s visit to a website should be deleted when the child closes their browser or their session expires, rather than being converted to a persistent cookie. Analytics cookies that record aggregate information about site usage patterns and child-user interactions with specific features should be deleted or aggregated to remove individual identifiers after a reasonable time period necessary to analyze the data and make improvements to the service. Cookies used to maintain user preferences (such as font size or color scheme selections) might reasonably persist across multiple visits if the user has affirmatively selected those preferences and consented to their retention, but the operator must have a documented retention policy and delete such cookies when the user requests deletion or when they are no longer necessary.
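
In practice, these lifetimes are controlled through cookie attributes. The sketch below, assuming an Express-based Node.js server, sets a session cookie that expires with the browser session and a preference cookie with a finite, documented lifetime; the routes and values are hypothetical.

```typescript
// Minimal Express sketch (assumed stack): a session cookie that dies
// with the browser session, and a preference cookie with a finite,
// documented lifetime. Routes and values are hypothetical.
import express from "express";

const app = express();

app.get("/login", (_req, res) => {
  // No maxAge or expires: this is a session cookie, deleted when the
  // browser session ends rather than converted into a persistent one.
  res.cookie("session_id", "abc123", { httpOnly: true, secure: true, sameSite: "strict" });
  res.send("signed in");
});

app.post("/preferences", (_req, res) => {
  // Persists only because the user chose a setting, and only for a
  // documented, finite period (90 days here, purely as an example).
  res.cookie("font_size", "large", { maxAge: 90 * 24 * 60 * 60 * 1000, sameSite: "lax" });
  res.send("preference saved");
});

app.listen(3000);
```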

Separate Consent for Third-Party Disclosures and Advertising

Prior to the 2025 amendments, COPPA required operators to obtain verifiable parental consent before collecting, using, or disclosing personal information from children, but this consent could be bundled: a single parental authorization could cover initial collection and use of personal information, internal use of that information, and disclosure to third parties in a single transaction. This arrangement created perverse incentives because operators could present parents with an all-or-nothing choice: either authorize all data practices together or deny the child access to the service entirely. Parents often lacked sufficient transparency about exactly which third parties would receive personal information, for what purposes, or how long those third parties would retain the data, particularly when operators disclosed third parties in overly broad or vague categories.

The 2025 amendments fundamentally restructure this arrangement by requiring operators to obtain separate verifiable parental consent specifically for disclosures of children’s personal information to third parties. Operators must now clearly distinguish between two types of consent: first, consent to collect and use children’s information internally for the operation of the service itself, and second, separate consent to disclose that information to third parties for purposes unrelated to providing the core service. The FTC has provided flexibility in the timing and method of obtaining these separate consents, allowing operators to seek both consents simultaneously during initial account creation or to seek third-party consent separately at a later time when a child attempts to use a feature that would trigger third-party data sharing.

Critically, the rule establishes that certain uses of children’s data can never be considered “integral to the service” and therefore always require separate parental consent for third-party disclosure: these include disclosure for the purpose of receiving monetary compensation, advertising, or developing or training artificial intelligence technology. This codification prevents operators from claiming that selling children’s data to advertisers or sharing behavioral profiles with AI training systems constitutes an integral part of providing the service. An advertising-supported children’s content platform, for example, cannot point to its business model as justification for pre-emptively disclosing children’s data to advertisers; instead, it must obtain separate parental consent for advertising-related disclosures and must clearly explain to parents what categories of advertisers will receive what types of data.
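
Modeled in code, this two-tier structure might look like the following TypeScript sketch, in which the never-integral purposes named in the rule always require the separate disclosure consent. The type and function names are illustrative, and the treatment of service-delivery vendors is a simplification.

```typescript
// Illustrative two-tier consent model: internal use and third-party
// disclosure are consented separately, and the purposes the rule names
// as never "integral to the service" always need the disclosure consent.
type DisclosurePurpose =
  | "monetary-compensation"
  | "advertising"
  | "ai-training"
  | "service-delivery"; // e.g., a vendor strictly necessary to operate

interface ParentalConsent {
  internalUse: boolean;          // collect and use within the service
  thirdPartyDisclosure: boolean; // separate, explicit opt-in
}

const NEVER_INTEGRAL: ReadonlySet<DisclosurePurpose> = new Set([
  "monetary-compensation",
  "advertising",
  "ai-training",
]);

function mayDisclose(consent: ParentalConsent, purpose: DisclosurePurpose): boolean {
  if (NEVER_INTEGRAL.has(purpose)) {
    return consent.thirdPartyDisclosure; // separate consent is mandatory
  }
  return consent.internalUse; // simplification: integral vendors ride on base consent
}
```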

Tracking Cookies and Behavioral Advertising Restrictions

The prohibition on targeted advertising to children under thirteen in COPPA operates through a complex interplay of consent requirements, data minimization principles, and restrictions on profiling practices that have been substantially strengthened through the 2025 amendments. Targeted advertising depends entirely on the availability of behavioral data and profiling systems that can correlate user interests, previous purchases, browsing history, and other behavioral signals to match children with advertisements predicted to interest them. Cookies provide the technical mechanism that enables this behavioral profiling by allowing advertisers to track which websites a child visits, which products they view, how long they spend viewing content, and what they ultimately purchase, aggregating this information into behavioral profiles.

In its proposed amendments, the FTC sought to prohibit targeted advertising from being enabled by default, requiring separate parental consent before a child could be targeted with behavioral advertising. Although this specific proposal did not make it into the final rule, the FTC stated in comments that it remains “deeply concerned about the use of push notifications and other engagement techniques that are designed to prolong children’s time online” and indicated that it may pursue enforcement actions under Section 5 of the FTC Act to address practices encouraging prolonged use. Operators should interpret these statements as clear regulatory guidance that behavioral targeting of children is disfavored even when parental consent might technically permit it, and that practices designed to increase engagement through manipulative techniques may face enforcement scrutiny regardless of consent.

The practical effect of the updated COPPA rule is that third-party cookies deployed specifically for behavioral advertising to children cannot be justified, as advertising networks cannot obtain separate parental consent at the point when third-party cookies are being set on children’s browsers. Most parents do not interact directly with third-party advertising networks; rather, they provide consent to the primary operator of the child-directed service, which then decides whether to engage advertising networks. The primary operator must therefore obtain separate consent from each parent specifically for sharing the child’s data with named categories of third-party advertisers and must be able to identify to parents exactly which data will be shared, how it will be used, and what transparency or deletion mechanisms the advertiser will provide.

International Frameworks: GDPR, GDPR-K, and the UK Children’s Code

The European Union’s General Data Protection Regulation, which took effect on May 25, 2018, established the strictest comprehensive privacy framework globally and includes specific protections for children that vary by member state. Article 8 of the GDPR provides that where consent is the legal basis for processing a child’s personal data in relation to online services offered directly to the child, that consent must be given or authorized by the holder of parental responsibility for children below the age of sixteen, although individual EU member states may lower this threshold to as low as thirteen. This creates a patchwork of age thresholds across the EU, with some countries setting the digital age of consent at thirteen, others at fourteen, fifteen, or sixteen, making truly harmonized compliance across Europe impossible without implementing geo-targeted consent flows that apply different rules based on the user’s location.

The GDPR applies to websites and online services that process personal data of children located in the EU, regardless of whether the operator is EU-based or operates from outside the EU. Unlike COPPA, which focuses specifically on children under thirteen and does not protect teenagers, the GDPR treats everyone under eighteen as a child warranting heightened protection, with parental consent requirements applying to those below the relevant member state’s digital age of consent. The GDPR requires explicit, opt-in consent for non-essential cookies, meaning users must actively click an “Accept” or “Allow” button to enable cookies; pre-checked boxes and implied consent are not valid. Additionally, users must be able to separately accept or reject different categories of cookies—such as analytics versus advertising—rather than accepting all cookies in an all-or-nothing transaction.
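
A minimal sketch of this granular, opt-in model might look like the following, where strictly necessary cookies are exempt and every other category defaults to denied until the user affirmatively accepts it. The category names are common industry conventions rather than GDPR-mandated labels.

```typescript
// Illustrative granular consent state: strictly necessary cookies are
// exempt from consent, and every other category defaults to denied
// until the user affirmatively accepts it (no pre-checked boxes).
type CookieCategory = "strictly-necessary" | "preferences" | "analytics" | "advertising";

const consentState: Record<CookieCategory, boolean> = {
  "strictly-necessary": true, // exempt: required for the service to function
  preferences: false,
  analytics: false,
  advertising: false,
};

// Only an explicit user action may flip a category to true.
function acceptCategory(category: Exclude<CookieCategory, "strictly-necessary">): void {
  consentState[category] = true;
}

acceptCategory("analytics"); // user accepted analytics; advertising stays denied
```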

For children specifically, GDPR requires more granular consent management, often involving parental portals or dashboards through which parents can exercise control over their child’s data rights, including reviewing collected information, requesting corrections, and exercising the right to be forgotten by requesting deletion of all personal information. The GDPR’s requirements for transparency are particularly stringent when the data subject is a child, requiring that privacy notices use simple, clear language that a child can actually understand, rather than the opaque legal language typical of many privacy policies.

The United Kingdom’s Age-Appropriate Design Code, implemented by the Information Commissioner’s Office, represents the world’s first statutory code of practice focused specifically on children’s digital privacy and establishes fifteen flexible standards that online services must follow if they are likely to be accessed by children. The code applies to “information society services” likely to be accessed by children under eighteen, including apps, games, connected toys, streaming services, social media platforms, and news websites, whether or not they are specifically marketed to children. The code establishes that the best interests of the child must be a primary consideration when designing and developing online services, incorporating the United Nations Convention on the Rights of the Child principles into data protection law.

The UK Children’s Code requires that privacy settings be “high privacy” by default unless a compelling reason for an alternative default exists, that only the minimum amount of personal data should be collected and retained, that children’s data should not usually be shared with third parties, and that geolocation services should be switched off by default. Significantly, the code restricts “nudge techniques”—manipulative design features such as dark patterns or strategically positioned buttons—that encourage children to provide unnecessary personal data, weaken privacy settings, or engage in behaviors contrary to the child’s interests. The code explicitly addresses profiling of children, stating that behavioral profiling for advertising purposes should not occur and that children should be protected from personalized pricing or manipulative content recommendations.

Safe Harbor Programs and Industry Self-Regulation

COPPA includes a provision enabling industry groups and self-regulatory organizations to develop their own guidelines implementing COPPA’s protections and to seek FTC approval for these guidelines as “safe harbor” programs. Approved Safe Harbor organizations essentially become co-regulators, establishing industry standards that must meet or exceed COPPA’s requirements and conducting compliance audits of member operators. This cooperative regulatory model has functioned since COPPA’s inception and currently includes organizations such as PRIVO, the Children’s Advertising Review Unit (CARU) operated by BBB National Programs, and other industry groups that have developed and maintained FTC-approved programs.

Operators that join approved Safe Harbor programs commit to compliance with the program’s guidelines and submit to regular audits and monitoring. In exchange, they can demonstrate to regulators and to parents that they have voluntarily adopted additional protections beyond the statutory minimum. The FTC’s 2025 amendments significantly increased transparency and oversight requirements for Safe Harbor programs, reflecting regulatory concerns that some programs had become insufficiently rigorous in their monitoring and enforcement. Programs must now publicly identify their participating operators and certified services on their websites, updated every six months, must provide enhanced annual reporting to the FTC including detailed explanations of compliance monitoring and enforcement mechanisms, and must submit technical reports every three years detailing their technological capabilities for assessing compliance.

By October 22, 2025, all approved Safe Harbor programs must submit revised guidelines to the FTC that reflect the updated COPPA requirements, with failure to meet this deadline potentially resulting in suspension or revocation of the program’s approved status. This represents a substantial new regulatory burden on Safe Harbor organizations and signals that the FTC intends to closely monitor industry self-regulation to ensure it remains genuinely protective rather than merely providing a veneer of compliance while allowing questionable practices to continue.

Recent Enforcement Actions and Their Implications

The Federal Trade Commission has pursued increasingly aggressive enforcement of COPPA through major actions against prominent technology platforms, establishing new benchmarks for civil penalties and signaling the regulatory intensity that companies operating in children’s spaces can expect. In 2019, Google and YouTube settled COPPA allegations with a combined $170 million in civil penalties—then the largest COPPA fine in history—for allegedly collecting personal information from children without verifiable parental consent. The complaint alleged that YouTube used persistent identifiers, including cookies, to serve behavioral advertising to viewers of child-directed channels despite having actual knowledge that those channels were directed to children, precisely the cookie-based tracking practice at the center of COPPA’s persistent-identifier rules.

In a settlement announced in December 2022 and finalized in early 2023, Epic Games agreed to pay a $275 million civil penalty to the FTC, the largest financial penalty ever imposed for a COPPA violation. The case involved Epic’s Fortnite gaming platform, which allegedly allowed children to create accounts and play online multiplayer games without verifiable parental consent and exposed children to live voice and text chat that was enabled by default. In a companion action, the FTC separately alleged that Epic deployed dark patterns in its in-app purchase flows, leading to an additional $245 million in consumer refunds.

Most recently, the Department of Justice and Federal Trade Commission sued TikTok in August 2024, alleging “massive scale invasions of children’s privacy” affecting millions of children under thirteen through practices including knowingly creating accounts for children without parental consent, collecting extensive personal data from child users, failing to honor parents’ requests to delete children’s accounts and data, and retaining personal information about children known to be under thirteen. The complaint alleged that TikTok violated both COPPA and a 2019 consent order settling earlier COPPA violations, suggesting willful disregard for legal obligations to protect children’s privacy. The case remains pending; if the government prevails, the resulting penalties could exceed all previous COPPA records.

These enforcement actions establish clear patterns: companies cannot avoid COPPA compliance by claiming they did not know individual users were children if their platforms knowingly host child-directed content or maintain obviously child-directed features; operators cannot rely on parents to enforce their own privacy rights through opt-out mechanisms or deletion requests if the operator makes those processes difficult or non-responsive; and the FTC expects operators to maintain detailed records of compliance efforts and to be able to demonstrate good-faith implementation of safeguards. The escalating penalties signal that companies that have treated COPPA violations as an acceptable cost of doing business must expect substantially higher financial consequences moving forward.

State-Level Privacy Laws and Enhanced Protections for Minors

Beyond the federal COPPA framework, multiple American states have enacted or are enacting comprehensive privacy laws that include specific protections for minors, creating a complex patchwork of overlapping jurisdictional requirements that operators must navigate. California’s Consumer Privacy Act, enhanced through the California Privacy Rights Act, includes special protections for consumers under sixteen, requiring operators to obtain opt-in consent from consumers under sixteen before selling their personal information and requiring parental consent for consumers under thirteen. Unlike COPPA’s detailed prescriptive requirements, the CCPA/CPRA establishes a baseline of rights—right to know, right to delete, right to opt out of sales and targeted advertising—that operators must honor, but provides flexibility in how operators implement these rights.

Connecticut’s Data Privacy Act, as amended to add minor-specific protections that took effect in 2024, includes obligations for data controllers that provide online services, products, or features to minors, requiring that such operators provide notice to parents about data practices and obtain affirmative authorization before disclosing personal information of minors to third parties. Colorado’s Privacy Act, Delaware’s Personal Data Privacy Act, and numerous other states have enacted similar frameworks with varying specific requirements but consistent themes: special protections for minors, restrictions on targeted advertising, and transparency requirements that exceed the federal COPPA baseline.

These state-level frameworks create a peculiar problem for national and international platforms: operating in compliance with the most protective state’s requirements becomes increasingly important because those requirements effectively establish the operational standard for the entire platform rather than allowing different practices in different states. A platform that operated at COPPA baseline in states without enhanced minor protections but attempted to implement a different, more permissive system in other states would face severe compliance and operational challenges, along with the regulatory risk that state attorneys general would view differentiated treatment as evidence that stronger protections are feasible and thus expected in all states.

Consent Management Platforms and Technical Implementation

The practical implementation of COPPA compliance and cookie control for children requires sophisticated consent management platforms that can distinguish between child users and adult users, apply age-appropriate consent flows, block third-party trackers until valid consent is obtained, maintain detailed audit logs of all consent interactions, and integrate with privacy-related backend systems like data deletion infrastructure. Consent management platforms designed specifically for children’s services must implement geolocation-based routing to apply different consent frameworks based on the user’s location—for example, showing stricter GDPR-K compliant flows to European users while displaying COPPA-compliant flows to U.S. users.
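
A simplified sketch of such geolocation-based routing appears below. The member-state age table is deliberately incomplete and illustrative, since a real deployment would need a maintained, legally reviewed mapping of digital ages of consent.

```typescript
// Simplified jurisdiction routing. The age table below is illustrative
// and incomplete; it is not a substitute for legal review.
const GDPR_CONSENT_AGE: Record<string, number> = {
  DE: 16, NL: 16, IE: 16, FR: 15, AT: 14, DK: 13, SE: 13,
};

interface ConsentFlow {
  framework: string;
  consentAge: number;
}

function consentFlowFor(countryCode: string): ConsentFlow {
  if (countryCode === "US") return { framework: "COPPA", consentAge: 13 };
  if (countryCode === "GB") return { framework: "UK Children's Code", consentAge: 13 };
  const gdprAge = GDPR_CONSENT_AGE[countryCode];
  if (gdprAge !== undefined) return { framework: "GDPR-K", consentAge: gdprAge };
  return { framework: "strictest-default", consentAge: 16 }; // fail safe
}
```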

Such platforms must support multiple verification methods for parental consent, including knowledge-based authentication questions that vary by parent and are difficult for children to guess, government ID verification systems that integrate with ID validation services, and text-plus verification flows that send text messages to phone numbers registered with the platform. The CMP must log every consent interaction, including the specific questions asked and answers provided in knowledge-based authentication, the timestamp of consent, the method of consent, what specifically the parent consented to, and any subsequent consent revocation or modification.
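
The audit trail itself can be modeled as an append-only log. The following sketch shows one plausible record shape; the field names and method labels are hypothetical.

```typescript
// Illustrative append-only consent audit log. The point is recording
// who consented, how, to what scope, and when, including failures and
// revocations. All names here are hypothetical.
interface ConsentLogEntry {
  childAccountId: string;
  timestamp: string; // ISO 8601
  method: "kba" | "gov-id" | "text-plus" | "credit-card";
  outcome: "granted" | "failed" | "revoked";
  scopes: string[];  // e.g., ["internal-use", "third-party-disclosure"]
}

const consentLog: ConsentLogEntry[] = [];

function recordConsent(entry: ConsentLogEntry): void {
  consentLog.push(Object.freeze(entry)); // entries are never edited in place
}

recordConsent({
  childAccountId: "child-42",
  timestamp: new Date().toISOString(),
  method: "text-plus",
  outcome: "granted",
  scopes: ["internal-use"], // parent declined third-party disclosure
});
```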

From a cookie control perspective, CMPs must implement “auto-blocking” functionality that prevents third-party tracking scripts and cookies from loading until valid parental consent has been obtained. This requires the CMP to parse the website code, identify all third-party scripts and cookies, and block them at the browser level or application level until consent is granted. Subsequently, when the parent provides consent, the CMP must trigger the loading of previously blocked scripts and set cookies with appropriate parameters conveying the consent grant to third parties.
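
A common implementation of auto-blocking ships third-party tags as inert script elements and activates them only after consent. The sketch below follows that widely used pattern; the data-category attribute name is a convention, not a standard.

```typescript
// Widely used auto-blocking pattern: third-party tags ship inert as
// type="text/plain" with a data-category attribute, and the CMP swaps
// in an executable <script> only for categories with valid consent.
// Markup (illustrative):
//   <script type="text/plain" data-category="advertising"
//           src="https://ads.example/tag.js"></script>
function activateScripts(consentedCategory: string): void {
  const blocked = document.querySelectorAll<HTMLScriptElement>(
    `script[type="text/plain"][data-category="${consentedCategory}"]`,
  );
  blocked.forEach((placeholder) => {
    const script = document.createElement("script");
    if (placeholder.src) {
      script.src = placeholder.src;   // external tag: browser fetches and runs it
    } else {
      script.text = placeholder.text; // inline tag: executes on insertion
    }
    placeholder.replaceWith(script);
  });
}
```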

Many CMPs now integrate with industry frameworks such as the Google Consent Mode and the IAB Transparency and Consent Framework to signal consent status to advertising networks and analytics providers that are designed to honor consent signals from these frameworks. However, for children’s services, the integration is more complex because the CMP must ensure that these signals accurately reflect the specific consent granted by the parent—for example, signaling that analytics cookies are consented while advertising cookies are not consented—and must handle the special case where children are detected as users and no third-party trackers can be enabled without separate parental consent.
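
For the Google Consent Mode side of such an integration, the documented gtag('consent', ...) calls allow a CMP to default all signals to denied for a suspected child user and selectively grant only what a parent has approved. The surrounding setup (loading gtag.js) is omitted in this sketch.

```typescript
// Sketch using Google Consent Mode's documented gtag('consent', ...)
// API: default every signal to denied for a suspected child user, then
// grant only what a verified parent approved.
declare function gtag(...args: unknown[]): void;

// Must run before any Google tags fire:
gtag("consent", "default", {
  ad_storage: "denied",
  ad_user_data: "denied",
  ad_personalization: "denied",
  analytics_storage: "denied",
});

// After verifiable parental consent covering analytics only:
gtag("consent", "update", {
  analytics_storage: "granted",
  // advertising signals remain denied absent separate disclosure consent
});
```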

Biometric Data, Voice Data, and Emerging Privacy Threats

The 2025 COPPA amendments’ inclusion of biometric identifiers in the definition of personal information reflects growing regulatory concern about emerging technologies that enable identification and profiling of children through biological and behavioral characteristics. Facial recognition technology, which has become increasingly accessible and is now deployed in educational settings, gaming applications, and other children’s services, enables identification and tracking of individual children without requiring them to provide identifying information voluntarily. Children using video conferencing applications for remote learning, participating in multiplayer gaming with video features, or using facial recognition to unlock devices may inadvertently have their biometric data collected and processed by service operators in ways that create persistent, durable identifiers potentially more resistant to deletion than traditional user accounts.

Voice data presents a particularly complex category of biometric information in the COPPA context because voice interaction has become increasingly central to children’s digital experiences through voice-activated devices, voice chat in gaming, and voice commands for device control. The FTC has established an enforcement policy statement that when an operator collects an audio file containing a child’s voice solely as a replacement for written words to perform a search or fulfill a verbal instruction, and maintains the file only for the brief time necessary to complete that function, the FTC will not pursue enforcement action for failing to obtain parental consent. This policy recognizes that requiring parental consent every time a child speaks a command to a device would be impractical, but the operator must provide clear notice of its collection, use, and deletion policy regarding these audio files.
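
A sketch of this “collect, transcribe, delete” pattern follows; transcribe is a hypothetical stand-in for any speech-to-text call, and the point is that the audio exists only for the moment needed to act on the words.

```typescript
// "Collect, transcribe, delete" sketch: the audio buffer exists only
// long enough to serve the child's spoken request. transcribe() is a
// hypothetical stand-in for any speech-to-text call.
async function handleVoiceCommand(
  audio: ArrayBuffer,
  transcribe: (a: ArrayBuffer) => Promise<string>,
): Promise<string> {
  // Brief, purpose-limited use: convert speech to text, act on the text.
  const text = await transcribe(audio);
  // Deliberately no persistence step: the recording is never written to
  // disk, logged, or copied; the buffer is released when this scope ends.
  return text;
}
```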

Geolocation data, including precise GPS coordinates and cellular tower triangulation, represents another particularly sensitive category of personal information for children because such data reveals children’s physical location, movement patterns, and social networks in ways that create safety risks. COPPA defines geolocation information “sufficient to identify street name and name of city or town” as personal information requiring parental consent, a definition that extends to coordinate-based location data even if not presented as a street address and that includes wireless network information or other data that can be used to infer precise location. The FTC has enforced this provision against operators who claimed they were collecting only approximate location information but whose systems actually collected and maintained precise location data enabling identification of specific buildings or residences children visited.
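
Where only an approximate location is genuinely needed, one defensive pattern is to coarsen coordinates before anything is stored, as in the hypothetical sketch below; the appropriate precision is a legal judgment, not a constant.

```typescript
// Hypothetical coarsening step: round coordinates before storage so
// only an approximate location is ever retained. One decimal place is
// roughly an 11 km grid; choose precision with legal review.
function coarsen(lat: number, lon: number, decimals = 1): { lat: number; lon: number } {
  const factor = 10 ** decimals;
  return {
    lat: Math.round(lat * factor) / factor,
    lon: Math.round(lon * factor) / factor,
  };
}

// The precise device fix never leaves this call path un-coarsened.
console.log(coarsen(40.712776, -74.005974)); // { lat: 40.7, lon: -74 }
```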

Balancing Privacy Protection with Beneficial Services

A persistent tension in children’s online privacy regulation involves balancing the legitimate desire to protect children from surveillance and exploitation against the reality that many valuable online services—particularly educational technology, assistive technologies for children with disabilities, and safety-focused parental monitoring applications—depend on the collection and processing of some personal information about children. Over-restrictive privacy rules that effectively prohibit all data collection from children could eliminate beneficial services and deny children and families the advantages that appropriately-designed online services provide.

The FTC has attempted to address this tension through nuanced guidance that distinguishes between different contexts and purposes for data collection. The agency issued a policy statement on educational technology and COPPA clarifying that school officials can provide consent to data collection from students in educational settings in circumstances where the data is used only for educational purposes and is not used for commercial purposes, marketing, or AI training. This exception recognizes that schools have legitimate educational purposes for collecting data about student learning and progress and that requiring individual parental consent for every educational data collection would create practical barriers to effective pedagogy and technology-enhanced learning.

Similarly, the FTC has recognized that parental monitoring services—which allow parents to monitor their children’s digital activities, location, and social interactions—may require collection of information from the monitored child’s device, but has established that such monitoring tools must implement strong security protections, transparent disclosures of monitoring, and safeguards preventing unauthorized access or repurposing of the collected data. The COPPA framework can accommodate these beneficial uses while still protecting children from commercial exploitation of their data.

Securing Young Digital Futures

The landscape of children’s online privacy protection in relation to cookies and tracking technologies has undergone fundamental transformation through the 2025 COPPA amendments and evolving international frameworks including GDPR-K and the UK Children’s Code, establishing that meaningful protection of children from surveillance and data exploitation is both legally mandated and practically implementable. The regulatory framework now clearly establishes that operators cannot rely on parental consent as justification for unlimited data collection and monetization of children’s personal information; instead, operators must adopt privacy-by-design principles that minimize collection in the first place, restrict uses to essential purposes, delete data when no longer necessary, and obtain separate specific parental consent for disclosures to third parties.

Yet significant challenges remain in achieving comprehensive protection for children’s online privacy in an ecosystem where technological sophistication continuously outpaces regulatory frameworks. Age verification and age gating mechanisms remain imperfect, enabling children to circumvent restrictions designed to protect them, while parental consent mechanisms struggle to achieve meaningful parental understanding and engagement with complex data practices. The proliferation of Internet-of-Things devices—smart watches, voice-activated toys, connected gaming systems—creates new vectors for children’s data collection that fall into regulatory gray areas where it remains unclear whether they constitute “websites or online services” subject to COPPA.

Cross-border data flows complicate compliance when children’s personal information collected by U.S. operators is transferred to third parties in other jurisdictions with different privacy standards, and when international platforms must simultaneously comply with COPPA, GDPR-K, and emerging privacy frameworks across numerous countries with inconsistent requirements. The development of artificial intelligence systems trained on children’s personal information creates risks that current regulatory frameworks may not adequately address, particularly regarding whether children’s training data was collected with informed parental consent for such purposes.

Despite these challenges, the trajectory of regulation is clear: regulators globally expect operators to place children’s privacy interests at the center of their design and business model decisions, to implement robust technical protections against surveillance and data exploitation, to maintain detailed compliance documentation, and to expect that violations will result in substantial financial penalties and injunctive remedies. The monetization of children’s attention and data through behavioral advertising, profiling, and third-party data sharing has become indefensible in the regulatory environment of 2025, and operators that continue to normalize such practices should expect enforcement action and market pressure from privacy-conscious parents and families demanding better protection for their children.
