AG-323

Children's Data Restriction Governance

Privacy, Consent & Data Subject Rights · AGS v2.1 · April 2026

2. Summary

Children's Data Restriction Governance requires that AI agents apply stricter processing rules, profiling restrictions, and shorter retention periods when processing data of individuals identified or reasonably presumed to be children. The system must implement age detection or verification mechanisms, apply child-specific data protection controls automatically when a child is identified, prohibit profiling of children for marketing or behavioural targeting, restrict data sharing to essential purposes only, and enforce shorter retention periods. This dimension recognises that children merit specific protection (GDPR Recital 38) and that AI agents, which process data at scale and speed, require structural controls to ensure that child-specific protections are consistently applied.

3. Example

Scenario A — Child Profiled for Behavioural Advertising: A customer-facing AI agent for a gaming platform processes user interaction data to build behavioural profiles for targeted advertising. The platform's terms of service require users to be 13 or older, but no effective age verification exists. The agent builds behavioural profiles for 47,000 users whose interaction patterns (game choices, play times, language patterns, content preferences) are consistent with children under 13. The profiles are shared with advertising partners. An investigation by the children's data protection authority (via complaint from a parent) reveals the profiling. Result: USD 5.2 million COPPA fine in the US, GBP 3.1 million ICO fine in the UK under the Age Appropriate Design Code, mandatory deletion of all profiles for under-13 users, and 12-month prohibition on behavioural advertising to all users under 18.

What went wrong: No age verification existed beyond self-declaration. The agent profiled all users identically regardless of likely age. No structural control prevented the profiling of children. No mechanism existed to detect likely child users from interaction patterns.

Scenario B — Child Data Retained Beyond Necessity: A public sector AI agent processes school enrolment data for educational resource allocation. The agent retains individual student records — including learning assessments, behavioural observations, and special educational needs data — for 10 years under a blanket retention policy inherited from the adult citizen data policy. A data subject access request by a now-18-year-old reveals that detailed behavioural observations from age 7 are still retained. Result: ICO finding of excessive retention, mandatory review and deletion of 230,000 child records beyond the appropriate retention period, and formal reprimand.

What went wrong: The child data retention policy was not distinguished from the adult data retention policy. Behavioural observations appropriate to retain for 2 years were retained for 10. No automated enforcement of child-specific retention periods existed.

Scenario C — Child Restrictions Correctly Implemented: An AI agent for a family-oriented streaming service implements age-gated processing. When a user profile is flagged as under-16 (through parental verification at account creation), the agent automatically: (1) disables behavioural profiling for advertising, (2) restricts data collection to service-essential fields only, (3) applies a 30-day retention period for viewing history (vs. 2 years for adults), (4) blocks data sharing with third-party analytics partners, and (5) disables personalised recommendation based on individual behaviour (using only age-appropriate content catalogues). A quarterly audit confirms that 12,400 child profiles are processed under the restricted regime. When a parent requests deletion, the child's data is purged within 48 hours across all systems. Result: ICO Age Appropriate Design Code compliance confirmed. Zero enforcement actions.

4. Requirement Statement

Scope: This dimension applies to all AI agents that process personal data of individuals who are or may be children. "Children" means individuals below the age of digital consent in the applicable jurisdiction: 13 in the US (COPPA), 13 in the UK (UK GDPR, as set by the Data Protection Act 2018), 13-16 across EU member states (GDPR Article 8 — varies by member state), and equivalent thresholds in other jurisdictions. The scope includes agents that do not intentionally serve children but may encounter child data: a general-purpose customer service agent may interact with children; a healthcare agent may process paediatric data; a public sector agent may process educational records. If there is a reasonable likelihood that children's data will be processed, the agent is in scope. Agents that exclusively serve verified adult-only environments (e.g., regulated gambling platforms with mandatory age verification) may document the verification mechanism and claim an exclusion, subject to audit.
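The jurisdictional thresholds above can be sketched as a lookup table with a restrict-first fallback. The member-state entries and the fallback choice below are illustrative assumptions, not a complete or authoritative mapping:

```python
# Jurisdiction -> age of digital consent, per the scope above. EU entries
# are examples only: GDPR Article 8 lets member states choose 13-16, so
# each state must be mapped individually and kept current.
DIGITAL_CONSENT_AGE = {
    "US": 13,  # COPPA
    "UK": 13,  # UK GDPR / Data Protection Act 2018 s.9
    "DE": 16,  # Germany retains the GDPR default
    "FR": 15,
    "IE": 16,
}

def is_child(age: int, jurisdiction: str) -> bool:
    # Restrict-first: unmapped jurisdictions fall back to the strictest
    # threshold (16, the GDPR ceiling) rather than the most permissive.
    return age < DIGITAL_CONSENT_AGE.get(jurisdiction, 16)
```

The conservative fallback reflects the Section 5 principle that protections are relaxed on verification, never assumed away.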

4.1. A conforming system MUST implement age detection or age verification that identifies or flags data subjects who are or may be children, using at least one of: verified date of birth, parental account linkage, age estimation from interaction patterns, or regulatory age verification standards.
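A minimal flagging routine combining the signal types listed in 4.1 might look like the following. The precedence order is an assumption, but it encodes the restrict-first principle from Section 5: self-declaration alone never clears the flag.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AgeSignals:
    verified_dob_age: Optional[int] = None  # from date-of-birth verification
    parental_link: bool = False             # account linked to a parent
    estimated_age: Optional[int] = None     # behavioural age estimation
    self_declared_age: Optional[int] = None

def flag_as_possible_child(s: AgeSignals, threshold: int = 13) -> bool:
    if s.verified_dob_age is not None:  # strongest signal takes precedence
        return s.verified_dob_age < threshold
    if s.parental_link:                 # parental linkage implies a child account
        return True
    if s.estimated_age is not None:     # estimation can set or clear the flag
        return s.estimated_age < threshold
    return True  # self-declaration alone never clears the flag (cf. Scenario A)
```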

4.2. A conforming system MUST apply child-specific data protection controls automatically when a data subject is identified or flagged as a child, without requiring manual intervention.
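One way to make the controls automatic is to resolve an immutable processing profile at the data-access layer, so no manual step sits between identification and restriction. The profile fields and the 30-day/2-year retention values below mirror Scenario C and are illustrative:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProcessingProfile:
    behavioural_profiling: bool
    third_party_sharing: bool
    personalised_recommendations: bool
    retention_days: int

ADULT_DEFAULT = ProcessingProfile(True, True, True, retention_days=730)
CHILD_RESTRICTED = ProcessingProfile(False, False, False, retention_days=30)

def profile_for(flagged_child: bool) -> ProcessingProfile:
    # Resolved on every request at the access layer; no manual
    # intervention can skip or override the child-restricted profile.
    return CHILD_RESTRICTED if flagged_child else ADULT_DEFAULT
```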

4.3. A conforming system MUST prohibit profiling of children for marketing, behavioural targeting, or commercial personalisation purposes unless explicit parental consent has been obtained under applicable law.

4.4. A conforming system MUST enforce shorter retention periods for children's data than for equivalent adult data, with child-specific retention schedules documented and automatically enforced.
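Automated enforcement of 4.4 can be sketched as a per-category schedule check. The category names and day counts are assumptions; the 2-year figure for behavioural observations follows the Scenario B finding:

```python
from datetime import date, timedelta

# Child schedules must be strictly shorter than the adult equivalents.
CHILD_RETENTION_DAYS = {"viewing_history": 30, "behavioural_observations": 730}
ADULT_RETENTION_DAYS = {"viewing_history": 730, "behavioural_observations": 3650}

def purge_due(collected: date, category: str, is_child: bool, today: date) -> bool:
    # Evaluated by a scheduled purge job, not by agent application code.
    schedule = CHILD_RETENTION_DAYS if is_child else ADULT_RETENTION_DAYS
    return today >= collected + timedelta(days=schedule[category])
```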

4.5. A conforming system MUST restrict the sharing of children's data with third parties to purposes that are strictly necessary for the service provided to the child, and must not share children's data for advertising or commercial profiling purposes.
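Structurally, 4.5 is easiest to enforce as a deny-by-default gateway check before any outbound transfer. The purpose and recipient labels below are an assumed taxonomy:

```python
ESSENTIAL_PURPOSES = {"service_delivery", "safety", "legal_obligation"}
PROHIBITED_RECIPIENTS = {"advertising", "commercial_profiling"}

class SharingBlocked(Exception):
    pass

def authorise_share(is_child: bool, purpose: str, recipient_type: str) -> None:
    """Gateway check run before every outbound transfer of personal data.

    Raises SharingBlocked unless the transfer is permitted for this subject.
    """
    if not is_child:
        return  # adult sharing governed by separate controls
    if recipient_type in PROHIBITED_RECIPIENTS:
        raise SharingBlocked("child data is never shared for advertising or profiling")
    if purpose not in ESSENTIAL_PURPOSES:
        raise SharingBlocked(f"purpose {purpose!r} is not strictly necessary")
```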

4.6. A conforming system MUST obtain verifiable parental consent before processing children's personal data where consent is the lawful basis, using mechanisms that provide reasonable assurance of parental identity (e.g., credit card verification, signed consent form, video verification).
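A consent record can carry the verification mechanism explicitly, so that "verifiable" becomes machine-checkable. The approved-mechanism set follows the examples in the requirement; the field names are assumptions:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Mechanisms providing reasonable assurance of parental identity (4.6).
APPROVED_MECHANISMS = {
    "credit_card_verification",
    "signed_consent_form",
    "video_verification",
}

@dataclass
class ParentalConsent:
    mechanism: str
    verified_at: Optional[date]

def consent_is_verifiable(record: Optional[ParentalConsent]) -> bool:
    # Absent, unverified, or weak-mechanism consent all fail the check.
    return (record is not None
            and record.mechanism in APPROVED_MECHANISMS
            and record.verified_at is not None)
```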

4.7. A conforming system SHOULD implement the "best interests of the child" as a primary consideration in all automated decisions affecting children, per the UN Convention on the Rights of the Child Article 3.

4.8. A conforming system SHOULD provide child-appropriate privacy notices using age-appropriate language when the agent directly interacts with children.

4.9. A conforming system MAY implement age-adaptive processing that adjusts data protection controls dynamically based on the child's age within the child category (e.g., stricter controls for under-10 than for 14-15 year olds).
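Age-adaptive processing can be expressed as tiers within the child category. The band boundaries below are illustrative assumptions, not regulatory thresholds:

```python
def restriction_tier(age: int) -> str:
    # Stricter tiers for younger children, per 4.9. Bands are assumed.
    if age < 10:
        return "strict"          # minimal collection, shortest retention
    if age < 13:
        return "standard_child"
    if age < 16:
        return "teen"            # relaxed only where the jurisdiction allows
    return "adult"
```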

5. Rationale

Children's data protection is mandated by every major privacy framework with enhanced requirements beyond those for adult data. GDPR Recital 38 states: "Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data." COPPA imposes strict requirements on the collection of personal information from children under 13. The UK Age Appropriate Design Code (Children's Code) establishes 15 standards for online services likely to be accessed by children.

AI agents amplify the risks to children's data in several ways. First, agents process data at scale and may not distinguish between adult and child interactions without structural controls. A customer-facing agent that serves millions of users will inevitably encounter children, even if the service is nominally for adults. Second, AI profiling is particularly concerning for children because behavioural profiles built during childhood can persist and influence automated decisions throughout the individual's life. Third, AI agents are increasingly deployed in educational and entertainment contexts where children are the primary users — the volume and sensitivity of children's data in these systems is substantial.

The ICO's Age Appropriate Design Code enforcement has demonstrated regulatory willingness to impose significant penalties. TikTok was fined GBP 12.7 million for processing children's data without appropriate safeguards. The FTC has extracted COPPA settlements exceeding USD 200 million from technology companies. These enforcement actions reflect a global consensus that children's data protection failures are treated with particular severity.

For AI agents, the "by default" principle is critical: child protections must be the default that is relaxed for verified adults, not the exception that is applied only when a child is confirmed. This inverted default — restrict first, verify later — is the safest approach for any service that might encounter children.

6. Implementation Guidance

The core architecture for AG-323 is a child identification layer that flags data subjects as potential children, combined with an automatic restriction framework that applies child-specific processing rules.
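The two layers can be wired together so every processing call passes through both: the identification layer produces the flag, and the restriction framework denies restricted operations before any agent logic runs. A minimal end-to-end sketch, with all names illustrative:

```python
# Restriction framework: a mandatory wrapper the agent cannot opt out of.
RESTRICTED_FOR_CHILDREN = {
    "behavioural_profiling",
    "third_party_sharing",
    "marketing_personalisation",
}

def process(operation: str, flagged_child: bool) -> str:
    # The flag comes from the child identification layer; the deny
    # decision is made here, before agent application code executes.
    if flagged_child and operation in RESTRICTED_FOR_CHILDREN:
        raise PermissionError(f"{operation} blocked for child data subject")
    return f"{operation}: allowed"
```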

Recommended patterns:

- Restrict by default: apply child protections to every unverified data subject and relax them only on adult verification (Section 5).
- Enforce restriction profiles at the data access gateway, not in agent application code, so they cannot be bypassed.
- Combine multiple age signals (verified date of birth, parental linkage, behavioural estimation) rather than relying on any single one.
- Automate child-specific retention with scheduled purge jobs tied to the documented schedules (4.4).
- Resolve the age threshold from the data subject's jurisdiction, not from a single global value.

Anti-patterns to avoid:

- Self-declared age as the only age signal (Scenario A).
- A blanket retention policy inherited from adult data (Scenario B).
- Child restrictions applied as an exception after a child is confirmed, rather than as the default that verification relaxes.
- Enforcement only at the application layer, where a misconfigured agent can bypass it.
- Sharing child data with analytics or advertising partners by default.

Industry Considerations

Education Technology. Children are the primary users. Every interaction generates child data. Profiling for educational purposes may be permitted, but profiling for commercial purposes is prohibited. Retention of educational records must be governed by education-sector retention schedules, which differ from general data protection retention.

Gaming and Entertainment. High probability of child users regardless of age restrictions. Behavioural data (play patterns, in-game purchases, social interactions) is particularly sensitive for children. Loot box and microtransaction data for children faces specific regulatory scrutiny.

Healthcare. Paediatric data has dual protection requirements: health data sensitivity (GDPR Article 9) and child data restrictions (Article 8 / Recital 38). Parental access rights interact with the child's emerging autonomy — a 15-year-old may have confidentiality rights for certain health data (e.g., sexual health) that override parental access.

Maturity Model

Basic Implementation — The organisation has identified which agents may process children's data. Age verification relies on self-declaration with at least one supplementary signal. Child-specific retention policies are defined. Behavioural profiling for marketing is prohibited for identified children. Enforcement is application-layer. This level meets minimum requirements but relies on identification accuracy and application-layer enforcement.

Intermediate Implementation — Age classification uses multiple signals including verified date of birth where available. Child restriction profiles are enforced at the data access gateway. Parental consent verification uses at least one robust mechanism. Child-specific retention is automated. Data sharing restrictions are enforced structurally — agents cannot share child data with prohibited recipients. Cross-border age thresholds are applied per the data subject's jurisdiction.

Advanced Implementation — All intermediate capabilities plus: age-adaptive processing adjusts restrictions by age range. Behavioural age estimation supplements verification for unverified users. Independent testing confirms that child restrictions cannot be bypassed. Real-time dashboards show child data processing metrics (volume, restriction compliance, retention compliance). The system supports the "best interests of the child" assessment as an automated consideration in decision-making. Parental consent refreshes automatically when processing activities change.

7. Evidence Requirements

Required artefacts:

Retention requirements:

Access requirements:

8. Test Specification

Test 8.1: Age Detection Accuracy

Test 8.2: Automatic Restriction Application

Test 8.3: Child-Specific Retention Enforcement

Test 8.4: Data Sharing Restriction

Test 8.5: Parental Consent Verification

Test 8.6: Cross-Border Age Threshold Application

Conformance Scoring

9. Regulatory Mapping

Regulation | Provision | Relationship Type
GDPR | Article 8 (Conditions Applicable to Child's Consent) | Direct requirement
GDPR | Recital 38 (Children's Specific Protection) | Direct requirement
COPPA (US) | 16 CFR Part 312 (Children's Online Privacy Protection Rule) | Direct requirement
UK Age Appropriate Design Code | Standards 1-15 | Direct requirement
EU AI Act | Article 9 (Risk Management — Vulnerable Groups) | Supports compliance
CCPA/CPRA | Section 1798.120(c) (Opt-In for Minors) | Direct requirement
LGPD (Brazil) | Article 14 (Processing of Children's Data) | Direct requirement
UN Convention on the Rights of the Child | Article 3 (Best Interests) and Article 16 (Privacy) | Supports compliance
NIST AI RMF | MAP 2.3 (Affected Communities) | Supports compliance

GDPR — Article 8 (Conditions Applicable to Child's Consent)

Article 8 requires that where consent is the lawful basis for processing, the consent of a child below the digital consent age is valid only with parental authorisation. AG-323's parental consent verification requirement directly implements this. The age threshold varies by member state (13-16), and AG-323's cross-border threshold application ensures correct implementation across jurisdictions.

UK Age Appropriate Design Code

The Code establishes 15 standards for online services "likely to be accessed by children." Key standards mapped to AG-323 include: Standard 2 (Data Protection Impact Assessment), Standard 3 (Age Appropriate Application), Standard 5 (Detrimental Use — prohibiting profiling for commercial purposes), Standard 8 (Data Minimisation), Standard 9 (Data Sharing), and Standard 12 (Profiling). The Code applies to any service likely to be accessed by children, not only services designed for children. AG-323's "likely to encounter" scope mirrors this broad applicability.

COPPA — 16 CFR Part 312

COPPA requires verifiable parental consent before collecting personal information from children under 13, limits collection to what is reasonably necessary, and requires reasonable data security measures. AG-323's parental consent verification, data minimisation for children, and structural restriction enforcement implement these requirements. The FTC's enforcement record demonstrates that COPPA violations attract substantial penalties — the 2019 YouTube settlement was USD 170 million.

LGPD — Article 14

Brazil's LGPD requires that processing of children's data be carried out in their best interest, with specific and prominent consent from at least one parent or legal guardian. AG-323's best-interests consideration and parental consent requirements align with this provision.

10. Failure Severity

Field | Value
Severity Rating | Critical
Blast Radius | Affected child population — with escalation to organisation-wide regulatory action and potential criminal liability

Consequence chain: Children's data protection failures attract the most severe regulatory responses across all jurisdictions. The ICO's GBP 12.7 million TikTok fine and the FTC's USD 520 million Epic Games settlement demonstrate the scale of penalties. Beyond financial penalties, children's data failures create reputational damage that is disproportionately severe — public perception of organisations that fail to protect children is uniquely negative. For AI agents, the risk is compounded by scale: an agent processing 100,000 child interactions per month without appropriate restrictions creates 100,000 violations per month. Profiling of children for commercial purposes without appropriate safeguards may constitute a "prohibited practice" under EU AI Act Article 5 in certain configurations, elevating the regulatory consequence to prohibition of the AI system. In some jurisdictions, wilful collection of children's data without parental consent carries criminal penalties for responsible individuals. The reputational and regulatory consequences of children's data failures are the most severe of any data protection category.

Cross-references: AG-059 (Data Classification & Sensitivity Labelling), AG-060 (Consent & Lawful Basis Verification), AG-061 (Data Subject Rights Execution), AG-063 (Privacy-by-Design Integration), AG-013 (Multi-Jurisdictional Compliance Mapping), AG-319 (Purpose-Consent Granularity Governance), AG-321 (Sensitive Attribute Inference Governance), AG-322 (Data Minimisation by Design Governance), AG-324 (Automated Profiling Notice Governance), AG-326 (Privacy Impact Assessment Trigger Governance).

Cite this protocol
AgentGoverning. (2026). AG-323: Children's Data Restriction Governance. The 783 Protocols of AI Agent Governance, AGS v2.1. agentgoverning.com/protocols/AG-323