Erasure and Rectification Propagation Governance requires that when a data subject exercises the right to erasure ("right to be forgotten") or the right to rectification, the request is propagated to every system, data store, cache, log, model artefact, and agent context where the data subject's personal data exists, and that the erasure or rectification is verified as complete across all locations within the regulatory deadline. This dimension addresses a fundamental architectural challenge created by AI agents: personal data does not stay in one place. An AI agent retrieves data from a primary database, incorporates it into a context window, persists it to a conversation log, copies it to a vector store for retrieval-augmented generation, and may have used it as training data for fine-tuning. A rectification in the primary database that does not propagate to conversation logs, vector embeddings, and training data leaves the data subject's incorrect data persisting in systems that will continue to serve it. AG-061 requires that erasure and rectification be treated as distributed operations with verified completion across all data locations, not as single-system updates that leave copies and derivatives scattered across the agent ecosystem.
Scenario A — Erasure Request Does Not Reach Agent Conversation Logs: A customer of a financial services firm submits a GDPR Article 17 erasure request. The data protection team processes the request by deleting the customer's records from the CRM, the transaction database, and the marketing platform. However, the firm's AI customer service agent has accumulated 14 months of conversation logs containing the customer's name, account number, transaction history, and financial concerns. The conversation logs are stored in a separate logging infrastructure managed by the engineering team, not the data protection team. The erasure is reported as complete to the customer within the 30-day deadline. Eight months later, a data subject access request from the same individual reveals that the conversation logs still contain their personal data. The customer complains to the ICO.
What went wrong: The erasure process did not include a comprehensive data inventory covering all agent-specific data stores. The conversation logging infrastructure was outside the scope of the data protection team's erasure process. No verification mechanism confirmed that erasure was complete across all systems. Consequence: ICO enforcement action for incomplete erasure, personal data retained for 8 months after erasure confirmation, loss of customer trust, remediation cost of £210,000 to implement comprehensive data inventory and propagation mechanism, and requirement to re-contact the data subject to confirm re-erasure.
Scenario B — Rectification Not Propagated to Vector Store: A patient at a hospital corrects their medical record: their blood type was recorded as A+ when it is actually O-. The rectification is applied to the electronic health record (EHR) system. However, the hospital's AI clinical decision support agent uses a retrieval-augmented generation (RAG) architecture that retrieves relevant patient data from a vector store built from periodic EHR exports. The vector store was last refreshed 3 weeks before the rectification. For the next 5 weeks — until the next scheduled vector store refresh — the AI agent provides clinical decision support based on the incorrect blood type. During this period, the agent recommends a blood transfusion protocol that would be appropriate for A+ but is contraindicated for O-. A clinician catches the error before it causes harm, but the near-miss triggers a patient safety investigation.
What went wrong: Rectification in the primary system did not trigger an immediate update to the derived data store (vector store) used by the AI agent. The vector store refresh was scheduled rather than event-driven. No mechanism verified that rectifications in the primary system had propagated to all derived stores. Consequence: Near-miss patient safety incident, patient safety investigation, 6-month remediation programme to implement event-driven vector store updates, regulatory notification to the Care Quality Commission, and loss of clinical staff confidence in the AI system.
Scenario C — Erasure Impossible in Fine-Tuned Model: An AI agent for a recruitment platform has been fine-tuned on 850,000 candidate profiles, including names, work histories, educational backgrounds, and demographic information. A candidate exercises their right to erasure under GDPR Article 17. The data protection team deletes the candidate's profile from the recruitment database and application tracking system. However, the candidate's data was included in the fine-tuning dataset, and their information may be encoded in the model weights. The firm investigates and determines that removing a single individual's data from the model would require complete retraining at a cost of £1.8 million and 6 weeks of compute time. The firm has received 127 erasure requests in the past 12 months, each of which would technically require retraining.
What went wrong: The firm fine-tuned on personal data without considering the erasure obligation. No anonymisation was applied before fine-tuning (intersects with AG-060). No architecture was in place for efficient individual data removal (e.g., machine unlearning techniques). The firm faces an irreconcilable conflict between the right to erasure and the practical impossibility of erasing data from trained model weights. Consequence: Regulatory complaint for failure to complete erasure, legal advice that the model may need to be retrained or withdrawn, £1.2 million remediation cost, 9-month programme to implement anonymisation pipeline and explore machine unlearning, and precautionary decision to suspend the fine-tuned model pending resolution.
Scope: This dimension applies to all AI agents that process personal data, where "process" includes any operation from GDPR Article 4(2): collection, recording, organisation, structuring, storage, adaptation, alteration, retrieval, consultation, use, disclosure, combination, restriction, erasure, or destruction. The scope covers every location where an agent's processing may have caused personal data to exist: primary databases, conversation logs, context caches, session stores, vector stores, RAG knowledge bases, fine-tuning datasets, model checkpoints, embedding indices, search indices, backup systems, replicated databases, CDN caches, and any downstream system to which the agent has forwarded or disclosed personal data. The scope extends to inferred and derived data: if the agent generated an inference about a data subject (e.g., "Customer X is a high credit risk"), that inference is personal data about Customer X and is within scope for erasure and rectification. The scope also covers data held by processors and sub-processors to whom the agent has transmitted personal data.
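The breadth of the scope is easier to act on when the in-scope locations are enumerated explicitly per data subject. The following is a minimal Python sketch of such an inventory, assuming a simple in-memory registry; the location kinds, store names, and subject identifiers are illustrative, not prescribed by this dimension:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class LocationKind(Enum):
    """Kinds of in-scope data locations, including agent-specific stores."""
    PRIMARY_DB = auto()
    CONVERSATION_LOG = auto()
    CONTEXT_CACHE = auto()
    VECTOR_STORE = auto()
    FINE_TUNING_DATASET = auto()
    BACKUP = auto()
    DOWNSTREAM_RECIPIENT = auto()

@dataclass
class DataLocation:
    name: str
    kind: LocationKind
    subject_ids: set = field(default_factory=set)  # subjects whose data may exist here

def locations_for_subject(inventory, subject_id):
    """Return every registered location that may hold the subject's personal data."""
    return [loc for loc in inventory if subject_id in loc.subject_ids]

inventory = [
    DataLocation("crm", LocationKind.PRIMARY_DB, {"subj-42"}),
    DataLocation("agent-logs", LocationKind.CONVERSATION_LOG, {"subj-42", "subj-7"}),
    DataLocation("rag-index", LocationKind.VECTOR_STORE, {"subj-7"}),
]
print([loc.name for loc in locations_for_subject(inventory, "subj-42")])
# ['crm', 'agent-logs'] — the conversation log is in scope, not only the CRM
```

The point of the registry is that an erasure request queries it rather than relying on institutional memory of where data lives.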
4.1. A conforming system MUST maintain a comprehensive data inventory mapping every location where each data subject's personal data may exist as a result of agent processing, including primary stores, derived stores, logs, caches, model artefacts, and downstream recipients.
4.2. A conforming system MUST propagate erasure requests to every location identified in the data inventory within the regulatory deadline (one month under GDPR Article 12(3), extendable by up to two further months in complex cases).
4.3. A conforming system MUST propagate rectification requests to every location identified in the data inventory, ensuring that all copies and derivatives of the data subject's personal data reflect the corrected information.
4.4. A conforming system MUST verify that erasure and rectification have been completed across all locations, with a verification record documenting the completion status for each location.
4.5. A conforming system MUST ensure that erasure prevents future retrieval — deleted data must not be recoverable from backups, caches, or replicas through normal system operation after the erasure verification is complete.
4.6. A conforming system MUST address personal data in model artefacts (fine-tuned models, embeddings, vector stores) either through pre-ingestion anonymisation (preventing the problem) or through verified post-hoc removal techniques (machine unlearning, model retraining, or index rebuilding).
4.7. A conforming system SHOULD implement erasure and rectification as event-driven operations that trigger automatically upon receipt of a valid request, propagating to all registered data locations in parallel.
4.8. A conforming system SHOULD maintain a propagation status dashboard showing, for each active erasure or rectification request: the locations identified, the locations completed, the locations pending, and any locations where propagation has failed.
4.9. A conforming system SHOULD implement rectification propagation with latency targets appropriate to the risk context — clinical systems within 1 hour, financial systems within 4 hours, and general systems within 24 hours.
4.10. A conforming system MAY implement predictive data lineage that automatically identifies new data locations when agents are deployed or updated, ensuring the data inventory remains current without manual maintenance.
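Requirements 4.2, 4.4, 4.7, and 4.8 combine into a propagation engine that dispatches to all registered locations in parallel, records a per-location status, and only reports completion when every location confirms. A hedged Python sketch, with toy in-memory stores and a deliberately failing vector store; the store names and callables are assumptions for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def erase_vector_store(subject_id):
    raise ConnectionError("index offline")  # simulate an unreachable derived store

def propagate_erasure(subject_id, stores):
    """Dispatch erasure to every registered location in parallel (4.2, 4.7)
    and assemble a per-location verification record (4.4, 4.8)."""
    def erase(item):
        name, erase_fn = item
        try:
            erase_fn(subject_id)
            return name, "completed"
        except Exception as exc:  # a failed location must surface, not vanish
            return name, f"failed: {exc}"
    with ThreadPoolExecutor() as pool:
        statuses = dict(pool.map(erase, stores.items()))
    return {
        "subject": subject_id,
        "locations": statuses,
        "verified_complete": all(s == "completed" for s in statuses.values()),
    }

db = {"subj-42": {"name": "Alice"}}        # primary store
log = {"subj-42": ["transcript text"]}     # agent conversation log
stores = {
    "primary_db": db.pop,
    "conversation_log": log.pop,
    "vector_store": erase_vector_store,
}
cert = propagate_erasure("subj-42", stores)
print(cert["verified_complete"])  # False: the vector store did not confirm
```

The design choice to make `verified_complete` a derived property of the per-location statuses, rather than a flag set optimistically, is what prevents the Scenario A failure mode of reporting completion while a location was never reached.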
Erasure and Rectification Propagation Governance implements the data subject rights that are central to every modern data protection framework: the right to erasure (GDPR Article 17, CCPA Section 1798.105, LGPD Article 18(VI)) and the right to rectification (GDPR Article 16, LGPD Article 18(III)). These are not optional features — they are legal obligations that apply to every controller of personal data.
AI agents make these rights technically challenging to implement because they create a data proliferation effect. Traditional software systems have relatively predictable data flows: data enters through defined interfaces, is stored in known databases, and is accessible through controlled queries. AI agents create additional data locations that may not appear in a traditional data map: conversation logs that capture personal data verbatim, vector stores that encode personal data as embeddings, context caches that persist data between sessions, RAG knowledge bases that index personal data for retrieval, and fine-tuned models where personal data may be encoded in weights.
The rectification challenge is particularly acute. Rectifying a record in a primary database is straightforward. Rectifying the same data in a vector embedding is not — the embedding must be regenerated from the corrected source data. Rectifying data in a fine-tuned model may be practically impossible without retraining. Rectifying data in a conversation log may conflict with log integrity requirements. Each of these challenges has architectural solutions, but they must be designed into the system from the start. Retrofitting erasure and rectification propagation to a system that was not designed for it is expensive, disruptive, and often incomplete.
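Because an embedding cannot be edited in place, the only reliable rectification of a vector store is to regenerate the embedding from the corrected source record, triggered by the rectification event itself. A minimal Python sketch, with a toy `embed` function standing in for the real embedding model and a dict standing in for the index:

```python
vector_index = {}                             # record_id -> embedding
source_db = {"patient-9": "blood type: A+"}   # primary record (illustrative)

def embed(text):
    """Stand-in embedding: any deterministic function of the source text.
    In a real system this is the same model that built the index."""
    return tuple(ord(c) for c in text)

def index_record(record_id):
    vector_index[record_id] = embed(source_db[record_id])

def rectify(record_id, corrected):
    """Event-driven rectification: correcting the source immediately
    regenerates the derived embedding instead of waiting for a
    scheduled refresh."""
    source_db[record_id] = corrected
    index_record(record_id)

index_record("patient-9")
stale = vector_index["patient-9"]
rectify("patient-9", "blood type: O-")
print(vector_index["patient-9"] != stale)  # True: the embedding was regenerated
```

Had Scenario B's hospital wired the EHR rectification event to `rectify` rather than to a periodic export, the stale blood-type embedding would have lived for seconds, not weeks.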
The propagation requirement — ensuring that erasure and rectification reach all data locations — distinguishes AG-061 from simple data deletion. Many organisations can delete a record from a primary database in minutes. The challenge is ensuring that the deletion reaches every derived copy, cached version, logged instance, and embedded representation. Without propagation verification, the organisation believes it has complied with the erasure request while copies of the data continue to exist and be served to users.
AG-061 establishes the data lineage map as the central governance artefact. The lineage map traces each data subject's personal data from the point of collection through every transformation, copy, derivation, and storage location created by agent processing. The propagation engine uses the lineage map to identify all locations that must receive erasure or rectification requests.
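A lineage map of this kind is naturally a directed graph from each primary store to the copies and derivatives fed from it, and identifying propagation targets is a graph traversal. A sketch under assumed store names (the edges shown are illustrative):

```python
from collections import deque

# Lineage map: each location lists the derived locations fed from it.
lineage = {
    "crm": ["conversation_log", "vector_store", "analytics_replica"],
    "vector_store": ["rag_cache"],
    "analytics_replica": ["bi_dashboard_export"],
}

def propagation_targets(root):
    """Walk the lineage map from a primary store to every transitive copy
    and derivative, so the propagation engine knows where a request must go."""
    seen, queue = {root}, deque([root])
    while queue:
        node = queue.popleft()
        for child in lineage.get(node, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

print(sorted(propagation_targets("crm")))
# ['analytics_replica', 'bi_dashboard_export', 'conversation_log',
#  'crm', 'rag_cache', 'vector_store']
```

The transitive closure matters: a rectification that stops at `vector_store` without reaching the `rag_cache` derived from it recreates the Scenario B gap one level down.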
Recommended patterns:
Anti-patterns to avoid:
Financial Services. Financial services firms face a tension between erasure obligations and regulatory retention requirements. Anti-money laundering regulations (4AMLD, 5AMLD, 6AMLD, Bank Secrecy Act) require retention of transaction records for 5-10 years. GDPR Article 17(3)(b) exempts erasure where processing is necessary for compliance with a legal obligation. The implementation must distinguish between data categories: transaction records subject to AML retention are exempt from erasure, but marketing data, conversation logs, and agent-generated inferences are not. The system must apply the exemption granularly, not use it as a blanket justification to retain all data.
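Granular exemption application means partitioning the subject's data by category before acting, not deciding once for the whole request. A hedged sketch, where the category names and the AML-retained set are illustrative assumptions:

```python
# Categories exempt from erasure under a legal retention obligation
# (GDPR Art. 17(3)(b)); membership here is illustrative, not definitive.
AML_RETAINED = {"transaction_records", "kyc_documents"}

def partition_erasure_request(held_categories):
    """Apply the legal-obligation exemption per data category, never as a
    blanket justification: AML-retained categories are withheld, and
    everything else — marketing data, logs, inferences — is erased."""
    return {
        "erase": sorted(held_categories - AML_RETAINED),
        "retain": sorted(held_categories & AML_RETAINED),
    }

held = {"transaction_records", "marketing_preferences",
        "conversation_logs", "agent_inferences"}
print(partition_erasure_request(held))
# {'erase': ['agent_inferences', 'conversation_logs', 'marketing_preferences'],
#  'retain': ['transaction_records']}
```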
Healthcare. Medical records have long mandatory retention periods (typically 8-30 years depending on jurisdiction and record type). Rectification is particularly critical in healthcare because incorrect data can cause patient harm. AG-061's rectification propagation must be implemented with clinical-grade latency targets: a rectification to a blood type record that takes 5 weeks to propagate to the AI clinical decision support system is a patient safety risk. Healthcare implementations should target sub-1-hour propagation for safety-critical data elements.
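The risk-tiered latency targets from requirement 4.9 can be expressed as a small table that propagation monitoring checks observed latencies against. A sketch, with the tier names and thresholds taken from 4.9 and this section:

```python
from datetime import timedelta

# Risk-tiered rectification propagation targets (requirement 4.9).
LATENCY_TARGETS = {
    "safety_critical": timedelta(hours=1),   # clinical systems
    "financial": timedelta(hours=4),
    "general": timedelta(hours=24),
}

def within_target(tier, observed):
    """Check an observed rectification propagation latency against its tier."""
    return observed <= LATENCY_TARGETS[tier]

# Scenario B's blood-type correction took 5 weeks to reach the clinical store:
print(within_target("safety_critical", timedelta(weeks=5)))  # False
print(within_target("general", timedelta(hours=6)))          # True
```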
Public Sector. Public sector organisations face additional constraints from public records and archival requirements. Records that are designated as public archives may be exempt from erasure under GDPR Article 17(3)(d), but the exemption applies only where processing is necessary for archiving purposes in the public interest. Agent conversation logs are unlikely to qualify as public archives unless they have been specifically designated. The public sector must also consider the intersection with freedom of information: erased records that were previously disclosed under FOI cannot be "un-disclosed."
Basic Implementation — The organisation has a documented process for handling erasure and rectification requests. The process includes a manual data inventory identifying primary databases and major derived stores. Erasure is performed manually in each identified system, with the data protection team coordinating by email or ticket. Verification is by self-attestation from each system owner. Conversation logs and vector stores may not be included in the inventory. Average completion time: 15-25 business days. This level meets minimum procedural requirements but is error-prone: manual coordination means systems are missed, verification is inconsistent, and the process does not scale beyond low request volumes (under 20 per month).
Intermediate Implementation — Erasure and rectification are implemented as automated event-driven operations. A data lineage registry tracks all locations where personal data exists. The propagation engine dispatches requests to all registered locations in parallel and tracks completion. Verification is automated: each data store confirms deletion and the propagation engine assembles a completion certificate. Conversation logs, vector stores, and context caches are included in the registry. Rectification propagates to all locations within 24 hours. Average completion time: 2-5 business days. The system handles 100+ requests per month without manual coordination.
Advanced Implementation — All intermediate capabilities plus: real-time data lineage tracking that automatically registers new data locations when agents create them; rectification propagation with risk-based latency targets (1 hour for safety-critical, 4 hours for financial, 24 hours for general); cryptographic verification of erasure completion; machine unlearning or blocklist mechanisms for fine-tuned models; automated exemption assessment that applies regulatory retention exemptions granularly; and independent annual audit of propagation completeness verifying that no data locations are missed. The organisation can demonstrate, for any historical erasure request, the complete chain of evidence from request receipt to verified completion across all data locations.
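The blocklist mechanism mentioned above can be sketched as an output-side filter. It is worth being explicit about what it does and does not do: it suppresses an erased subject's data in model responses, but it does not remove the data from the weights, which still requires unlearning or retraining under requirement 4.6. The subject name below is hypothetical:

```python
import re

erased_subjects = {"Jane Example"}  # hypothetical subject with a verified erasure

def filter_output(model_text):
    """Output-side blocklist for a fine-tuned model: redact erased subjects'
    names from responses. A stop-gap pending unlearning or retraining —
    the underlying data may still be encoded in the model weights."""
    for name in erased_subjects:
        model_text = re.sub(re.escape(name), "[REDACTED]", model_text)
    return model_text

print(filter_output("Candidate Jane Example has 7 years of experience."))
# Candidate [REDACTED] has 7 years of experience.
```

A real deployment would match more than exact names (aliases, account numbers, paraphrases), which is why this pattern is a containment measure, not a substitute for the pre-ingestion anonymisation that AG-060 calls for.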
Required artefacts:
Retention requirements:
Access requirements:
Testing AG-061 compliance requires verifying that erasure and rectification propagate completely across all data locations where personal data exists.
Test 8.1: Erasure Propagation Completeness
Test 8.2: Rectification Propagation Completeness
Test 8.3: Conversation Log Erasure
Test 8.4: Vector Store and Embedding Erasure
Test 8.5: Propagation Failure Handling
Test 8.6: Data Inventory Completeness
Test 8.7: Regulatory Exemption Application
| Regulation | Provision | Relationship Type |
|---|---|---|
| GDPR | Article 16 (Right to Rectification) | Direct requirement |
| GDPR | Article 17 (Right to Erasure) | Direct requirement |
| GDPR | Article 19 (Notification Obligation Regarding Rectification or Erasure) | Direct requirement |
| UK GDPR | Articles 16, 17, 19 (as retained) | Direct requirement |
| EU AI Act | Article 10 (Data and Data Governance) | Supports compliance |
| CCPA/CPRA | Section 1798.105 (Right to Delete), Section 1798.106 (Right to Correct) | Direct requirement |
| LGPD (Brazil) | Article 18(IV) (Anonymisation, Blocking, or Deletion), Article 18(III) (Correction) | Direct requirement |
| POPIA (South Africa) | Section 24 (Correction of Personal Information) | Direct requirement |
| HIPAA | 45 CFR 164.526 (Right to Amend) | Supports compliance |
| ISO 42001 | Clause 6.1 (Actions to Address Risks) | Supports compliance |
Article 17 establishes the right to erasure ("right to be forgotten"), requiring controllers to erase personal data without undue delay when certain conditions are met (consent withdrawn, data no longer necessary, unlawful processing, etc.). For AI agents, the obligation extends to all locations where the agent's processing has caused personal data to exist. The CJEU's Google Spain ruling (C-131/12) established that the right to erasure applies even to secondary processing (search indexing); by extension, it applies to AI agent conversation logs, vector stores, and other derived data locations. AG-061 directly implements the propagation requirement implicit in Article 17: erasure must reach every location where the data exists, not only the primary database.
Article 19 requires controllers to communicate any rectification or erasure to each recipient to whom the personal data has been disclosed, unless this proves impossible or involves disproportionate effort. For AI agents that forward data to downstream systems or share data with other agents, this means propagating the erasure or rectification notification to each recipient. AG-061's propagation engine implements this requirement by dispatching requests to all downstream recipients identified in the data lineage map.
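The Article 19 obligation can be sketched as a notification step that iterates over the recipients recorded in the lineage map and records any claimed exemption rather than silently skipping a recipient. The recipient names and the `disproportionate_effort` flag are illustrative assumptions:

```python
def notify_recipients(subject_id, action, recipients):
    """Build Article 19 notifications for every downstream recipient in the
    lineage map, recording any 'impossible or disproportionate effort'
    exemption claimed so the decision is auditable."""
    return [{
        "recipient": r["name"],
        "subject": subject_id,
        "action": action,  # "erasure" or "rectification"
        "exempt": r.get("disproportionate_effort", False),
    } for r in recipients]

recipients = [
    {"name": "credit-bureau"},
    {"name": "archived-partner-feed", "disproportionate_effort": True},
]
for n in notify_recipients("subj-42", "erasure", recipients):
    print(n["recipient"], "exempt" if n["exempt"] else "notified")
# credit-bureau notified
# archived-partner-feed exempt
```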
CCPA Section 1798.105 establishes the consumer's right to request deletion of personal information. The CPRA added Section 1798.106 establishing the right to correct inaccurate personal information. Under CCPA, a business must direct any service providers and contractors that have received the consumer's personal information to delete it as well. AG-061 implements this by propagating deletion requests to all data locations, including those managed by processors and sub-processors.
HIPAA's right to amend (45 CFR 164.526) requires covered entities to amend protected health information when requested by the individual, with certain exceptions. Unlike GDPR's rectification right, HIPAA allows the covered entity to deny the amendment if it determines the information is already correct — but if the amendment is accepted, it must be propagated to persons the entity knows have the information and who may have relied on it. For AI agents in healthcare, this means rectification must propagate to all clinical decision support systems, RAG knowledge bases, and care coordination platforms where the incorrect data may influence patient care.
| Field | Value |
|---|---|
| Severity Rating | Critical |
| Blast Radius | Individual to organisation-wide — a single incomplete erasure is an individual rights violation; systematic propagation failure across all erasure requests is an organisation-wide compliance failure |
Consequence chain: Incomplete erasure propagation means that personal data continues to exist in systems after the data subject has been told it was deleted. The immediate consequence is a rights violation: the data subject exercised a legal right and the organisation failed to honour it. The regulatory consequence under GDPR is enforcement action — the right to erasure is a core data subject right, and systematic failure to implement it demonstrates a breach of the storage limitation principle under Article 5(1)(e) and inadequate technical measures under Article 25. The reputational consequence is severe: data subjects who discover that their data was not erased despite confirmation lose trust in the organisation entirely. For rectification, the consequences can be more immediate and dangerous: incorrect personal data in AI decision-making systems (clinical, financial, employment) can cause direct harm to the data subject. A patient whose blood type rectification does not propagate to the AI clinical decision support system faces a patient safety risk. An employee whose performance data rectification does not propagate to the AI workforce planning system faces potential career harm from decisions based on incorrect data. The organisational consequence of systematic propagation failure is a data protection authority investigation that may reveal broader compliance gaps, leading to supervisory measures including processing restrictions.
Cross-references: AG-013 (Data Sensitivity and Exfiltration Prevention) provides the data classification that enables comprehensive data inventories; AG-060 (Data Minimisation and Retention Governance) reduces the propagation surface by limiting where personal data exists; AG-059 (Lawful Basis and Consent Enforcement) governs the legal basis under which data was collected, which may affect erasure exemptions; AG-047 (Cross-Jurisdiction Compliance Governance) addresses jurisdiction-specific variations in erasure and rectification rights; AG-049 (Governance Decision Explainability) supports transparency when erasure exemptions are applied.