AG-491

Dependency Provenance and SBOM Attestation Governance

Third-Party, Supply Chain & Open Source · AGS v2.1 · April 2026
EU AI Act · SOX · FCA · NIST · ISO 42001

2. Summary

Dependency Provenance and SBOM Attestation Governance requires that every organisation deploying AI agents maintains a complete, cryptographically verifiable inventory of all software dependencies, model artefacts, and transitive components consumed by those agents across every stage of the build, training, and deployment lifecycle. The inventory takes the form of a Software Bill of Materials (SBOM) that is machine-readable, continuously regenerated, and attested through signatures that bind the SBOM to the specific artefact it describes. Without this governance, an organisation cannot determine whether a deployed agent contains a compromised library, an unlicensed component, a recalled model weight, or a dependency that has been silently replaced by an adversary — creating an unquantifiable attack surface that grows with every untracked transitive dependency.

3. Example

Scenario A — Compromised Transitive Dependency Deployed Without Detection: A financial-value agent processes loan underwriting decisions for a mid-size bank. The agent's inference service depends on a machine learning framework that itself depends on 347 transitive packages. One of those packages — a tensor serialisation library at depth four in the dependency tree — is compromised through a maintainer account takeover. The attacker publishes a compromised patch release (3.2.18, superseding 3.2.17) containing an exfiltration payload that transmits serialised model inputs to an external endpoint. The bank's build pipeline uses a floating version range ("^3.2.0") and pulls the compromised version during a routine rebuild. Because no SBOM is generated or compared between builds, the new dependency version enters production without review. Over 19 days, the compromised library exfiltrates 14,200 loan application records including income, employment, and credit history data. The breach is discovered only when a network anomaly detector flags unusual egress traffic.

What went wrong: No SBOM was generated at build time, so no mechanism existed to detect the version change from 3.2.17 to 3.2.18. No cryptographic attestation bound the dependency inventory to the deployed artefact, so the organisation could not verify what was actually running. The floating version range allowed automatic adoption of the compromised version. Consequence: 14,200 customer records exfiltrated, regulatory notification to three jurisdictions, £2.3 million in breach remediation, regulatory fine of £890,000 for inadequate supply chain controls, and 8-month consent order requiring supply chain governance implementation.

Scenario B — Model Artefact Substitution During Transfer: A safety-critical agent controlling pharmaceutical manufacturing quality checks receives a quarterly model update. The updated model weights are transferred from the training environment to the production inference cluster via an internal file share. During transfer, a misconfigured access control on the file share allows a contractor's automated backup process to overwrite the model file with a stale version from three months earlier. No provenance attestation exists for the model artefact — the deployment pipeline checks only that a file exists at the expected path with a matching filename. The stale model is deployed and operates for 6 weeks before a quality assurance review detects that the model's rejection rate for defective batches has dropped by 34%, meaning defective pharmaceutical batches are passing quality control. Investigation reveals the model substitution.

What went wrong: The model artefact had no provenance record linking it to a specific training run, dataset, or evaluation outcome. No cryptographic hash or signature was verified at deployment time. The deployment pipeline treated any file at the correct path as the correct model. Consequence: 6 weeks of degraded pharmaceutical quality control, recall of 12 batches at a cost of £4.7 million, regulatory investigation by the medicines authority, and suspension of the automated quality control system pending full provenance implementation.
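The missing control in Scenario B is small. A minimal sketch of a deploy-time hash gate, assuming the approved digest was recorded when the model was signed off (function and variable names are illustrative, not a prescribed implementation):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file so large model weights need not fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def deploy_gate(model_path: Path, attested_sha256: str) -> None:
    """Refuse to deploy unless the artefact on disk matches its attested hash.
    A stale or substituted file (as in Scenario B) fails this check."""
    actual = sha256_of(model_path)
    if actual != attested_sha256:
        raise RuntimeError(
            f"model artefact mismatch: expected {attested_sha256}, got {actual}"
        )
```

Had the pipeline run such a gate instead of checking only the filename, the three-month-old backup would have been rejected at deployment rather than discovered six weeks later.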

Scenario C — Invisible Licence Violation in Dependency Chain: A public sector agent deployed by a government agency to process citizen benefit applications depends on a natural language processing library. That library's dependency tree includes 89 transitive packages, one of which uses a copyleft licence requiring that any derivative work be released under the same licence. The government agency's procurement rules prohibit copyleft-licensed software in citizen-facing systems due to intellectual property concerns. Because no SBOM inventories transitive dependency licences, the violation is not detected during procurement or deployment. Eighteen months later, an open-source advocacy organisation identifies the violation through reverse analysis of the agency's published container images. The agency must immediately decommission the agent, find an alternative library, retrain and revalidate the model, and redeploy — a process taking 5 months during which the benefit application system reverts to manual processing.

What went wrong: The SBOM either did not exist or did not capture licence metadata for transitive dependencies. Procurement review considered only the top-level library's licence, not the full dependency chain. No automated licence policy check was integrated into the build pipeline. Consequence: 5-month service disruption affecting 230,000 citizens, £1.8 million in emergency manual processing costs, parliamentary inquiry into procurement controls, and reputational damage to the digital transformation programme.

4. Requirement Statement

Scope: This dimension applies to every AI agent deployment where the agent's software stack, model artefacts, or runtime environment includes components sourced from third parties — whether commercial vendors, open-source projects, internal shared libraries, or pre-trained model repositories. The scope encompasses application-level dependencies (libraries, frameworks, SDKs), model artefacts (weights, embeddings, fine-tuning adapters, tokenizers), infrastructure components (container base images, runtime environments), and data pipeline components (preprocessing transforms, feature extractors). Any component not authored entirely within the deploying organisation's direct control is in scope. The scope extends to transitive dependencies at all depths — a dependency of a dependency is subject to the same provenance requirements as a direct dependency. Organisations that use managed services or platform-as-a-service offerings must obtain SBOMs from their providers or generate equivalent inventories through runtime analysis.

4.1. A conforming system MUST generate a machine-readable SBOM for every deployable artefact — application binaries, container images, model packages, and inference service bundles — at build time, capturing all direct and transitive dependencies with their exact versions, cryptographic hashes, and source locations.
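As a sketch of 4.1: real pipelines typically generate SBOMs with dedicated tooling, but the required shape of the output can be illustrated with a minimal CycloneDX-style document. The component name, version, and purl below are hypothetical, echoing Scenario A:

```python
import hashlib
import json

def component(name: str, version: str, purl: str, payload: bytes) -> dict:
    """One SBOM entry: exact version, content hash, and source location (purl)."""
    return {
        "type": "library",
        "name": name,
        "version": version,
        "purl": purl,  # package URL identifying the source registry
        "hashes": [{"alg": "SHA-256",
                    "content": hashlib.sha256(payload).hexdigest()}],
    }

def build_sbom(components: list) -> str:
    """Assemble a minimal CycloneDX-style document at build time."""
    doc = {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "version": 1,
        "components": components,
    }
    return json.dumps(doc, indent=2, sort_keys=True)

# Hypothetical component from Scenario A's dependency tree.
sbom = build_sbom([
    component("tensor-serde", "3.2.17",
              "pkg:pypi/tensor-serde@3.2.17", b"<package bytes>"),
])
```

The essential point is that the SBOM records the resolved, hashed components of the artefact actually shipped, not merely the ranges declared in a manifest.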

4.2. A conforming system MUST cryptographically sign each SBOM using an identity-bound signing mechanism and bind the signature to both the SBOM content and the specific artefact it describes, such that any modification to either the SBOM or the artefact invalidates the attestation.

4.3. A conforming system MUST verify SBOM attestation signatures at deployment time, rejecting any artefact whose SBOM signature is missing, invalid, expired, or does not match the artefact being deployed.
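Requirements 4.2 and 4.3 can be sketched together. A production system would use an asymmetric, identity-bound signing scheme; the stdlib HMAC below is a deliberate stand-in that still shows the essential binding: the tag covers the digests of both the SBOM and the artefact, so modifying either one invalidates the attestation. All values are illustrative.

```python
import hashlib
import hmac

def attest(sbom: bytes, artefact: bytes, signing_key: bytes) -> str:
    """Bind the SBOM to the artefact (requirement 4.2): the tag covers both
    digests, so a change to either invalidates the attestation."""
    bound = hashlib.sha256(sbom).digest() + hashlib.sha256(artefact).digest()
    return hmac.new(signing_key, bound, hashlib.sha256).hexdigest()

def verify_at_deploy(sbom: bytes, artefact: bytes,
                     tag: str, signing_key: bytes) -> bool:
    """Deployment gate (requirement 4.3): recompute the binding and compare in
    constant time; a missing or mismatched tag means the deploy is rejected."""
    expected = attest(sbom, artefact, signing_key)
    return hmac.compare_digest(expected, tag)
```

The same structure applies with real signatures: the signed payload must include the artefact digest, otherwise an attacker can pair a valid SBOM with a different artefact.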

4.4. A conforming system MUST include model artefacts in the SBOM or in a companion Model Bill of Materials (MBOM), recording the model's origin (training run identifier, dataset version, base model lineage), cryptographic hash, evaluation metrics at the time of approval, and any fine-tuning or adaptation applied.
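A minimal sketch of the MBOM record described in 4.4, assuming the provenance fields are captured at training time when they are still available. Every field value below is hypothetical:

```python
from dataclasses import dataclass, asdict
import hashlib
import json

@dataclass
class MBOMRecord:
    """Companion Model Bill of Materials entry (requirement 4.4)."""
    model_name: str
    sha256: str                 # content hash of the weight file
    training_run_id: str        # links the artefact to a specific run
    dataset_version: str
    base_model_lineage: list    # base model through all adaptation steps
    evaluation_metrics: dict    # metrics at the time of approval
    adaptations: list           # fine-tuning / adapter steps, in order

weights = b"<model weight bytes>"  # placeholder for the real weight file
record = MBOMRecord(
    model_name="underwriting-scorer",
    sha256=hashlib.sha256(weights).hexdigest(),
    training_run_id="run-2026-04-07-0312",
    dataset_version="loans-v14",
    base_model_lineage=["base-lm-7b", "domain-adapted-v2"],
    evaluation_metrics={"auc": 0.91, "defect_recall": 0.97},
    adaptations=["lora-finetune-2026-03"],
)
mbom_json = json.dumps(asdict(record), indent=2)
```

Recording the hash alongside the run identifier is what makes Scenario B detectable: any file that does not hash to the MBOM's digest is, by definition, not the approved model.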

4.5. A conforming system MUST continuously monitor all dependencies listed in production SBOMs against known vulnerability databases and security advisory feeds, triggering alerts within 24 hours of a new vulnerability disclosure affecting any listed component.
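In practice 4.5 means subscribing to advisory feeds and correlating them against production SBOMs; a local sketch of the correlation and the 24-hour check, with a hypothetical advisory ("ADV-0001") and the hypothetical package from Scenario A:

```python
from datetime import datetime, timedelta, timezone

def match_advisories(sbom_components: list, advisories: list) -> list:
    """Return an alert for every SBOM component named in an advisory whose
    affected version set includes the deployed version."""
    alerts = []
    for comp in sbom_components:
        for adv in advisories:
            if (comp["name"] == adv["package"]
                    and comp["version"] in adv["affected_versions"]):
                alerts.append({
                    "component": comp["name"],
                    "version": comp["version"],
                    "advisory": adv["id"],
                    "disclosed": adv["disclosed"],
                })
    return alerts

def within_sla(alert: dict, now: datetime,
               sla: timedelta = timedelta(hours=24)) -> bool:
    """Requirement 4.5: alerts must fire within 24 hours of disclosure."""
    return now - alert["disclosed"] <= sla

components = [{"name": "tensor-serde", "version": "3.2.18"}]
feed = [{
    "id": "ADV-0001",  # hypothetical advisory identifier
    "package": "tensor-serde",
    "affected_versions": {"3.2.18"},
    "disclosed": datetime(2026, 4, 1, 9, 0, tzinfo=timezone.utc),
}]
alerts = match_advisories(components, feed)
```

The key design point is that matching runs against the SBOMs of what is deployed, not against build manifests, so an advisory maps directly to affected production artefacts.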

4.6. A conforming system MUST detect and alert on any dependency that enters the deployed artefact without appearing in the previous build's SBOM — a "dependency diff" that highlights all additions, removals, and version changes between consecutive deployments.
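The dependency diff of 4.6 is a straightforward set comparison over consecutive SBOMs. A minimal sketch, with inputs simplified to name-to-version mappings (the package names are illustrative):

```python
def sbom_diff(previous: dict, current: dict) -> dict:
    """Compare component inventories of consecutive builds (requirement 4.6).
    Each argument maps component name -> version."""
    added = {n: v for n, v in current.items() if n not in previous}
    removed = {n: v for n, v in previous.items() if n not in current}
    changed = {
        n: (previous[n], current[n])
        for n in previous.keys() & current.keys()
        if previous[n] != current[n]
    }
    return {"added": added, "removed": removed, "changed": changed}
```

Run between the two builds in Scenario A, this check surfaces the 3.2.17 to 3.2.18 change as a reviewable event instead of a silent adoption.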

4.7. A conforming system MUST reject deployment of any artefact containing a dependency whose provenance cannot be verified — specifically, dependencies pulled from unattested sources, dependencies with no verifiable publisher identity, or dependencies whose cryptographic hashes do not match their declared source.

4.8. A conforming system SHOULD integrate licence metadata into the SBOM for every dependency at all transitive depths and enforce automated licence policy checks against the organisation's approved and prohibited licence lists during the build pipeline.
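A sketch of the licence policy check in 4.8, assuming the SBOM carries an SPDX licence identifier and a transitive depth for each component (the component names are hypothetical; the licence identifiers are real SPDX identifiers):

```python
def licence_violations(sbom_components: list, prohibited: set) -> list:
    """Flag every component, at any transitive depth, whose licence is on the
    organisation's prohibited list (requirement 4.8)."""
    return [
        {"name": c["name"], "depth": c.get("depth", 0), "licence": c["licence"]}
        for c in sbom_components
        if c["licence"] in prohibited
    ]
```

Because the check walks the full component list rather than only top-level dependencies, a copyleft licence at depth four in the tree (as in Scenario C) is caught in the build pipeline rather than eighteen months into production.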

4.9. A conforming system SHOULD implement reproducible builds or build environment attestation such that a given set of source inputs deterministically produces the same artefact, enabling independent verification that the SBOM accurately represents the artefact's contents.

4.10. A conforming system MAY implement runtime SBOM verification — periodic checks during operation that compare the running artefact's actual loaded libraries, model files, and dependencies against the attested SBOM, detecting runtime substitution or injection.
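The runtime verification of 4.10 can be sketched as a periodic sweep that re-hashes every file the SBOM attests and reports drift. A minimal sketch, assuming the attested inventory is available as a path-to-digest mapping:

```python
import hashlib
from pathlib import Path

def runtime_drift(attested: dict) -> list:
    """Periodic runtime check (requirement 4.10): re-hash every file the SBOM
    attests and report any that is missing or no longer matches.
    `attested` maps file path -> expected SHA-256 hex digest."""
    findings = []
    for path_str, expected in attested.items():
        path = Path(path_str)
        if not path.exists():
            findings.append({"path": path_str, "status": "missing"})
            continue
        actual = hashlib.sha256(path.read_bytes()).hexdigest()
        if actual != expected:
            findings.append({"path": path_str, "status": "modified",
                             "expected": expected, "actual": actual})
    return findings
```

Deploy-time verification establishes what entered production; this sweep detects substitution or injection that happens afterwards, closing the window that deploy-time checks alone leave open.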

5. Rationale

The software supply chain is the broadest and least visible attack surface for AI agent deployments. A modern AI agent's dependency graph routinely includes hundreds to thousands of components: machine learning frameworks, numerical computing libraries, serialisation utilities, HTTP clients, logging frameworks, data processing pipelines, tokenizers, embedding models, and dozens of their transitive dependencies. Each component represents a trust decision — the organisation is trusting that the component is what it claims to be, that it has not been tampered with, that it does not contain known vulnerabilities, and that its licence terms are compatible with the deployment context. Without a structured, verifiable inventory of these components, the organisation is making thousands of unverified trust decisions with every deployment.

The risk is not theoretical. Supply chain attacks against software dependencies have increased dramatically since 2020. The compromise of widely-used packages through maintainer account takeovers, typosquatting, and dependency confusion attacks has demonstrated that even well-known, heavily-used libraries can become attack vectors. For AI systems specifically, the supply chain risk extends beyond traditional software to model artefacts — pre-trained models, fine-tuning adapters, embedding models, and tokenizers are all artefacts that can be poisoned, substituted, or tampered with. A model artefact does not have the same transparency as source code; a poisoned model weight file looks identical to a legitimate one without cryptographic verification.

Regulatory expectations are converging on supply chain transparency. The EU AI Act's Article 15 requires robustness against attempts to alter the system's use or performance by exploiting system vulnerabilities, which explicitly includes supply chain manipulation. The EU Cyber Resilience Act mandates SBOMs for products with digital elements. The US Executive Order 14028 on cybersecurity requires SBOMs for software sold to the federal government. DORA Article 28 requires financial entities to manage ICT third-party risk, which includes the software components running on their infrastructure. ISO 42001 Clause 6.1 requires risk identification and treatment that encompasses supply chain risks. Organisations that cannot produce an SBOM for their AI agent deployments will face increasing regulatory friction across all major jurisdictions.

The SBOM is not merely an inventory — it is the evidentiary foundation for multiple downstream governance processes. Vulnerability management requires knowing what components are deployed. Licence compliance requires knowing what licences are in the dependency tree. Incident response requires knowing whether a compromised component is present in the organisation's deployments. Regulatory reporting requires demonstrating that the organisation knows what is running in its AI systems. Without the SBOM, all of these processes operate on incomplete information or assumptions. AG-491 establishes the SBOM as a first-class governance artefact with the same rigour applied to financial records or audit logs.

6. Implementation Guidance

Dependency Provenance and SBOM Attestation Governance requires organisations to treat every component in their AI agent stack as a governed artefact with a verifiable chain of custody from source to production. The implementation spans three domains: software dependency tracking, model artefact provenance, and continuous verification.

Recommended patterns:

- Generate the SBOM as a build-pipeline step, from the resolved dependency graph of the artefact actually being shipped, not from source manifests alone (4.1).
- Pin exact dependency versions with content hashes, and treat any version change as a reviewable event surfaced by the dependency diff (4.6).
- Sign SBOMs with an identity bound to the build system and verify the signature at the deployment gate (4.2, 4.3).
- Record model provenance in an MBOM at training time, when run identifiers, dataset versions, and evaluation metrics are still available (4.4).
- Correlate production SBOMs against vulnerability feeds continuously, not only at build time (4.5).

Anti-patterns to avoid:

- Floating version ranges (e.g. "^3.2.0") in production builds, which silently adopt new upstream releases (Scenario A).
- Treating a file's name or path as proof of identity instead of verifying its cryptographic hash (Scenario B).
- Reviewing only top-level dependency licences while ignoring the transitive tree (Scenario C).
- Generating SBOMs that are never verified, diffed, or monitored; an inventory nobody checks is not a control.

Industry Considerations

Financial Services. Financial regulators increasingly expect software supply chain transparency as part of operational resilience. DORA Article 28 on ICT third-party risk encompasses the software dependencies running financial processing systems. Financial institutions should implement SBOM requirements not only for AI agents but for all software components that influence financial decisions, creating a unified supply chain governance framework. The vulnerability monitoring requirement (4.5) is particularly critical — a compromised dependency in a financial processing agent could enable data exfiltration (Scenario A) or calculation manipulation.

Healthcare and Pharmaceutical. Model artefact provenance (Requirement 4.4) is especially critical in healthcare, where a model substitution (Scenario B) could directly affect patient safety. Medical device regulations (FDA, MDR) increasingly require software bill of materials for device software. Organisations should align their AI SBOM practices with existing medical device SBOM requirements to avoid maintaining parallel governance systems.

Public Sector. Government deployments face additional licence compliance requirements (Scenario C) because public procurement rules often restrict the use of certain licence types. SBOM-driven licence policy enforcement (Requirement 4.8) is essential for public sector deployments. Additionally, government deployments may be subject to national cybersecurity directives that mandate SBOMs, making AG-491 compliance a procurement prerequisite.

Crypto/Web3. Blockchain and decentralised application environments have unique supply chain risks including dependency confusion attacks targeting package registries, compromised smart contract libraries, and adversarial model weights distributed through decentralised storage. The cryptographic attestation requirements of AG-491 align well with the cryptographic verification culture of the Web3 ecosystem, but organisations must extend attestation to cover chain-specific components.

Maturity Model

Basic Implementation — The organisation generates machine-readable SBOMs at build time for all AI agent artefacts, covering direct and transitive software dependencies. SBOMs are cryptographically signed and verified at deployment. A dependency diff is generated between consecutive deployments. Known vulnerability databases are checked at build time. Model artefacts are hashed and the hash is recorded, but full model provenance (training run, dataset lineage) may be incomplete. This level meets the minimum mandatory requirements of 4.1 through 4.7.

Intermediate Implementation — All basic capabilities plus: full model provenance is recorded in an MBOM including training run identifiers, dataset versions, and evaluation metrics. Licence metadata is captured for all transitive dependencies and automated licence policy checks are enforced in the build pipeline. Continuous vulnerability monitoring correlates production SBOMs against advisory feeds with 24-hour alerting. Dependency diff reviews are mandatory for new dependencies and version changes. Build reproducibility is verified for at least the application layer.

Advanced Implementation — All intermediate capabilities plus: full build reproducibility or hermetic build attestation enables independent verification of artefact contents. Runtime SBOM verification periodically confirms that running components match the attested SBOM. Model artefact provenance covers the full lineage from base model through all fine-tuning and adaptation steps. The organisation participates in or consumes industry vulnerability sharing for AI-specific supply chain risks. SBOM data feeds into automated risk scoring that prioritises remediation based on deployment criticality and exposure.

7. Evidence Requirements

Required artefacts:

- Signed SBOMs for every deployable artefact, covering all direct and transitive dependencies (4.1, 4.2).
- MBOM records for every deployed model artefact (4.4).
- Deployment gate logs showing attestation verification outcomes, including rejections (4.3, 4.7).
- Dependency diff reports for consecutive deployments (4.6).
- Vulnerability alert records with disclosure and alert timestamps (4.5).
- Licence policy check results where 4.8 is implemented.

Retention requirements: SBOMs, MBOMs, and attestation records must be retained for at least the operational life of the artefact they describe, plus any longer period mandated by applicable regulation, so that incident investigations and regulatory requests can reconstruct what was deployed at any past point in time.

Access requirements: SBOM and attestation evidence must be producible on request to internal audit, external auditors, and regulators. Write access to attestation records must be restricted to the build pipeline identity so that their evidentiary value is preserved.

8. Test Specification

Test 8.1: SBOM Completeness Verification
Build an artefact with a known, seeded dependency tree; confirm the generated SBOM lists every direct and transitive component with its exact version, cryptographic hash, and source location (4.1).

Test 8.2: Attestation Signature Verification
Modify one byte of a signed SBOM and, separately, one byte of the artefact it describes; confirm that either modification invalidates the attestation (4.2).

Test 8.3: Deployment Gate Rejection of Unattested Artefacts
Attempt to deploy artefacts whose SBOM signature is, in turn, missing, invalid, expired, and mismatched; confirm each attempt is rejected (4.3).

Test 8.4: Dependency Diff Detection
Introduce an added, a removed, and a version-changed dependency between two consecutive builds; confirm all three appear in the dependency diff (4.6).

Test 8.5: Vulnerability Alerting Timeliness
Inject a synthetic advisory affecting a component listed in a production SBOM; confirm an alert is raised within 24 hours of disclosure (4.5).

Test 8.6: Unverifiable Dependency Rejection
Introduce a dependency from an unattested source and one whose hash does not match its declared source; confirm deployment is rejected in both cases (4.7).

Test 8.7: Model Artefact Provenance Verification
Substitute a deployed model file with a stale version; confirm the deployment pipeline detects the hash mismatch against the MBOM record and refuses to deploy (4.4).

Conformance Scoring

9. Regulatory Mapping

Regulation | Provision | Relationship Type
EU AI Act | Article 15 (Accuracy, Robustness and Cybersecurity) | Direct requirement
EU AI Act | Article 9 (Risk Management System) | Supports compliance
SOX | Section 404 (Internal Controls Over Financial Reporting) | Supports compliance
FCA SYSC | 6.1.1R (Systems and Controls) | Supports compliance
NIST AI RMF | MAP 3.4, MANAGE 2.2, GOVERN 1.7 | Supports compliance
ISO 42001 | Clause 6.1 (Actions to Address Risks and Opportunities) | Supports compliance
DORA | Article 28 (ICT Third-Party Risk) | Direct requirement

EU AI Act — Article 15 (Accuracy, Robustness and Cybersecurity)

Article 15(5) requires high-risk AI systems to be resilient against attempts to alter their use or performance by exploiting system vulnerabilities, including supply chain vulnerabilities. A compromised dependency (Scenario A) or a substituted model artefact (Scenario B) is a direct exploitation of a supply chain vulnerability. The SBOM and its cryptographic attestation provide the evidentiary mechanism for demonstrating that the organisation has implemented technical measures against supply chain manipulation. Without an SBOM, the organisation cannot demonstrate compliance with the supply chain resilience requirement of Article 15.

SOX — Section 404 (Internal Controls Over Financial Reporting)

For financial-value agents that influence financial reporting (transaction processing, risk calculation, loan underwriting), the integrity of the software executing those calculations is a material internal control. SOX auditors assess whether controls exist to ensure that software has not been tampered with and operates as intended. An SBOM with cryptographic attestation provides auditable evidence that the specific software components involved in financial processing have been inventoried, verified, and monitored. The dependency diff requirement (4.6) ensures that changes to financial processing software are detected and reviewed.

FCA SYSC — 6.1.1R (Systems and Controls)

The FCA requires firms to maintain adequate systems and controls for the management of risks, including technology risks. The FCA's operational resilience framework expects firms to know what software is running in their important business services and to manage the risks arising from third-party software components. An SBOM programme provides the foundational visibility that the FCA expects, and the continuous vulnerability monitoring requirement (4.5) demonstrates active risk management of the software supply chain.

NIST AI RMF — MAP 3.4, MANAGE 2.2, GOVERN 1.7

NIST AI RMF MAP 3.4 addresses the identification of risks arising from third-party data and components, directly corresponding to the SBOM requirement for dependency inventories. MANAGE 2.2 addresses the management of AI risks including those from the AI supply chain. GOVERN 1.7 addresses the processes for managing AI system components throughout their lifecycle. AG-491 provides the operational mechanism for implementing these functions — the SBOM is the artefact through which third-party component risks are identified and managed.

ISO 42001 — Clause 6.1 (Actions to Address Risks and Opportunities)

ISO 42001 requires organisations to identify and address risks associated with their AI management system, including risks arising from third-party components. The SBOM provides the structured risk identification mechanism for software supply chain risks. The continuous vulnerability monitoring requirement ensures that newly discovered risks in existing components are identified and addressed, fulfilling the standard's expectation of ongoing risk management.

DORA — Article 28 (ICT Third-Party Risk)

DORA Article 28 requires financial entities to manage risks arising from ICT third-party service providers and components. While DORA primarily addresses outsourcing and third-party services, the regulation's scope encompasses ICT components that financial entities depend upon. Software dependencies are ICT components. The SBOM provides the inventory that DORA requires financial entities to maintain of their ICT dependencies, and the continuous monitoring requirement aligns with DORA's expectation of ongoing third-party risk assessment.

10. Failure Severity

Field | Value
Severity Rating | Critical
Blast Radius | Organisation-wide — a compromised dependency or model artefact may be present in multiple agents across multiple business functions; the same transitive dependency can appear in dozens of deployed artefacts simultaneously

Consequence chain: Failure to maintain dependency provenance and SBOM attestation creates an unquantifiable attack surface across all deployed AI agents. The immediate technical consequence is blindness — the organisation does not know what software and model components are running in its AI systems. This blindness cascades into multiple failure modes. A compromised dependency enters production undetected (Scenario A), exfiltrating data or manipulating outputs for days or weeks before an unrelated detection mechanism discovers the breach. A model artefact is substituted without detection (Scenario B), degrading critical safety or quality controls for the time between substitution and discovery. A licence violation in a transitive dependency triggers forced decommissioning of a production system (Scenario C), causing service disruption. The organisation cannot respond effectively to vulnerability disclosures because it cannot determine which of its deployments contain the affected component, turning a routine patching exercise into an emergency investigation. The blast radius is organisation-wide because the same popular dependency may appear in dozens of deployed artefacts — a single compromised package affects every agent that depends on it. The regulatory consequence compounds the operational impact: regulators in financial services (DORA), healthcare (MDR), and AI governance (EU AI Act) increasingly require supply chain transparency, and the inability to produce an SBOM on regulatory request demonstrates a governance failure that invites enforcement action.

Cross-references: AG-006 (Tamper-Evident Record Integrity), AG-407 (Build Pipeline Attestation Governance), AG-489 (Open-Source Licence Policy Binding Governance), AG-490 (Maintainer Trust and Project Health Governance), AG-494 (Vendor Incident Disclosure Governance), AG-495 (Procurement Security Requirement Governance), AG-405 (Secure Model Artifact Transport Governance), AG-408 (Infrastructure Drift Detection Governance).

Cite this protocol
AgentGoverning. (2026). AG-491: Dependency Provenance and SBOM Attestation Governance. The 783 Protocols of AI Agent Governance, AGS v2.1. agentgoverning.com/protocols/AG-491