AG-489

Open-Source Licence Policy Binding Governance

Third-Party, Supply Chain & Open Source · ~20 min read · AGS v2.1 · April 2026
EU AI Act · SOX · FCA · NIST · ISO 42001

2. Summary

Open-Source Licence Policy Binding Governance requires that every organisation deploying AI agents maintains a formally enforced policy that maps open-source licence obligations to permissible use contexts, prevents incorporation of components whose licences conflict with the agent's deployment model, and continuously validates compliance as dependencies evolve. Open-source libraries are foundational to modern AI agent stacks — inference runtimes, vector databases, orchestration frameworks, embedding models, and tool integrations routinely depend on dozens to hundreds of open-source packages, each carrying licence terms that impose obligations ranging from attribution notices to full source disclosure. This dimension mandates that licence obligations are treated as binding governance constraints, not informal guidance, and that violations are detected and blocked before deployment rather than discovered retroactively through litigation or regulatory action.

3. Example

Scenario A — Copyleft Contamination in a Proprietary Agent Product: A fintech company builds a customer-facing AI agent that provides investment portfolio recommendations. The development team integrates an open-source natural language processing library licensed under AGPL-3.0 to improve entity extraction from financial documents. AGPL-3.0 requires that any software interacting with users over a network must make its complete source code available. The agent is deployed as a proprietary SaaS product serving 14,000 customers. Seven months after deployment, a competitor files a complaint with the Software Freedom Conservancy. Legal review confirms the AGPL obligation applies. The company faces a choice: release the entire agent's source code (including proprietary trading algorithms worth an estimated £4.2 million in development costs) or remove the AGPL component and rebuild the entity extraction pipeline. The rebuild takes 4 months and costs £680,000 in engineering time, plus £190,000 in legal fees. During the remediation period, the competitor gains market share worth approximately £1.3 million in annualised revenue.

What went wrong: No licence policy existed that classified AGPL-3.0 as incompatible with proprietary SaaS deployment. The development team evaluated the library on technical merit alone. No automated licence scanner was integrated into the build pipeline. The AGPL component entered the dependency tree through a transitive dependency — the team installed a document parsing library (MIT-licensed) that depended on the AGPL NLP library. The transitive dependency was invisible without SBOM analysis.

Scenario B — Attribution Violations Trigger Public Sector Contract Termination: A government agency contracts an AI systems integrator to deploy an internal workflow agent for benefits eligibility determination. The agent incorporates 47 open-source components, 31 of which carry attribution requirements (Apache-2.0, BSD-2-Clause, MIT). The integrator's deployment package includes no NOTICE file, no attribution documentation, and no licence texts. The agency's procurement audit — triggered by a routine annual review — discovers the missing attributions. The contract terms require compliance with all applicable intellectual property obligations. The agency issues a cure notice with a 30-day remediation window. The integrator scrambles to reconstruct attribution information for all 47 components, but discovers that 3 components were incorporated from forks whose licence terms were modified by the fork maintainers. The modified terms include a non-commercial-use restriction incompatible with the government's operational use. The integrator must replace the 3 components, re-test the agent, and redeploy. Total remediation cost: £145,000. The agency imposes a £60,000 contractual penalty and places the integrator on a 12-month enhanced oversight programme, restricting their eligibility for future contracts.

What went wrong: The integrator had no licence policy, no attribution tracking process, and no fork-awareness in their dependency management. Attribution obligations were treated as administrative overhead rather than contractual requirements. Forked repositories with modified licence terms were not detected because the integrator relied on the original repository's licence without verifying the fork's terms.

Scenario C — Dual-Licence Misinterpretation Causes Crypto Agent Shutdown: A decentralised finance (DeFi) protocol deploys an AI agent for automated liquidity pool management. The agent uses a database engine offered under a dual licence: the source-available SSPL for community use and a commercial licence for production deployment. The development team selects the SSPL option, interpreting "community use" to include their deployment because the DeFi protocol is "community-governed." SSPL requires any service offering the software's functionality to third parties to release the entire service stack's source code. The liquidity management agent serves 8,200 wallet holders — unambiguously third-party service provision. A legal cease-and-desist from the database vendor arrives 5 months after launch. The protocol must either purchase a commercial licence (£340,000 annual fee, with retroactive licence fees of £170,000 for the unlicensed period) or migrate to an alternative database, requiring 6 weeks of downtime during which £23 million in locked liquidity cannot be actively managed. The protocol votes for the commercial licence, costing £510,000 total.

What went wrong: The dual-licence terms were misinterpreted. No legal review of the SSPL was conducted before adoption. The licence policy did not classify SSPL or identify its service-provision trigger. The "community-governed" rationale was an engineering assumption, not a legal determination. No pre-deployment licence review gate existed.

4. Requirement Statement

Scope: This dimension applies to every AI agent deployment that incorporates, links to, distributes, or provides network access to open-source software components, including inference engines, model weights with open-source licences, orchestration frameworks, database engines, API client libraries, utility libraries, and any transitive dependencies thereof. The scope includes components incorporated at build time, components loaded at runtime, components fetched dynamically during operation (e.g., tool plugins from open registries), and model weights whose licence terms impose conditions on derivative works or commercial use. An agent that uses no open-source components whatsoever is exempt, but this is vanishingly rare in practice — virtually every AI agent stack includes open-source dependencies. The scope extends to all environments: development, staging, testing, and production. Licence violations in any environment create legal exposure.

4.1. A conforming system MUST maintain a canonical licence policy that classifies every open-source licence encountered in the dependency tree into one of at least three categories: permitted (no restrictions that conflict with the agent's deployment model), restricted (permitted subject to specific conditions that must be satisfied), and prohibited (incompatible with the agent's deployment model or organisational policy). The policy MUST be versioned with change history and approved by both legal counsel and engineering leadership.
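The three-category policy in 4.1 can be sketched as a versioned lookup table. The entries, version string, and classification choices below are hypothetical illustrations, not a recommended policy; real classifications require legal review of the agent's deployment model:

```python
from enum import Enum

class Classification(Enum):
    PERMITTED = "permitted"
    RESTRICTED = "restricted"
    PROHIBITED = "prohibited"

# Hypothetical policy snapshot; real entries and the version identifier
# come from legal counsel and engineering leadership approval (4.1).
POLICY_VERSION = "2026.04-r3"
POLICY = {
    "MIT": Classification.PERMITTED,
    "Apache-2.0": Classification.RESTRICTED,    # attribution and NOTICE obligations
    "BSD-2-Clause": Classification.RESTRICTED,  # attribution obligations
    "GPL-3.0-only": Classification.PROHIBITED,  # copyleft conflicts with proprietary SaaS
    "AGPL-3.0-only": Classification.PROHIBITED, # network-interaction source disclosure
    "SSPL-1.0": Classification.PROHIBITED,      # service-provision source disclosure
}

def classify(spdx_id: str) -> Classification:
    """Look up a licence by SPDX identifier; unknown identifiers fail closed
    and are treated as prohibited pending review."""
    return POLICY.get(spdx_id, Classification.PROHIBITED)
```

The fail-closed default for unrecognised identifiers anticipates the blocking behaviour mandated in 4.2.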

4.2. A conforming system MUST perform automated licence detection on every direct and transitive dependency at build time, comparing each detected licence against the canonical licence policy, and MUST block build completion when a prohibited licence is detected or when a licence cannot be identified.
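The build-blocking check in 4.2 reduces to a gate function over the resolved dependency set. This is a minimal sketch under assumed package names; a real implementation would consume the output of a licence scanner rather than a hand-written dict:

```python
# Assumed policy sets; in practice these derive from the canonical policy (4.1).
PROHIBITED = {"AGPL-3.0-only", "SSPL-1.0", "GPL-3.0-only"}
KNOWN = PROHIBITED | {"MIT", "Apache-2.0", "BSD-2-Clause", "ISC"}

def licence_gate(dependencies: dict) -> list:
    """Return blocking reasons for a build; an empty list means the build
    may proceed. Unidentifiable licences fail closed, per the MUST in 4.2."""
    blocked = []
    for pkg, spdx_id in sorted(dependencies.items()):
        if spdx_id in PROHIBITED:
            blocked.append(f"{pkg}: prohibited licence {spdx_id}")
        elif spdx_id not in KNOWN:
            blocked.append(f"{pkg}: unidentified licence {spdx_id!r} (fail closed)")
    return blocked
```

A CI job would fail the build whenever `licence_gate` returns a non-empty list.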

4.3. A conforming system MUST maintain a complete and current inventory of all open-source components incorporated in each deployed agent, including component name, version, licence identifier (SPDX where available), source repository, and the dependency path through which the component was incorporated (direct or transitive via which parent).
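The inventory fields required by 4.3 map naturally onto an immutable record per component. The component names, version, and repository URL below are hypothetical, echoing Scenario A's transitive dependency:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ComponentRecord:
    name: str
    version: str
    spdx_id: str            # SPDX identifier where available
    source_repo: str
    dependency_path: tuple  # root agent -> ... -> this component

record = ComponentRecord(
    name="agpl-nlp",
    version="2.3.1",
    spdx_id="AGPL-3.0-only",
    source_repo="https://example.org/agpl-nlp",  # hypothetical URL
    dependency_path=("portfolio-agent", "doc-parser", "agpl-nlp"),
)

def is_transitive(rec: ComponentRecord) -> bool:
    """Direct dependencies have a two-element path (agent -> component);
    anything longer entered the tree transitively."""
    return len(rec.dependency_path) > 2
```

Recording the full `dependency_path` is what makes the Scenario A failure mode (an invisible transitive AGPL component) detectable by inspection.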

4.4. A conforming system MUST generate and distribute attribution documentation (NOTICE files, licence texts, copyright notices) for every deployed agent in accordance with the obligations of each incorporated component's licence, and MUST verify the completeness of this documentation before each release.

4.5. A conforming system MUST implement a pre-deployment licence review gate that prevents any agent from being deployed to production if the licence compliance check has not passed within a defined recency window (recommended: no older than 7 days for continuous deployment, no older than 30 days for periodic release cycles).
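The recency-window gate in 4.5 is a simple timestamp comparison. A sketch using the recommended windows (7 days continuous, 30 days periodic):

```python
from datetime import datetime, timedelta, timezone

def gate_passes(last_passing_check: datetime, continuous_deployment: bool) -> bool:
    """Allow deployment only if the last passing licence compliance check
    falls inside the recency window recommended in 4.5."""
    window = timedelta(days=7 if continuous_deployment else 30)
    return datetime.now(timezone.utc) - last_passing_check <= window
```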

4.6. A conforming system MUST detect and evaluate licence changes in upstream dependencies — including licence reclassification by maintainers, licence modifications in forks, and dual-licence term changes — within 72 hours of the change being published and MUST trigger a re-evaluation of affected agents against the canonical policy.
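The 72-hour detection requirement in 4.6 can be expressed as a service-level check comparing the upstream publication time against the local detection time:

```python
from datetime import datetime, timedelta, timezone

DETECTION_SLA = timedelta(hours=72)  # the MUST window in 4.6

def within_sla(change_published: datetime, change_detected: datetime) -> bool:
    """True if an upstream licence change was detected within the 72-hour
    window, at which point re-evaluation of affected agents is triggered."""
    return change_detected - change_published <= DETECTION_SLA
```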

4.7. A conforming system MUST record the legal rationale for every licence classification in the canonical policy, including the specific deployment characteristics that make a licence compatible or incompatible, and MUST update these rationales when the agent's deployment model changes.

4.8. A conforming system SHOULD implement continuous licence monitoring for runtime-fetched dependencies (tool plugins, dynamically loaded models, API-accessed components) that validates licence compliance before each runtime incorporation.

4.9. A conforming system SHOULD maintain a pre-approved component catalogue — a curated list of open-source components that have passed full licence review — to accelerate development while maintaining compliance. Components not in the catalogue trigger mandatory review before adoption.

4.10. A conforming system SHOULD implement licence compatibility analysis for multi-component interactions, detecting cases where individually permitted licences create conflicts when combined (e.g., GPL-2.0-only combined with Apache-2.0 in certain linking configurations).
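The pairwise analysis in 4.10 can be sketched with a conflict table. The single entry below is an assumed simplification: real compatibility depends on licence version, linking mode, and jurisdiction, and needs legal review:

```python
from itertools import combinations

# Assumed, simplified conflict set; GPL-2.0-only and Apache-2.0 are widely
# regarded as incompatible in combined works, but this table is illustrative.
INCOMPATIBLE_PAIRS = {
    frozenset({"GPL-2.0-only", "Apache-2.0"}),
}

def combined_conflicts(licences: set) -> list:
    """Flag licence pairs that conflict when combined, even though each may
    be individually permitted by the canonical policy."""
    conflicts = []
    for a, b in combinations(sorted(licences), 2):
        if frozenset({a, b}) in INCOMPATIBLE_PAIRS:
            conflicts.append({a, b})
    return conflicts
```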

4.11. A conforming system MAY implement automated licence obligation fulfilment — generating NOTICE files, licence bundles, and source code offers programmatically from the component inventory and licence policy.
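The automated fulfilment in 4.11 follows directly from the component inventory in 4.3. A minimal NOTICE-file generator, assuming the inventory dict keys shown (they are not a standard schema):

```python
def build_notice(components: list) -> str:
    """Assemble a NOTICE file body from the component inventory.
    Each component dict is assumed to carry name, version, spdx_id,
    and copyright fields."""
    lines = ["NOTICE", "This product includes the following open-source components:", ""]
    for c in sorted(components, key=lambda c: c["name"]):
        lines.append(f"{c['name']} {c['version']} ({c['spdx_id']})")
        lines.append(f"  {c['copyright']}")
        lines.append("")
    return "\n".join(lines)

# Hypothetical inventory entry for illustration.
notice = build_notice([
    {"name": "doc-parser", "version": "1.4.0", "spdx_id": "MIT",
     "copyright": "Copyright (c) 2024 Example Authors"},
])
```

Generating NOTICE content from the inventory, rather than by hand, is what prevents the missing-attribution failure in Scenario B.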

5. Rationale

Open-source software is the substrate of the modern AI agent stack. Inference runtimes, embedding models, vector stores, orchestration frameworks, retrieval-augmented generation pipelines, tool-calling libraries, serialisation formats, and cryptographic primitives are overwhelmingly delivered as open-source components. A typical enterprise AI agent deployment incorporates between 200 and 2,000 open-source packages when transitive dependencies are counted. Each package carries licence terms — legal obligations that govern how the software may be used, modified, distributed, and offered as a service.

The governance challenge is that licence obligations are not optional, advisory, or aspirational. They are legally binding conditions imposed by copyright holders. Violation of open-source licence terms creates legal exposure identical in kind to violation of any other software licence: copyright infringement claims, injunctive relief (forced removal of the component), damages, and in some jurisdictions, criminal penalties. The unique risk in the AI agent context is threefold.

First, the dependency tree is deep and opaque. A development team that consciously selects an MIT-licensed orchestration framework may be unaware that the framework transitively depends on a GPL-3.0-licensed parser, an AGPL-3.0-licensed analytics module, and an SSPL-licensed database driver. Licence obligations propagate through transitive dependencies, and the most restrictive licence in the dependency tree determines the obligations for the entire work (in copyleft licence families). Without automated detection across the full transitive tree, copyleft contamination is virtually undetectable by manual review.
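The transitive-contamination problem can be sketched as a graph walk. The package names and licences below are hypothetical, mirroring Scenario A's dependency chain:

```python
# Hypothetical dependency graph: package -> (SPDX licence, direct dependencies).
GRAPH = {
    "portfolio-agent": ("LicenseRef-Proprietary", ["orchestrator", "doc-parser"]),
    "orchestrator":    ("MIT", []),
    "doc-parser":      ("MIT", ["agpl-nlp"]),
    "agpl-nlp":        ("AGPL-3.0-only", []),
}

def reachable_paths(root: str) -> dict:
    """Map every reachable package to its path from the root, surfacing
    licences that a direct-dependency-only review cannot see."""
    paths, stack = {}, [(root, [root])]
    while stack:
        pkg, path = stack.pop()
        if pkg in paths:
            continue
        paths[pkg] = path
        for dep in GRAPH[pkg][1]:
            stack.append((dep, path + [dep]))
    return paths

paths = reachable_paths("portfolio-agent")
# The copyleft component surfaces only through the transitive path
# portfolio-agent -> doc-parser -> agpl-nlp.
```

Note that `agpl-nlp` is not a direct dependency of the agent at all; only the full-tree walk exposes it.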

Second, the deployment model determines licence impact. The same component may be fully compliant in an internal development tool (where GPL source disclosure obligations are satisfied by internal availability) and catastrophically non-compliant in a customer-facing SaaS agent (where AGPL network interaction provisions trigger full source disclosure). AI agents increasingly operate as network-accessible services, which triggers the broadest interpretation of copyleft obligations. An organisation that moves an agent from internal use to customer-facing deployment without re-evaluating licence compliance may inadvertently trigger obligations that require disclosing proprietary source code.

Third, the AI supply chain introduces novel licence categories. Model weights are increasingly distributed under custom licences (e.g., Llama Community Licence, Mistral licence terms) that impose use restrictions not present in traditional software licences — restrictions on commercial use, use in specific domains, use above revenue thresholds, or use in specific geographies. These licences do not map cleanly to traditional open-source licence taxonomies and require bespoke legal analysis. An organisation that treats model weight licences identically to software licences will miss critical restrictions.

The regulatory context reinforces the governance requirement. The EU AI Act's transparency obligations (Article 13) require disclosure of components used in high-risk AI systems. SOX Section 404 requires that internal controls over financial reporting are effective, which includes ensuring that the software implementing those controls is legally deployable. The EU Cyber Resilience Act (anticipated enforcement) will impose SBOM and licence documentation obligations on products with digital elements. Organisations that lack licence governance today will face mandatory compliance requirements within 24 months across multiple jurisdictions.

The preventive control type is deliberate. Licence violations are far cheaper to prevent than to remediate. Scenario A illustrates: £680,000 in rebuild costs plus £190,000 in legal fees versus an estimated £15,000 annual cost for automated licence scanning and policy enforcement. The return on prevention exceeds 50:1.

6. Implementation Guidance

Open-Source Licence Policy Binding Governance requires a systematic approach that connects legal analysis (licence classification) to engineering enforcement (build-time and runtime checks) and operational maintenance (continuous monitoring for licence changes). The core principle is that licence compliance is a continuous obligation, not a point-in-time assessment.

Recommended patterns:

- Integrate automated licence scanning into the build pipeline so every direct and transitive dependency is checked against the canonical policy on every build.
- Record the full dependency path for each component so transitive copyleft components are visible before they reach production.
- Maintain a pre-approved component catalogue and route everything outside it through mandatory review.
- Treat unidentified licences as prohibited until reviewed (fail closed).
- Re-run the full licence review whenever the deployment model changes, particularly when an internal agent becomes customer-facing.

Anti-patterns to avoid:

- Evaluating components on technical merit alone, with no licence review (Scenario A).
- Inheriting the original repository's licence for a forked component without verifying the fork's own terms (Scenario B).
- Interpreting dual-licence or service-provision terms through engineering assumptions rather than legal review (Scenario C).
- Treating attribution obligations as administrative overhead rather than contractual requirements.
- Point-in-time licence audits with no continuous monitoring for upstream licence changes.

Industry Considerations

Financial Services. Financial agents processing transactions, generating reports, or providing investment advice must ensure that all software components are legally deployable. A licence violation in a trading agent that forces component removal creates operational risk — the agent may need to be taken offline during remediation, disrupting trading operations. Financial regulators expect software supply chain governance as part of operational resilience (DORA Article 28).

Crypto/Web3. Decentralised protocols face unique licence challenges. SSPL and AGPL obligations are triggered by network service provision, which is inherent to blockchain-based services. Smart contract libraries, consensus mechanism implementations, and bridge components frequently carry copyleft licences. The decentralised governance model (DAO voting) may slow licence remediation, making prevention essential.

Public Sector. Government deployments face procurement regulations that mandate intellectual property compliance. Licence violations can trigger contract termination, debarment from future procurement, and public disclosure requirements. Public sector agents handling citizen data must ensure that open-source licence terms do not conflict with data sovereignty requirements.

Safety-Critical / CPS. Agents controlling physical systems (robotics, autonomous vehicles, industrial automation) face safety certification requirements that may be invalidated if the software supply chain includes unlicensed components. Regulatory bodies may refuse certification for systems with unresolved licence disputes.

Maturity Model

Basic Implementation — The organisation maintains a canonical licence policy classifying licences into permitted, restricted, and prohibited categories. Automated licence scanning is integrated into the build pipeline and blocks prohibited licences. A complete component inventory with licence identifiers is maintained for each deployed agent. Attribution documentation is generated before each release. Legal counsel has reviewed and approved the policy.

Intermediate Implementation — All basic capabilities plus: transitive dependency analysis with full dependency-path tracking. Fork-aware licence verification. Licence change monitoring for upstream dependencies with 72-hour detection. Deployment-context-aware policy (different rules for internal vs. customer-facing agents). Pre-approved component catalogue for accelerated adoption. Model weight licence tracking separate from software licence tracking.

Advanced Implementation — All intermediate capabilities plus: runtime licence validation for dynamically fetched components. Automated licence compatibility analysis for multi-component interactions. Licence obligation fulfilment automation (NOTICE generation, source code offer management). Integration with AG-491 SBOM attestation for end-to-end supply chain licence provenance. Cross-jurisdictional licence interpretation mapping (e.g., how copyleft is interpreted differently in EU vs. US courts). Proactive licence risk scoring that flags dependencies with high relicensing probability based on maintainer signals.

7. Evidence Requirements

Required artefacts:

- The canonical licence policy, versioned, with change history and legal and engineering approval records (4.1).
- Build-time licence scan results, including blocked builds and their causes (4.2).
- The component inventory for each deployed agent, with SPDX identifiers and dependency paths (4.3).
- Attribution documentation and pre-release completeness verification records (4.4).
- Pre-deployment gate records showing the timestamp of the last passing licence compliance check (4.5).
- Licence change detection logs and re-evaluation outcomes (4.6).
- Legal rationales for each licence classification (4.7).

Retention requirements:

Access requirements:

8. Test Specification

Test 8.1: Prohibited Licence Build Blocking

Test 8.2: Transitive Dependency Licence Detection

Test 8.3: Attribution Documentation Completeness

Test 8.4: Licence Change Detection Timeliness

Test 8.5: Pre-Deployment Gate Enforcement

Test 8.6: Policy Classification Legal Rationale Presence

Test 8.7: Unknown Licence Fail-Closed Behaviour

Conformance Scoring

9. Regulatory Mapping

Regulation | Provision | Relationship Type
EU AI Act | Article 13 (Transparency) | Direct requirement
EU AI Act | Article 15 (Accuracy, Robustness and Cybersecurity) | Supports compliance
SOX | Section 404 (Internal Controls Over Financial Reporting) | Supports compliance
FCA SYSC | 8.1 (Outsourcing and Third-Party Arrangements) | Supports compliance
NIST AI RMF | GOVERN 1.5, MAP 3.4 | Supports compliance
ISO 42001 | Clause 8.4 (Externally Provided Processes, Products and Services) | Direct requirement
DORA | Article 28 (ICT Third-Party Risk) | Supports compliance

EU AI Act — Article 13 (Transparency)

Article 13 requires providers of high-risk AI systems to provide sufficient transparency for deployers to interpret and use the system appropriately. This includes information about the components used in the system. Open-source licence obligations directly affect how a high-risk AI system may be deployed, modified, and distributed. A deployer who is unaware that a system component carries AGPL-3.0 obligations cannot make informed decisions about deployment. AG-489 ensures that the licence status of all components is documented, communicated, and enforceable, directly supporting the transparency requirement.

SOX — Section 404 (Internal Controls Over Financial Reporting)

Financial reporting systems implemented by AI agents must be legally deployable. A licence violation that forces removal of a component from a financial processing agent creates a control disruption — the agent may need to be taken offline, reports may be delayed, and the control environment is degraded during remediation. SOX auditors assess whether the organisation has adequate controls over its technology supply chain, including software licensing. AG-489 provides the preventive control that ensures licence compliance before deployment, avoiding the operational disruption of post-deployment remediation.

FCA SYSC — 8.1 (Outsourcing and Third-Party Arrangements)

The FCA requires firms to manage risks arising from third-party dependencies, including open-source software that forms part of regulated systems. An open-source licence violation that forces component removal from a customer-facing financial agent is an operational resilience event. The FCA expects firms to understand and manage the legal risks of their technology dependencies, including licence obligations that may affect continuity of service.

DORA — Article 28 (ICT Third-Party Risk)

The Digital Operational Resilience Act requires financial entities to manage ICT third-party risk, including risk from open-source components. Article 28 mandates that organisations maintain registers of ICT third-party arrangements and assess the risks of each arrangement. Open-source licence non-compliance is a third-party risk that can result in forced component removal, operational disruption, and legal liability. AG-489 provides the governance framework for identifying, assessing, and mitigating open-source licence risk in alignment with DORA requirements.

ISO 42001 — Clause 8.4 (Externally Provided Processes, Products and Services)

ISO 42001 requires organisations to ensure that externally provided processes, products, and services (including open-source components) conform to requirements. Licence compliance is a fundamental conformance requirement for open-source components. AG-489 provides the systematic approach to verifying and maintaining licence compliance across all externally sourced components in the AI agent stack.

NIST AI RMF — GOVERN 1.5, MAP 3.4

GOVERN 1.5 addresses organisational policies for AI risk management, which must include intellectual property and licensing risk. MAP 3.4 addresses the identification of third-party AI components and their associated risks, including legal risks from licence non-compliance. AG-489 operationalises these requirements by mandating a canonical licence policy, automated detection, and continuous monitoring.

10. Failure Severity

Field | Value
Severity Rating | High
Blast Radius | Organisation-wide for policy failures; agent-specific for individual component violations, but with potential propagation to all agents sharing the non-compliant component

Consequence chain: A prohibited open-source licence enters the agent's dependency tree undetected, creating an undiscovered legal obligation. The immediate technical exposure is latent — the agent functions correctly, and no operational symptom appears. The legal exposure, however, is active from the moment of deployment. Discovery triggers a forced remediation cycle: legal analysis of the obligation's scope, engineering assessment of removal difficulty (which may be high for deeply integrated components), replacement component selection and integration, regression testing, and redeployment. If the licence is copyleft with source-disclosure obligations, the organisation faces a binary choice between disclosing proprietary source code (potentially destroying competitive advantage) and removing the component (incurring engineering cost and operational disruption). For customer-facing agents, the remediation period may require service degradation or suspension. For financial agents, component removal may disrupt control environments, triggering regulatory reporting obligations under DORA and SOX. Attribution violations — less severe individually — create cumulative legal exposure that increases with each unattributed component and each deployment year. The failure compounds over time because organisations that lack licence governance tend to accumulate non-compliant components progressively, with each new dependency increasing the eventual remediation cost. Early detection through AG-489 controls breaks this accumulation cycle.

Cross-references: AG-007 (Governance Configuration Control), AG-370 (Tool Schema Integrity Governance), AG-407 (Build Pipeline Attestation Governance), AG-490 (Maintainer Trust and Project Health Governance), AG-491 (Dependency Provenance and SBOM Attestation Governance), AG-495 (Procurement Security Requirement Governance), AG-498 (Upstream Policy Compatibility Governance).

Cite this protocol
AgentGoverning. (2026). AG-489: Open-Source Licence Policy Binding Governance. The 783 Protocols of AI Agent Governance, AGS v2.1. agentgoverning.com/protocols/AG-489