E004—Assign accountability
>Control Description
Document which AI system changes across the development & deployment lifecycle require formal review or approval, assign a lead accountable for each, and document their approval with supporting evidence.
Application
Mandatory
Frequency
Every 12 months
Capabilities
Universal
>Controls & Evidence (2)
Operational Practices
E004.1
Documentation: Change approval policy and records
Core - This should include:
- Defining AI system changes requiring approval, including model selection, material changes to the meta prompt, adding / removing guardrails, changes to the end-user workflow, and other changes that drive material performance shifts (for example, +/-10% on evals).
- Assigning an accountable lead as approver for each of these changes. This can follow a RACI structure to formalize the roles of those consulted and informed.
Typical evidence: Documentation or policy defining which AI system changes require approval with assigned accountable leads, and approval records showing sign-offs with supporting evidence. This can be a change management policy, an overview table (e.g. in Notion), approval logs from Jira/Linear/GitHub, or deployment gate documentation.
Location: Internal policies
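The approval policy above can also be kept in machine-checkable form so a deployment gate can verify sign-off automatically. The sketch below is illustrative, not a prescribed implementation: the change types, lead roles, and evidence format are all hypothetical assumptions, and a real policy would live in a versioned policy file rather than inline code.

```python
# Minimal sketch of a change-approval gate. All change types, roles,
# and URLs below are illustrative examples, not prescribed values.

APPROVAL_POLICY = {
    # change type         -> accountable lead (approver role)
    "model_selection":      "head_of_ml",
    "meta_prompt_change":   "product_lead",
    "guardrail_change":     "safety_lead",
    "end_user_workflow":    "product_lead",
    "material_eval_shift":  "head_of_ml",  # e.g. +/-10% on evals
}

def requires_approval(change_type: str) -> bool:
    """Return True if this change type is listed in the approval policy."""
    return change_type in APPROVAL_POLICY

def check_approval(change_type: str, approvals: dict) -> bool:
    """Verify the accountable lead has signed off on the change.

    `approvals` maps approver roles to evidence links
    (e.g. PR review URLs or ticket IDs).
    """
    if not requires_approval(change_type):
        return True  # no formal review required for this change type
    lead = APPROVAL_POLICY[change_type]
    return lead in approvals

# A guardrail change approved by the safety lead passes the gate;
# the same change with no recorded approval is blocked.
print(check_approval("guardrail_change", {"safety_lead": "https://example.com/pr/123"}))
print(check_approval("guardrail_change", {}))
```

In practice the `approvals` mapping would be derived from an approval log (e.g. required reviewers on a pull request), which doubles as the evidence record this control asks for.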
Technical Implementation
E004.2
Config: Code signing implementation
Supplemental - This may include:
- Implementing code signing and verification processes for AI models, libraries, and deployment artifacts to ensure only digitally signed components are approved for production use.
Typical evidence: Screenshot of code signing configuration, CI/CD pipeline requiring signed artifacts, or verification process for AI components - may include model signing process, signature verification in deployment pipeline, artifact registry showing signed models/libraries, or policy enforcement blocking unsigned components from production.
Location: Engineering Code, Engineering Practice
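As a rough illustration of the sign-then-verify pattern a deploy gate would enforce, the sketch below uses stdlib HMAC purely as a stand-in for a real signing scheme (production setups would use asymmetric signatures via tooling such as Sigstore or GPG, with managed keys). The key, artifact bytes, and function names are all hypothetical.

```python
# Illustrative sign/verify gate for deployment artifacts (models,
# libraries). HMAC is used only as a self-contained stand-in for a
# real code-signing scheme; the key is hard-coded here for the demo,
# which a real pipeline would never do.
import hashlib
import hmac

SIGNING_KEY = b"example-key"  # placeholder: real keys come from a KMS/HSM

def sign_artifact(data: bytes) -> str:
    """Produce a detached signature for an artifact."""
    return hmac.new(SIGNING_KEY, data, hashlib.sha256).hexdigest()

def verify_before_deploy(data: bytes, signature: str) -> bool:
    """Deploy gate: only artifacts with a valid signature are approved."""
    expected = sign_artifact(data)
    return hmac.compare_digest(expected, signature)

model_bytes = b"...model weights..."
sig = sign_artifact(model_bytes)
print(verify_before_deploy(model_bytes, sig))  # untampered artifact passes
print(verify_before_deploy(b"tampered", sig))  # modified artifact is blocked
```

The evidence for this control would then be the pipeline configuration showing that `verify_before_deploy` (or its real-world equivalent, e.g. a signature check in the artifact registry) runs before any component reaches production.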
>Cross-Framework Mappings
NIST AI RMF