European Data Protection Board
The European Data Protection Board's AI Auditing Checklist is a specialized template for auditing machine learning algorithms through the lens of data protection compliance. Unlike general AI governance frameworks, this checklist takes a granular approach to evaluating AI systems across their entire lifecycle, from initial data preprocessing through algorithm training to operational deployment. What sets this resource apart is its focus on practical auditing mechanics: it provides scoring systems, specific checkpoints, and detailed criteria that auditors can apply immediately. The checklist bridges the gap between high-level GDPR principles and the technical realities of machine learning systems, making it an essential tool for organizations that need to demonstrate compliance in concrete, measurable ways.
This checklist emerged from the EDPB's Special Programme on AI, representing a shift from abstract guidance to actionable audit tools. The timing is deliberate—released in 2024 as organizations scramble to prepare for the EU AI Act's implementation requirements. The EDPB recognized that while plenty of guidance exists on AI ethics and risk management, there was a critical gap in practical tools for conducting systematic data protection audits of AI systems. This resource fills that void by providing a standardized approach that can be applied consistently across different types of machine learning implementations.
Lifecycle-Based Structure: Rather than organizing by risk categories or ethical principles, the checklist follows the actual workflow of ML development—preprocessing, training, validation, deployment, and monitoring. This makes it intuitive for technical teams to use.
Scoring Methodology: Each audit criterion includes specific scoring guidelines, allowing organizations to quantify compliance levels rather than relying on subjective assessments. This is particularly valuable for regulatory reporting and internal risk management; a sketch of how such criteria and scores might be encoded follows this list.
Data Processing Focus: While other frameworks address AI broadly, this checklist zeroes in on data processing activities, making it directly applicable to GDPR Article 35 (DPIA requirements) and Article 25 (data protection by design).
Technical Depth: The checklist includes specific technical considerations like data anonymization techniques, algorithmic transparency measures, and automated decision-making safeguards that other general-purpose frameworks often skip.
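As a rough illustration of how the lifecycle stages and per-criterion scoring described above could be captured in an internal audit tool, here is a minimal Python sketch. The stage names mirror the checklist's workflow-based structure, but the field names, the 0-2 score scale, and the example criteria are illustrative assumptions, not the EDPB's actual schema.

```python
from dataclasses import dataclass

# Lifecycle stages, mirroring the checklist's workflow-based structure.
STAGES = ["preprocessing", "training", "validation", "deployment", "monitoring"]

@dataclass
class AuditCriterion:
    """One checklist item. Field names and the 0-2 scale are illustrative only."""
    stage: str           # one of STAGES
    requirement: str     # what the auditor checks
    gdpr_reference: str  # e.g. "Art. 35" (DPIA) or "Art. 25" (data protection by design)
    score: int = 0       # hypothetical scale: 0 = gap, 1 = partial, 2 = compliant

# Example criteria paraphrasing technical themes mentioned in the description above.
criteria = [
    AuditCriterion("preprocessing", "Anonymization or pseudonymization applied to training data", "Art. 25"),
    AuditCriterion("training", "DPIA completed and documented for the training pipeline", "Art. 35"),
    AuditCriterion("deployment", "Safeguards for automated decision-making in place", "Art. 22"),
]

def stage_score(items: list[AuditCriterion], stage: str) -> float | None:
    """Aggregate compliance for one lifecycle stage as a fraction of the maximum score."""
    relevant = [c for c in items if c.stage == stage]
    if not relevant:
        return None
    return sum(c.score for c in relevant) / (2 * len(relevant))
```

Keeping criteria grouped by lifecycle stage makes it straightforward to report a score per stage, which is the level of granularity the checklist's structure suggests.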
Primary Users:
Secondary Users:
Not Ideal For:
Phase 1 - System Inventory: Use the checklist to catalog all ML systems currently in development or production, identifying which stages of the lifecycle each system has completed.
Phase 2 - Gap Analysis: Apply the scoring methodology to identify specific compliance gaps across different lifecycle stages. Focus first on production systems with high data subject impact (see the sketch after this list).
Phase 3 - Remediation Planning: Use the detailed criteria as a roadmap for addressing identified gaps. The checklist's structure makes it easy to assign specific remediation tasks to appropriate teams.
Phase 4 - Ongoing Monitoring: Integrate key checklist items into regular audit cycles and system change management processes to maintain compliance as systems evolve.
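To show how Phases 1 and 2 could be operationalized, the sketch below catalogs systems in an inventory and flags low-scoring lifecycle stages, listing high-impact production systems first. The record format, field names, and the 0.7 threshold are assumptions for illustration, not part of the checklist itself.

```python
# Hypothetical inventory records for Phase 1; field names and values are illustrative.
inventory = [
    {"system": "credit-scoring-model", "status": "production", "high_impact": True,
     "stage_scores": {"preprocessing": 0.5, "training": 1.0, "deployment": 0.3}},
    {"system": "churn-prediction", "status": "development", "high_impact": False,
     "stage_scores": {"preprocessing": 0.8, "training": 0.6}},
]

GAP_THRESHOLD = 0.7  # assumed cut-off for flagging a lifecycle stage as a compliance gap

def gap_report(systems):
    """Return (system, stage, score) tuples below the threshold, listing high-impact
    production systems first, as Phase 2 recommends."""
    gaps = [
        (s["system"], stage, score)
        for s in systems
        for stage, score in s["stage_scores"].items()
        if score < GAP_THRESHOLD
    ]
    is_priority = {s["system"]: s["status"] == "production" and s["high_impact"] for s in systems}
    return sorted(gaps, key=lambda g: (not is_priority[g[0]], g[2]))

for system, stage, score in gap_report(inventory):
    print(f"{system}: remediate the '{stage}' stage (score {score:.1f})")
```

The sorted output doubles as a remediation backlog for Phase 3: each flagged stage can be assigned to the team responsible for that part of the ML workflow.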
This checklist works best when embedded within a larger AI governance framework rather than used in isolation. It complements the EU AI Act's risk assessment requirements by providing detailed methodology for the data protection aspects of AI risk management. Organizations using frameworks like ISO/IEC 23053 or NIST AI RMF can use this checklist to add GDPR-specific depth to their assessment processes.
The scoring system also enables trend analysis over time, helping organizations demonstrate continuous improvement in their AI governance practices—something increasingly important for regulatory relationships and stakeholder trust.
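As a small illustration of that kind of trend analysis, the sketch below compares stage-level scores across two audit cycles; the quarterly cadence and the values are hypothetical.

```python
# Hypothetical audit history: stage-level scores from two audit cycles.
history = {
    "2024-Q1": {"preprocessing": 0.4, "training": 0.5, "deployment": 0.3},
    "2024-Q3": {"preprocessing": 0.7, "training": 0.8, "deployment": 0.6},
}

def score_delta(history, earlier, later):
    """Per-stage change in compliance score between two audit cycles."""
    return {stage: round(history[later][stage] - history[earlier][stage], 2)
            for stage in history[earlier]}

print(score_delta(history, "2024-Q1", "2024-Q3"))
# {'preprocessing': 0.3, 'training': 0.3, 'deployment': 0.3}
```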
Published: 2024
Jurisdiction: European Union
Category: Assessment and evaluation
Access: Public access
Related resources:
Artificial Intelligence and Data Act • Regulations and laws • Government of Canada
The Artificial Intelligence and Data Act (AIDA) – Companion document • Regulations and laws • Innovation, Science and Economic Development Canada
ISO/IEC 23053:2022 - Framework for AI systems using machine learning • Standards and certifications • ISO/IEC