Verify Third-Party AI Tool Compliance with the EU AI Act

1. Preliminary Assessment

2. Documentation Review

3. Risk Management

  • Review documentation of the risk management framework.
  • Evaluate the framework's alignment with EU AI Act requirements.
  • Identify risk assessment methodologies used.
  • Check for involvement of relevant stakeholders in the framework.
  • Examine records of risk mitigation strategies applied.
  • Confirm strategies address identified risks effectively.
  • Assess the adequacy of resources allocated for mitigation.
  • Review outcomes of implemented strategies for effectiveness.
  • Establish a monitoring plan for ongoing risk assessment.
  • Define key performance indicators for risk management.
  • Schedule regular reviews of risk monitoring results.
  • Ensure transparency in reporting monitoring outcomes.
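The risk-register and review-scheduling steps above can be sketched as a small data structure. This is a minimal illustrative example, not part of the EU AI Act or any compliance library; the `Risk` class, its fields, and the `overdue_reviews` helper are hypothetical names chosen for this sketch.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Risk:
    """One entry in a hypothetical risk register."""
    description: str
    severity: str        # e.g. "low", "medium", "high"
    mitigation: str
    owner: str
    next_review: date    # scheduled review date for ongoing monitoring
    resolved: bool = False

def overdue_reviews(register: list[Risk], today: date) -> list[Risk]:
    """Return unresolved risks whose scheduled review date has passed."""
    return [r for r in register if not r.resolved and r.next_review < today]

# Example register with two illustrative risks.
register = [
    Risk("Training data bias", "high", "Diversity audit of datasets",
         "data-governance team", date(2024, 1, 15)),
    Risk("Model drift in production", "medium", "Monthly KPI monitoring",
         "ML ops team", date(2024, 6, 1)),
]

for risk in overdue_reviews(register, date(2024, 3, 1)):
    print(f"Review overdue: {risk.description} (owner: {risk.owner})")
```

A real register would also link each entry to evidence (mitigation records, KPI results) so that monitoring outcomes can be reported transparently, as the checklist requires.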

4. Data Governance

  • Identify all datasets utilized for training.
  • Verify the authenticity and reliability of data sources.
  • Check for consent or rights to use the data.
  • Document the origin and characteristics of datasets.
  • Assess data diversity to minimize bias.
  • Review the AI tool's data processing agreements.
  • Confirm data subjects' rights are upheld.
  • Implement data protection impact assessments (DPIAs).
  • Establish processes for data breach notifications.
  • Ensure data is securely stored and easily retrievable.
  • Evaluate the AI's data collection practices.
  • Ensure data collected is relevant and necessary.
  • Review data retention policies for compliance.
  • Check for user options to limit data usage.
  • Analyze how data is used in AI decision-making.
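The dataset-documentation and DPIA steps above lend themselves to a per-dataset record that can be checked automatically. This is a hedged sketch under assumed names: `DatasetRecord`, `governance_gaps`, and the 365-day retention limit are illustrative choices, not requirements taken from the EU AI Act.

```python
from dataclasses import dataclass

@dataclass
class DatasetRecord:
    """Hypothetical documentation record for one training dataset."""
    name: str
    source: str          # origin of the data, per the documentation step
    legal_basis: str     # e.g. "consent", "contract"; empty if undocumented
    dpia_completed: bool # data protection impact assessment done?
    retention_days: int

def governance_gaps(record: DatasetRecord, max_retention_days: int = 365) -> list[str]:
    """Flag checklist items that fail for this dataset."""
    gaps = []
    if not record.legal_basis:
        gaps.append("no documented legal basis for processing")
    if not record.dpia_completed:
        gaps.append("data protection impact assessment missing")
    if record.retention_days > max_retention_days:
        gaps.append("retention period exceeds policy limit")
    return gaps

# Example: a dataset with consent documented but no DPIA and a long retention period.
record = DatasetRecord("customer-support-logs", "internal CRM export",
                       legal_basis="consent", dpia_completed=False,
                       retention_days=730)
print(governance_gaps(record))
```

Running such a check over every dataset used for training gives a quick inventory of open governance gaps before a fuller legal review.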

5. Accountability and Liability

6. Human Oversight

7. Compliance Certification

8. Reporting and Transparency

9. Post-Implementation Review

10. Continuous Improvement
