Explainability is increasingly expected to support not only interpretation, but also accountability, human oversight, and auditability in high-risk Artificial Intelligence (AI) systems. However, in many deployments, explanations are generated as isolated technical reports, remaining weakly connected to decision provenance, governance actions, audit logs, and regulatory documentation. This short communication introduces
XAI-Compliance-by-Design, a modular engineering framework for explainable artificial intelligence (XAI) systems that routes explainability outputs and related technical traces into structured, audit-ready evidence throughout the AI lifecycle, designed to align with key obligations under the European Union Artificial Intelligence Act (EU AI Act) and the General Data Protection Regulation (GDPR). The framework specifies (i) a modular architecture that separates technical evidence generation from governance consumption through explicit interface points for emitting, storing, and querying evidence, and (ii) a Technical–Regulatory Correspondence Matrix—a mapping table linking regulatory anchors to concrete evidence artefacts and governance triggers. As this communication does not report measured results, it also introduces an
Evidence-by-Design evaluation protocol defining measurable indicators, baseline configurations, and required artefacts to enable reproducible empirical validation in future work. Overall, the contribution is a practical blueprint that clarifies what evidence must be produced, where it is generated in the pipeline, and how it supports continuous compliance and auditability without relying on post hoc explanations.
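To make the two contributions concrete, the following minimal sketch illustrates (i) an interface that separates evidence emission from governance-side querying and (ii) a few rows of a correspondence mapping from regulatory anchors to artefacts and governance triggers. All class, field, and anchor names here are illustrative assumptions for exposition, not an API defined by the framework.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable

@dataclass
class EvidenceRecord:
    """Hypothetical audit-ready evidence artefact (names are illustrative)."""
    anchor: str      # regulatory anchor, e.g. "EU AI Act Art. 14 (human oversight)"
    artefact: str    # concrete evidence artefact, e.g. "per-decision explanation report"
    payload: dict    # technical trace attached to the artefact
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class EvidenceStore:
    """Separates technical evidence generation (emit) from governance
    consumption (query), with per-anchor governance triggers."""

    def __init__(self) -> None:
        self._records: list[EvidenceRecord] = []
        self._triggers: dict[str, Callable[[EvidenceRecord], None]] = {}

    def on_anchor(self, anchor: str, action: Callable[[EvidenceRecord], None]) -> None:
        # Register a governance trigger for one regulatory anchor.
        self._triggers[anchor] = action

    def emit(self, record: EvidenceRecord) -> None:
        # Interface point for the technical pipeline: store and, if mapped,
        # fire the corresponding governance action.
        self._records.append(record)
        if record.anchor in self._triggers:
            self._triggers[record.anchor](record)

    def query(self, anchor: str) -> list[EvidenceRecord]:
        # Interface point for auditors/governance: retrieve evidence per anchor.
        return [r for r in self._records if r.anchor == anchor]

# Illustrative correspondence rows: anchor -> governance trigger.
store = EvidenceStore()
review_queue: list[str] = []
store.on_anchor(
    "EU AI Act Art. 14 (human oversight)",
    lambda r: review_queue.append(f"review: {r.artefact}"),
)

# Pipeline side: emit evidence as it is generated.
store.emit(EvidenceRecord(
    "EU AI Act Art. 14 (human oversight)",
    "low-confidence explanation flag",
    {"instance_id": 17, "confidence": 0.42},
))
store.emit(EvidenceRecord(
    "GDPR Art. 22 (automated decisions)",
    "per-decision explanation report",
    {"instance_id": 17},
))

# Governance side: query evidence by regulatory anchor.
gdpr_evidence = store.query("GDPR Art. 22 (automated decisions)")
```

The separation matters for the framework's claim: the pipeline only ever calls `emit`, while auditors only ever call `query`, so governance consumption cannot silently alter the technical evidence trail.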