Model-Based Contrastive Explanations for XAIP: Towards a General Model and Prototype


Giulia Brugnatti

Planning is an important sub-field of artificial intelligence (AI), concerned with letting intelligent agents deliberate on the most adequate course of action to attain their goals.
As planning is increasingly exploited by critical domains and systems for their internal procedures, there is a growing need for planning systems to become more transparent and trustworthy.
Along this line, planning systems are now required to produce not only plans, but also explanations of those plans and of the way they were attained.
To address this need, a new research area is emerging in the AI panorama: eXplainable AI (XAI), within which explainable planning (XAIP) is a pivotal sub-field.

As a recent research area, XAIP is far from mature.
No consensus has been reached in the literature about what explanations are, how they should be computed, and what they should explain in the first place.
Furthermore, existing contributions are mostly theoretical, and software implementations are rarely more than preliminary.

To overcome such issues, in this thesis we design an explainable planning framework bridging the gap between theoretical contributions from the literature and software implementations.
More precisely, taking inspiration from the state of the art, we develop a formal model for XAIP, as well as a software tool enabling its practical exploitation.

Accordingly, the contribution of this thesis is four-fold.
First, we review the state of the art of XAIP, supplying an outline of its most significant contributions from the literature.
We then generalise the aforementioned contributions into a unified model for XAIP, aimed at supporting model-based contrastive explanations.
Next, we design and implement a pure-Kotlin, algorithm-agnostic library for XAIP based on our model.
Finally, we validate our library from a technological perspective via an extensive test suite, and we assess its performance and usability through a set of benchmarks and end-to-end examples.
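
To give a concrete flavour of the kind of interaction the model targets, the sketch below illustrates, in Kotlin, how a model-based contrastive explanation could be phrased in an algorithm-agnostic fashion. It is a purely hypothetical example, not the API of the library developed in this thesis: all names (Problem, Plan, Question, Planner, explain) are illustrative assumptions. The idea it demonstrates is the usual one: a contrastive question ("why this plan rather than one complying with some foil?") restricts the planning model, an injected planner solves the restricted model, and the explanation contrasts the resulting plan with the original one.

// Hypothetical sketch (not the thesis' actual API): model-based contrastive
// explanations, with the planner injected to stay algorithm-agnostic.

data class Problem(val name: String, val constraints: Set<String>)
data class Plan(val actions: List<String>, val cost: Double)

// A contrastive question restricts the original problem so that only
// plans complying with its foil remain valid.
fun interface Question {
    fun restrict(problem: Problem): Problem
}

// Any planner can be plugged in as a function from problems to (optional) plans.
typealias Planner = (Problem) -> Plan?

sealed interface Explanation
data class Contrast(val original: Plan, val alternative: Plan, val costDelta: Double) : Explanation
data class Infeasible(val reason: String) : Explanation

// Explain by re-planning on the foil-restricted model and contrasting the outcomes.
fun explain(problem: Problem, original: Plan, question: Question, planner: Planner): Explanation {
    val restricted = question.restrict(problem)
    val alternative = planner(restricted)
        ?: return Infeasible("no valid plan satisfies the foil on problem '${restricted.name}'")
    return Contrast(original, alternative, alternative.cost - original.cost)
}

fun main() {
    val problem = Problem("delivery", constraints = emptySet())
    val original = Plan(listOf("load", "drive", "unload"), cost = 3.0)
    // "Why drive rather than avoid driving?": the foil forbids the 'drive' action.
    val question = Question { p -> p.copy(constraints = p.constraints + "avoid(drive)") }
    // Dummy planner standing in for any real one.
    val planner: Planner = { Plan(listOf("load", "fly", "unload"), cost = 5.0) }

    when (val outcome = explain(problem, original, question, planner)) {
        is Contrast -> println("Complying with the foil costs ${outcome.costDelta} more: ${outcome.alternative.actions}")
        is Infeasible -> println("The foil is infeasible: ${outcome.reason}")
    }
}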

Tags: Thesis

supervision
— thesis student: Giulia Brugnatti
— supervisors: Giovanni Ciatto
— co-supervisors: Andrea Omicini

sort
— cycle: second-cycle thesis
— status: completed thesis

— language: English

dates
— activity started: 01/03/2022
— degree date: 15/12/2022

files
— PDF
