On the integration of symbolic and sub-symbolic techniques for XAI: A survey


Roberta Calegari, Giovanni Ciatto, Andrea Omicini

Intelligenza Artificiale 14(1), pages 7-32, 26 pages, September 2020
IOS Press
Matteo Baldoni, Federico Bergenti, Stefania Monica, Giuseppe Vizzari (eds.)
Special Issue for the Twentieth Edition of the Workshop ‘From Objects to Agents’

The more intelligent systems based on sub-symbolic techniques pervade our everyday lives, the less humans can understand them. This is why symbolic approaches are receiving increasing attention in the general effort to make AI interpretable, explainable, and trustable. Understanding the current state of the art of AI techniques that integrate symbolic and sub-symbolic approaches is therefore of paramount importance, particularly from the XAI perspective. Accordingly, this paper provides an overview of the main symbolic/sub-symbolic integration techniques, focussing in particular on those targeting explainable AI systems.

(keywords) XAI, symbolic and sub-symbolic AI, explainability, interpretability, trustable system
@article{xaisurvey-ia14,
author = {Calegari, Roberta and Ciatto, Giovanni and Omicini, Andrea},
booktitle = {Special Issue for the Twentieth Edition of the Workshop `From Objects to Agents'},
dblpid = {journals/ia/CalegariCO20},
doi = {10.3233/IA-190036},
editor = {Baldoni, Matteo and Bergenti, Federico and Monica, Stefania and Vizzari, Giuseppe},
irisid = {11585/772707},
journal = {Intelligenza Artificiale},
keywords = {XAI, symbolic and sub-symbolic AI, explainability, interpretability, trustable system},
number = 1,
pages = {7--32},
publisher = {IOS Press},
scopusid = {2-s2.0-85092388493},
title = {On the integration of symbolic and sub-symbolic techniques for {XAI}: A survey},
url = {http://content.iospress.com/articles/intelligenza-artificiale/ia190036},
volume = 14,
wosid = {000574865700002},
year = 2020
}

Events

  • XX Workshop "From Objects to Agents" (WOA 2019) — 26/06/2019–28/06/2019
