Neuro-symbolic Computation for XAI: Towards a Unified Model


Giuseppe Pisano, Giovanni Ciatto, Roberta Calegari, Andrea Omicini

“WOA 2020 – 21th Workshop “From Objects to Agents"”, pages 101–117
CEUR Workshop Proceedings (AI*IA Series) 2706
Sun SITE Central Europe, RWTH Aachen University, Aachen, Germany
October 2020

The idea of integrating symbolic and sub-symbolic approaches to make intelligent systems (IS) understandable and explainable is at the core of new fields such as neuro-symbolic computing (NSC). This work lies under the umbrella of NSC and pursues a twofold objective. First, we present a set of guidelines for building explainable IS, which leverage logic induction and logical constraints to integrate symbolic and sub-symbolic approaches. Then, we reify the proposed guidelines in a case study to show their effectiveness and potential, presenting a prototype built on top of existing NSC technologies.

(keywords) XAI, Hybrid Systems, Neural Networks, Logical Constraining
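
The prototype itself is described only in the full paper; as a purely illustrative sketch of what the keyword “Logical Constraining” can mean in practice, the Python snippet below (hypothetical rule, names, and penalty weight, not taken from the paper) encodes a propositional rule as a differentiable penalty added to a network's training loss, so that symbolic knowledge shapes a sub-symbolic model.

import numpy as np

def fuzzy_implies(a, b):
    # Product-logic relaxation of the rule "a -> b": equals 1.0 when
    # the antecedent is false or the consequent is true.
    return 1.0 - a * (1.0 - b)

def constraint_penalty(p_bird, p_can_fly):
    # Penalise outputs that violate the hypothetical background rule
    # "bird(X) -> can_fly(X)"; -log maps full satisfaction to zero penalty.
    satisfaction = fuzzy_implies(p_bird, p_can_fly)
    return -np.log(np.clip(satisfaction, 1e-7, 1.0))

def total_loss(data_loss, p_bird, p_can_fly, weight=0.5):
    # The symbolic knowledge enters the objective as an extra
    # regularisation term added to the usual data-driven loss.
    return data_loss + weight * constraint_penalty(p_bird, p_can_fly)

# Outputs consistent with the rule are cheaper than outputs violating it.
print(total_loss(0.3, p_bird=0.9, p_can_fly=0.95))  # rule largely satisfied
print(total_loss(0.3, p_bird=0.9, p_can_fly=0.05))  # rule violated, larger loss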

21st Workshop “From Objects to Agents” (WOA 2020), Bologna, Italy, 14–16 September 2020. Proceedings

Events

  • 21st Workshop “From Objects to Agents” (WOA 2020) — 14/09/2020–16/09/2020

Publication

— authors

Giuseppe Pisano, Giovanni Ciatto, Roberta Calegari, Andrea Omicini

— editors

Roberta Calegari, Giovanni Ciatto, Enrico Denti, Andrea Omicini, Giovanni Sartor

— status

published

— sort

paper in proceedings

— publication date

October 2020

— volume

WOA 2020 – 21st Workshop “From Objects to Agents”

— series

CEUR Workshop Proceedings / AI*IA Series

— volume no.

2706

— pages

101–117

— article no.

8

— number of pages

17

— address

Aachen, Germany

— location

Bologna, Italy

identifiers

— DBLP

conf/woa/PisanoCCO20

— IRIS

11585/781387

— Scopus

2-s2.0-85095616404

— print ISSN

1613-0073

notes

— note

21st Workshop “From Objects to Agents” (WOA 2020), Bologna, Italy, 14–16 September 2020. Proceedings
