Roberta Calegari, Giovanni Ciatto, Enrico Denti, Andrea Omicini, Giovanni Sartor (eds.)
WOA 2021 – 22nd Workshop “From Objects to Agents”, pages 98–115
CEUR Workshop Proceedings (AI*IA Series) 2963
Sun SITE Central Europe, RWTH Aachen University
October 2021
Combining machine learning (ML) and computational logic (CL) is hard, mostly because of the inherently different ways in which they represent knowledge. While ML relies on fixed-size numeric representations (vectors, matrices, or tensors of real numbers), CL relies on logic terms and clauses, which are unbounded in size and structure.
Graph neural networks (GNN) are a recent addition to the ML toolbox, introduced to process graph-structured data in a sub-symbolic way. As such, GNN pave the way towards applying ML to logic clauses and knowledge bases. However, there are several ways to encode logic knowledge into graphs, and which one is best depends heavily on the specific task at hand.
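As a minimal sketch of what such an encoding may look like (this is an illustrative assumption, not the paper's actual encoding), one can turn a logic term into a graph by mapping each functor, constant, and variable to a node, and each argument position to a labelled edge. The `Term` class and `to_graph` function below are hypothetical names:

```python
from dataclasses import dataclass, field

@dataclass
class Term:
    functor: str
    args: list = field(default_factory=list)  # empty for constants and variables

def to_graph(term):
    """Flatten a term into a (nodes, edges) pair.

    Nodes are functor/constant labels; each edge (parent, child, pos)
    records that `child` is the `pos`-th argument of `parent`.
    """
    nodes, edges = [], []

    def visit(t):
        node_id = len(nodes)
        nodes.append(t.functor)
        for pos, arg in enumerate(t.args):
            edges.append((node_id, visit(arg), pos))
        return node_id

    visit(term)
    return nodes, edges

# Encode the atom parent(abraham, isaac):
t = Term("parent", [Term("abraham"), Term("isaac")])
nodes, edges = to_graph(t)
# nodes -> ["parent", "abraham", "isaac"]
# edges -> [(0, 1, 0), (0, 2, 1)]
```

Other encodings (e.g. one node per clause, or bipartite atom/term graphs) trade off graph size against how much logical structure is preserved, which is precisely why the best choice is task-dependent.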
Accordingly, in this paper we (i) identify a number of problems from the field of CL that may benefit from graph-related techniques where GNN have proved effective; (ii) exemplify the application of GNN to logic theories via an end-to-end toy example, to expose the many intricacies hidden behind the technique; (iii) discuss possible future directions for the application of GNN to CL in general, pointing out opportunities and open issues.
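To make the core GNN operation concrete, the following is a hedged sketch (assumed for illustration, not taken from the paper) of a single message-passing step over a graph-encoded theory: each node updates its feature vector from the mean of its neighbours' features, combined through two learned weight matrices. All names (`message_passing_step`, `W_self`, `W_neigh`) are illustrative:

```python
import numpy as np

def message_passing_step(features, edges, W_self, W_neigh):
    """One message-passing layer: mix each node's features with the
    mean of its neighbours' features, then apply a tanh nonlinearity."""
    n, _ = features.shape
    agg = np.zeros_like(features)
    deg = np.zeros(n)
    for src, dst in edges:            # treat edges as undirected
        agg[dst] += features[src]
        deg[dst] += 1
        agg[src] += features[dst]
        deg[src] += 1
    deg = np.maximum(deg, 1)          # avoid division by zero for isolated nodes
    mean_neigh = agg / deg[:, None]
    return np.tanh(features @ W_self + mean_neigh @ W_neigh)

rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))           # 3 nodes, 4-dimensional features
E = [(0, 1), (0, 2)]                  # e.g. a functor node linked to its two arguments
W1 = rng.normal(size=(4, 4))
W2 = rng.normal(size=(4, 4))
H = message_passing_step(X, E, W1, W2)
# H.shape -> (3, 4): one updated embedding per node
```

Stacking several such steps lets information flow along paths in the graph, which is what allows a GNN to take the relational structure of a logic theory into account.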
Keywords: Graph Neural Networks, Machine Learning, Embedding, Computational Logic
Funding projects:
- EXPECTATION — Personalized Explainable Artificial Intelligence for decentralized agents with heterogeneous knowledge (01/04/2021–31/03/2024)
- StairwAI — Stairway to AI: Ease the Engagement of Low-Tech users to the AI-on-Demand platform through AI (01/01/2021–31/12/2023)