GNN2GNN: Graph Neural Networks to Generate Neural Networks


Speaker: Andrea Agiollo — Authors: Andrea Agiollo, Andrea Omicini

The success of neural networks (NNs) is tightly linked with their architectural design, a complex problem in itself. Here we introduce a novel framework leveraging Graph Neural Networks to Generate Neural Networks (GNN2GNN), in which powerful NN architectures can be learned from a set of available architecture-performance pairs. GNN2GNN relies on a three-way adversarial training of GNNs to optimise a generator model capable of producing predictions about powerful NN architectures. Unlike Neural Architecture Search (NAS) techniques, which propose efficient search algorithms over a set of NN architectures, GNN2GNN relies on learning NN architectural design criteria. GNN2GNN learns to propose NN architectures in a single step (i.e., training of the generator), overcoming the recursive approach characterising NAS. Therefore, GNN2GNN avoids the expensive and inflexible search for efficient structures typical of NAS approaches. Extensive experiments over two state-of-the-art datasets prove the strength of our framework, showing that it can generate powerful architectures with high probability. Moreover, GNN2GNN outperforms possible counterparts for generating NN architectures, and shows flexibility against dataset quality degradation. Finally, GNN2GNN paves the way towards generalisation between datasets.
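The abstract describes NN architectures as graphs that a generator learns to emit. Purely as an illustration (not the authors' code), the following toy sketch shows the kind of graph encoding such a generator would produce: a directed acyclic graph given by an upper-triangular adjacency matrix plus one operation label per node. The operation set `OPS` is a hypothetical placeholder, not taken from the paper.

```python
import numpy as np

# Hypothetical operation vocabulary; the paper's datasets define their own.
OPS = ["conv3x3", "conv1x1", "maxpool", "skip"]

def sample_architecture(n_nodes, rng):
    """Sample a random NN architecture as a DAG.

    Returns an upper-triangular 0/1 adjacency matrix (edges only go
    from earlier to later nodes, so the graph is acyclic) and one
    operation index per node.
    """
    # Strictly upper-triangular mask (k=1) excludes self-loops.
    adj = np.triu(rng.random((n_nodes, n_nodes)) < 0.5, k=1).astype(int)
    ops = rng.integers(0, len(OPS), size=n_nodes)
    return adj, ops

rng = np.random.default_rng(0)
adj, ops = sample_architecture(5, rng)
```

A trained generator would replace the uniform sampling above with learned edge and operation probabilities, while the adversarial critic scores how likely the emitted graph is to perform well.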

Events

  • 38th Conference on Uncertainty in Artificial Intelligence (UAI 2022) — 01/08/2022–05/08/2022


Tags: Talk


Context

  • 38th Conference on Uncertainty in Artificial Intelligence (UAI 2022)
  • Where: Eindhoven, Netherlands
  • When: 02/08/2022
