Anonymous Federated Learning via Named-Data Networking

Andrea Agiollo, Enkeleda Bardhi, Mauro Conti, Nicolò Dal Fabbro, Riccardo Lazzeretti
Future Generation Computer Systems
2023

Federated Learning (FL) represents the de-facto approach for the distributed training of machine learning models. Despite its popularity, FL faces several security and privacy issues. Here, privacy is two-fold, requiring both data privacy and user anonymity. While the former has attracted substantial research effort, the latter has yet to be deeply investigated. This gap leaves room for linkability attacks, which lead to model alteration and worker impersonation issues.
This paper addresses the user anonymity issue by proposing a novel communication framework for FL that leverages Named-Data Networking (NDN). NDN is a networking paradigm that decouples data from its location, thereby anonymising users, and thus embodies a suitable solution for ensuring privacy in FL. However, NDN's pull-based nature clashes with FL's push-based essence. Moreover, the standard NDN naming convention and its caching approach are shown to introduce anonymity leakage. Therefore, we design a novel NDN-based communication scheme for FL that achieves anonymity by design, defining novel naming convention, routing, and enrollment procedures. Additionally, we thoroughly discuss the anonymity of the proposed system under a well-defined FL threat model, showing the robustness of our approach. Finally, through simulations, we compare the proposed mechanism against IP-based FL and other state-of-the-art anonymity frameworks. The results show latency and training-time improvements, especially when dealing with large models, numerous federations, and complex networks.
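To illustrate the pull-based vs. push-based mismatch mentioned in the abstract, the following minimal Python sketch emulates an NDN-style exchange in which a server must *pull* each worker's model update by issuing an Interest for a name, rather than waiting for workers to push it. The names, classes, and payloads are hypothetical and for illustration only; in particular, embedding a worker identifier in the name, as done here, is exactly the kind of linkable naming that the paper's anonymity-by-design convention is meant to avoid. This is not the paper's actual scheme or any real NDN library API.

```python
# Illustrative sketch only: emulates NDN's Interest/Data pull semantics for FL,
# without any real NDN stack. All names and payloads are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Data:
    name: str        # hierarchical NDN-like name this Data packet satisfies
    content: bytes   # serialized model update (placeholder payload)


class Worker:
    """Producer: answers Interests for the update it computed locally."""

    def __init__(self, worker_id: str, round_id: int):
        # Hypothetical, linkable naming (worker id is visible in the name).
        self._name = f"/fl/round-{round_id}/update/{worker_id}"
        self._update = b"serialized-gradients"  # placeholder

    def on_interest(self, name: str) -> Optional[Data]:
        # Reply with Data only if the requested name matches the produced update.
        return Data(self._name, self._update) if name == self._name else None


class Server:
    """Consumer: pulls named updates instead of receiving pushed ones."""

    def aggregate_round(self, round_id: int, workers: list, worker_ids: list) -> list:
        updates = []
        for wid, worker in zip(worker_ids, workers):
            # The Interest expresses *what* data is wanted, not *where* it lives.
            interest_name = f"/fl/round-{round_id}/update/{wid}"
            data = worker.on_interest(interest_name)
            if data is not None:
                updates.append(data.content)
        return updates


if __name__ == "__main__":
    workers = [Worker("w0", 1), Worker("w1", 1)]
    received = Server().aggregate_round(1, workers, ["w0", "w1"])
    print(f"pulled {len(received)} named updates")
```

In this toy exchange the server, not the worker, initiates every transfer, which is the inversion the paper has to reconcile with FL's push-based training loop.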

Keywords: Anonymous communication, Federated Learning, Named Data Networking, Privacy-preserving
