
Group Privacy for Personalized Federated Learning

Authors: Filippo Galli 1; Sayan Biswas 2,3; Kangsoo Jung 2; Tommaso Cucinotta 4 and Catuscia Palamidessi 2,3

Affiliations: 1 Scuola Normale Superiore, Pisa, Italy; 2 INRIA, Palaiseau, France; 3 LIX, École Polytechnique, Palaiseau, France; 4 Scuola Superiore Sant'Anna, Pisa, Italy

Keyword(s): Federated Learning, Differential Privacy, d-Privacy, Personalized Models.

Abstract: Federated learning (FL) is a particular type of distributed, collaborative machine learning, where participating clients process their data locally and share only updates of the training process. Generally, the goal is the privacy-aware optimization of a statistical model's parameters by minimizing a cost function over a collection of datasets stored locally by a set of clients. This process exposes the clients to two issues: leakage of private information and lack of personalization of the model. To mitigate the former, differential privacy and its variants serve as a standard for providing formal privacy guarantees. However, the clients often represent very heterogeneous communities and hold very diverse data. Therefore, aligned with the recent focus of the FL community on building a framework of personalized models that reflect the users' diversity, it is of utmost importance to protect the clients' sensitive and personal information against potential threats. To address this goal, we consider d-privacy, also known as metric privacy, a variant of local differential privacy that uses a metric-based obfuscation technique preserving the topological distribution of the original data. To cope with the issues of protecting the clients' privacy and allowing for personalized model training, we propose a method that provides group privacy guarantees by exploiting some key properties of d-privacy, enabling personalized models under the FL framework. We provide theoretical justifications for the applicability of the method and experimental validation on real-world datasets to illustrate how it works.
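As a rough illustration of the metric-based obfuscation the abstract refers to, the Python sketch below perturbs a client's real-valued vector with a multivariate Laplace mechanism whose noise density decays with Euclidean distance, a standard way to satisfy d-privacy under that metric. This is a minimal sketch under those assumptions, not the authors' implementation; the function name, the epsilon value, and the example update are illustrative.

```python
import numpy as np

def d_private_obfuscation(x: np.ndarray, epsilon: float, rng=None) -> np.ndarray:
    """Obfuscate a real-valued vector x under d-privacy with the Euclidean
    metric: the returned point z has density proportional to
    exp(-epsilon * ||z - x||_2) (a multivariate Laplace mechanism).

    Points close to x remain hard to distinguish from it, while the noise
    magnitude scales with the metric, roughly preserving the geometry of
    the original data.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    # Uniform direction on the unit (d-1)-sphere.
    direction = rng.normal(size=d)
    direction /= np.linalg.norm(direction)
    # Radius drawn from Gamma(d, 1/epsilon); combined with the uniform
    # direction this makes the noise density proportional to
    # exp(-epsilon * r) at distance r.
    radius = rng.gamma(shape=d, scale=1.0 / epsilon)
    return x + radius * direction

# Illustrative usage: a client obfuscates a local update before sharing it.
update = np.array([0.12, -0.45, 0.87])
noisy_update = d_private_obfuscation(update, epsilon=2.0)
```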

CC BY-NC-ND 4.0


Paper citation in several formats:
Galli, F.; Biswas, S.; Jung, K.; Cucinotta, T. and Palamidessi, C. (2023). Group Privacy for Personalized Federated Learning. In Proceedings of the 9th International Conference on Information Systems Security and Privacy - ICISSP; ISBN 978-989-758-624-8; ISSN 2184-4356, SciTePress, pages 252-263. DOI: 10.5220/0011885000003405

@conference{icissp23,
author={Filippo Galli and Sayan Biswas and Kangsoo Jung and Tommaso Cucinotta and Catuscia Palamidessi},
title={Group Privacy for Personalized Federated Learning},
booktitle={Proceedings of the 9th International Conference on Information Systems Security and Privacy - ICISSP},
year={2023},
pages={252-263},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0011885000003405},
isbn={978-989-758-624-8},
issn={2184-4356},
}

TY - CONF

JO - Proceedings of the 9th International Conference on Information Systems Security and Privacy - ICISSP
TI - Group Privacy for Personalized Federated Learning
SN - 978-989-758-624-8
IS - 2184-4356
AU - Galli, F.
AU - Biswas, S.
AU - Jung, K.
AU - Cucinotta, T.
AU - Palamidessi, C.
PY - 2023
SP - 252
EP - 263
DO - 10.5220/0011885000003405
PB - SciTePress
ER -