ePrints

Privacy-Preserving Data Deduplication for Enhancing Federated Learning of Language Models

Lookup NU author(s): Dr Aydin Abadi (ORCiD)

Licence

This work is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).


Abstract

Deduplication is a vital preprocessing step that enhances machine learning model performance and saves training time and energy. However, enhancing federated learning through deduplication poses challenges, especially regarding scalability and potential privacy violations if deduplication involves sharing all clients’ data. In this paper, we address the problem of deduplication in a federated setup by introducing a pioneering protocol, Efficient Privacy-Preserving Multi-Party Deduplication (EP-MPD). It efficiently removes duplicates from multiple clients’ datasets without compromising data privacy. EP-MPD is constructed in a modular fashion, utilizing two novel variants of the Private Set Intersection protocol. Our extensive experiments demonstrate the significant benefits of deduplication in federated learning of large language models. For instance, we observe up to 19.62% improvement in perplexity and up to 27.95% reduction in running time while varying the duplication level between 10% and 30%. EP-MPD effectively balances privacy and performance in federated learning, making it a valuable solution for large-scale applications.
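To illustrate the kind of building block the abstract refers to, below is a minimal sketch of a classic Diffie-Hellman-style Private Set Intersection between two clients. This is a generic stand-in, not EP-MPD itself: the abstract does not describe the paper's two novel PSI variants, and the toy prime used here is not cryptographically secure.

```python
import hashlib
import secrets

# Toy prime modulus for the multiplicative group (a Mersenne prime).
# NOT secure for real use; a real PSI would use a proper cryptographic group.
P = 2**127 - 1

def hash_to_group(item: str) -> int:
    """Hash a dataset record into the multiplicative group mod P."""
    digest = hashlib.sha256(item.encode()).digest()
    return int.from_bytes(digest, "big") % (P - 2) + 2  # avoid 0 and 1

def blind(items, key):
    """Raise each hashed item to a secret exponent (blinding)."""
    return [pow(hash_to_group(x), key, P) for x in items]

def psi(client_a, client_b):
    """Return client_a's records that also appear in client_b's set,
    without either side seeing the other's non-intersecting records:
    each side only ever receives blinded (exponentiated) values."""
    ka = secrets.randbelow(P - 3) + 2  # client A's secret key
    kb = secrets.randbelow(P - 3) + 2  # client B's secret key
    # B re-blinds A's blinded set (in order, so A can match positions),
    # and A re-blinds B's; equal records become H(x)^(ka*kb) on both sides.
    a_double = [pow(v, kb, P) for v in blind(client_a, ka)]
    b_double = {pow(v, ka, P) for v in blind(client_b, kb)}
    return [x for x, v in zip(client_a, a_double) if v in b_double]
```

In a federated-learning deduplication setting, a client would drop the returned intersection from its local training set before training begins, so duplicated examples are trained on only once across clients.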


Publication metadata

Author(s): Abadi A, Dasu VA, Sarkar S

Publication type: Conference Proceedings (inc. Abstract)

Publication status: Published

Conference Name: Network and Distributed System Security Symposium (NDSS)

Year of Conference: 2025

Online publication date: 19/02/2025

Acceptance date: 07/02/2025

Date deposited: 08/01/2025

URL: https://www.ndss-symposium.org/ndss-paper/privacy-preserving-data-deduplication-for-enhancing-federated-learning-of-language-models/

ePrints DOI: 10.57711/w6t1-c442
