Effects of Quantization on Federated Learning with Local Differential Privacy
Tech Univ Dresden, Germany.
Günlü, Onur. Linköping University, Department of Electrical Engineering, Information Coding; Linköping University, Faculty of Science & Engineering. ORCID iD: 0000-0002-0313-7788
Tech Univ Dresden, Germany.
2022 (English). In: 2022 IEEE Global Communications Conference (GLOBECOM 2022), IEEE, 2022, p. 921-926. Conference paper, Published paper (Refereed)
Abstract [en]

Federated learning (FL) enables large-scale machine learning while preserving user data privacy thanks to its decentralized structure. However, user data can still be inferred from the shared model updates. To strengthen privacy, we consider FL with local differential privacy (LDP). A further challenge in FL is its huge communication cost, caused by the iterative transmission of model updates. Quantization has been used in the literature to reduce this cost, but few works consider its effect on LDP or the unboundedness of the randomized model updates. We propose a communication-efficient FL algorithm with LDP that applies a Gaussian mechanism followed by quantization and Elias-gamma coding. A novel design of the algorithm guarantees LDP even after quantization. Under the proposed algorithm, we provide a theoretical analysis of the trade-off between privacy and communication costs: quantization reduces the communication costs but requires a larger perturbation to ensure LDP. Experimental results show that the accuracy is affected mostly by the noise of the LDP mechanism, and that this effect is amplified when the quantization error is larger. Nonetheless, in our experiments the proposed algorithm enables LDP with a significant compression ratio in return for only a slight loss of accuracy. Furthermore, under the same privacy budget and communication cost constraints, it outperforms an alternative algorithm based on a discrete Gaussian mechanism.
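
To make the pipeline described above concrete, the sketch below gives one plausible Python reading of a single client-side step: clip the model update, perturb it with a Gaussian mechanism, quantize uniformly, and compress the resulting integers with Elias-gamma codes. The parameter choices (clipping bound, noise scale, quantization step) and the zigzag mapping of signed integers to positive ones are illustrative assumptions, not the paper's exact design; in particular, the paper calibrates its mechanism so that LDP still holds after quantization, whereas this sketch only reflects the classical calibration sigma >= Delta * sqrt(2 ln(1.25/delta)) / epsilon of an unquantized (epsilon, delta) Gaussian mechanism.

```python
import numpy as np


def elias_gamma_encode(n: int) -> str:
    """Elias-gamma codeword for a positive integer n:
    floor(log2(n)) zeros followed by the binary representation of n."""
    assert n >= 1, "Elias-gamma is defined for positive integers only"
    binary = bin(n)[2:]
    return "0" * (len(binary) - 1) + binary


def ldp_quantize_encode(update, clip=1.0, sigma=1.0, step=0.05, seed=0):
    """Hypothetical client-side pipeline: clip -> Gaussian mechanism ->
    uniform quantization -> Elias-gamma bitstream."""
    rng = np.random.default_rng(seed)
    # Clip the update so the mechanism's L2 sensitivity is bounded by `clip`.
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip / max(norm, 1e-12))
    # Gaussian mechanism: i.i.d. noise with scale proportional to `clip`.
    noisy = clipped + rng.normal(0.0, sigma * clip, size=update.shape)
    # Uniform quantization to integers; a coarser `step` cuts bits
    # but enlarges the quantization error discussed in the abstract.
    q = np.rint(noisy / step).astype(int)
    # Zigzag-map signed integers to positive ones so every value has an
    # Elias-gamma codeword (this mapping is our assumption, not the paper's).
    zigzag = np.where(q >= 0, 2 * q + 1, -2 * q)
    bits = "".join(elias_gamma_encode(int(v)) for v in zigzag)
    return q, bits


# Example: encode a random 10-dimensional model update.
q, bits = ldp_quantize_encode(np.random.default_rng(1).standard_normal(10))
print(len(bits), "bits for quantized update", q)
```

Coarsening `step` yields smaller integers and hence shorter Elias-gamma codewords, which is the communication saving the abstract describes; the trade-off analysis then says the noise scale must grow for the LDP guarantee to survive the quantization.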

Place, publisher, year, edition, pages
IEEE, 2022. p. 921-926
Series
IEEE Global Communications Conference, ISSN 2334-0983
Keywords [en]
federated learning; local differential privacy; quantization; Elias-gamma coding
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:liu:diva-193483
DOI: 10.1109/GLOBECOM48099.2022.10000632
ISI: 000922633500151
ISBN: 9781665435406 (electronic)
ISBN: 9781665435413 (print)
OAI: oai:DiVA.org:liu-193483
DiVA, id: diva2:1755699
Conference
IEEE Global Communications Conference (GLOBECOM), Rio de Janeiro, Brazil, December 4-8, 2022
Available from: 2023-05-09 Created: 2023-05-09 Last updated: 2023-05-09

Open Access in DiVA

No full text in DiVA
