QuPeD: Quantized Personalization via Distillation with Applications to Federated Learning [article]

Kaan Ozkara, Navjot Singh, Deepesh Data, Suhas Diggavi
2022 arXiv pre-print
For personalization, we allow clients to learn compressed personalized models with different quantization parameters and model dimensions/structures. ... In this work, we introduce QuPeD, a quantized and personalized FL algorithm that enables collective training with personalized model compression via knowledge distillation (KD) among clients who have access ... Personalized Quantization for FL via Knowledge Distillation: We now consider the FL setting where we aim to learn quantized and personalized models for each client, with different precision ...
arXiv:2107.13892v2 fatcat:4wuwsham7vfzvamjftxobzgo2m
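
The snippet above describes the core of QuPeD: each client trains a compressed personalized model by combining its local task loss with a knowledge-distillation term toward a shared full-precision model. A minimal NumPy sketch of that idea follows; the quantization grid, the straight-through-style update, the weighting lam, and all names are assumptions for illustration, not the paper's exact algorithm.

    import numpy as np

    # Sketch of a QuPeD-style client update (hypothetical names throughout):
    # the client keeps full-precision personal weights, quantizes them on the
    # forward pass, and trains against (a) its own labels and (b) the
    # predictions of a shared full-precision "teacher" model (distillation).
    rng = np.random.default_rng(0)

    def quantize(w, centers):
        # Project every weight onto its nearest quantization center.
        flat = w.ravel()
        idx = np.abs(flat[:, None] - centers[None, :]).argmin(axis=1)
        return centers[idx].reshape(w.shape)

    def softmax(z):
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()

    def client_step(w_personal, w_global, centers, x, y_onehot, lam=0.5, lr=0.1):
        w_q = quantize(w_personal, centers)   # quantized personalized model
        p_q = softmax(w_q @ x)                # student (quantized) predictions
        p_g = softmax(w_global @ x)           # teacher (global) predictions
        # Cross-entropy w.r.t. the labels plus a distillation term toward the
        # teacher; both have the softmax cross-entropy gradient form (p - target).
        grad_logits = (p_q - y_onehot) + lam * (p_q - p_g)
        grad_w = np.outer(grad_logits, x)
        # Straight-through-style step: the gradient computed at the quantized
        # weights updates the full-precision weights (an assumption here).
        return w_personal - lr * grad_w

    # Toy usage: 3 classes, 4 features, a 2-bit (4-center) quantization grid.
    centers = np.array([-0.5, -0.1, 0.1, 0.5])
    w_p, w_g = rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
    x, y = rng.normal(size=4), np.eye(3)[1]
    w_p = client_step(w_p, w_g, centers, x, y)

Under this reading, clients with different precision budgets would simply use different centers grids and weight shapes, which is what the snippet means by different quantization parameters and model dimensions/structures.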

A Generative Framework for Personalized Learning and Estimation: Theory, Algorithms, and Privacy [article]

Kaan Ozkara, Antonious M. Girgis, Deepesh Data, Suhas Diggavi
2022 arXiv pre-print
We also develop private versions of our personalized learning methods, with guarantees for user-level privacy and composition. ... A distinguishing characteristic of federated learning is that the (local) client data can be statistically heterogeneous. ... Algorithm 6: Adaptive Personalization via Distillation (AdaPeD) with local fine-tuning. Parameters: local variances {ψ_i^0}, personalized models {θ_i^0}, ... if τ divides t then On Server do: ...
arXiv:2207.01771v1 fatcat:nlokelv7vnhobnlmxfbtftbq54
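
The Algorithm 6 fragment above names the moving parts of AdaPeD: personalized models θ_i, learned local variances ψ_i that adaptively weight a distillation term, and a server aggregation step whenever τ divides the iteration counter t. The loop below is a minimal sketch of that structure; the parameter-space distillation penalty ||θ_i - shared||^2/(2ψ_i) + (1/2)·log ψ_i and all names are assumptions of this sketch, not necessarily the paper's exact objective (which distills predictions rather than parameters).

    import numpy as np

    # Hypothetical AdaPeD-style loop: each client descends its task loss plus
    # a distillation penalty toward a shared model, with a per-client learned
    # variance psi_i controlling how strongly it is pulled toward the shared
    # knowledge; every tau iterations the server aggregates.
    rng = np.random.default_rng(1)
    n_clients, dim, tau, lr = 4, 5, 5, 0.05

    theta = [rng.normal(size=dim) for _ in range(n_clients)]    # {theta_i^0}
    psi = [1.0] * n_clients                                     # {psi_i^0}
    shared = np.zeros(dim)                                      # server-side model
    targets = [rng.normal(size=dim) for _ in range(n_clients)]  # toy local optima

    for t in range(1, 51):
        for i in range(n_clients):
            g_task = 2.0 * (theta[i] - targets[i])    # toy quadratic task loss
            g_dist = (theta[i] - shared) / psi[i]     # adaptive distillation pull
            theta[i] = theta[i] - lr * (g_task + g_dist)
            # Gradient step on psi_i for d/(2*psi) + 0.5*log(psi), where
            # d = ||theta_i - shared||^2; large disagreement with the shared
            # model grows psi_i, which weakens the distillation pull.
            d = float(np.sum((theta[i] - shared) ** 2))
            psi[i] = max(1e-3, psi[i] - lr * (-d / (2 * psi[i] ** 2) + 0.5 / psi[i]))
        if t % tau == 0:  # "if tau divides t": on server, aggregate
            shared = np.mean(theta, axis=0)

The learned ψ_i makes the personalization adaptive: a client whose data disagrees with the shared knowledge automatically relies on it less, while the periodic server step plays the role of the "On Server do" line in the fragment.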