2 files changed, +2 −2 lines changed
@@ -112,7 +112,7 @@ The origin of the **data heterogeneity** phenomenon is the characteristics of us
 
 ***Knowledge-distillation-based pFL (more in [HtFLlib](https://github.com/TsingZ0/HtFLlib))***
 
-- **FedDistill (FD)** — [Communication-Efficient On-Device Machine Learning: Federated Distillation and Augmentation under Non-IID Private Data](https://arxiv.org/pdf/1811.11479.pdf) *2018*
+- **FD (FedDistill)** — [Communication-Efficient On-Device Machine Learning: Federated Distillation and Augmentation under Non-IID Private Data](https://arxiv.org/pdf/1811.11479.pdf) *2018*
 - **FML** — [Federated Mutual Learning](https://arxiv.org/abs/2006.16765) *2020*
 - **FedKD** — [Communication-efficient federated learning via knowledge distillation](https://www.nature.com/articles/s41467-022-29763-x) *Nature Communications 2022*
 - **FedProto** — [FedProto: Federated Prototype Learning across Heterogeneous Clients](https://ojs.aaai.org/index.php/AAAI/article/view/20819) *AAAI 2022*
@@ -290,7 +290,7 @@ <h3>Personalized FL (pFL)</h3>
 
 <li><strong><em>Knowledge-distillation-based pFL (more in <a href="https://github.com/TsingZ0/HtFLlib">HtFLlib</a>)</em></strong></li>
 <ul>
-  <li><strong>FedDistill (FD)</strong> — <a href="https://arxiv.org/pdf/1811.11479.pdf">Communication-Efficient On-Device Machine Learning: Federated Distillation and Augmentation under Non-IID Private Data</a> <em>2018</em></li>
+  <li><strong>FD (FedDistill)</strong> — <a href="https://arxiv.org/pdf/1811.11479.pdf">Communication-Efficient On-Device Machine Learning: Federated Distillation and Augmentation under Non-IID Private Data</a> <em>2018</em></li>
   <li><strong>FML</strong> — <a href="https://arxiv.org/abs/2006.16765">Federated Mutual Learning</a> <em>2020</em></li>
   <li><strong>FedKD</strong> — <a href="https://www.nature.com/articles/s41467-022-29763-x">Communication-efficient federated learning via knowledge distillation</a> <em>Nature Communications 2022</em></li>
   <li><strong>FedProto</strong> — <a href="https://ojs.aaai.org/index.php/AAAI/article/view/20819">FedProto: Federated Prototype Learning across Heterogeneous Clients</a> <em>AAAI 2022</em></li>
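The renamed entry, FD (FedDistill), refers to federated distillation, in which clients exchange compact per-class knowledge (averaged logits) rather than full model weights. The sketch below illustrates that aggregation idea only; the function names and shapes are illustrative assumptions, not code from the cited paper or from HtFLlib.

```python
import numpy as np

def client_label_logits(logits, labels, num_classes):
    """Summarize one client's local data as a mean logit vector per class.

    logits: (n_samples, logit_dim) array of local model outputs.
    labels: (n_samples,) array of integer class labels.
    Returns a (num_classes, logit_dim) summary; classes absent locally stay zero.
    """
    summary = np.zeros((num_classes, logits.shape[1]))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            summary[c] = logits[mask].mean(axis=0)
    return summary

def server_average(client_summaries):
    """Server step: average the per-class logit summaries across clients."""
    return np.mean(np.stack(client_summaries), axis=0)
```

Each client would then use the returned global per-class logits as soft distillation targets in its next local training round, which is what keeps the communication cost independent of model size.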