
Commit 054eca0

AmberLJC and mosharaf authored

fix fedtrans paper (#254)

* fix fedtrans paper
* Update FedTrans authors list

Co-authored-by: Mosharaf Chowdhury <mosharaf@users.noreply.github.com>

1 parent c75d72d commit 054eca0

File tree

1 file changed: +2 −1 lines changed


source/_data/SymbioticLab.bib

Lines changed: 2 additions & 1 deletion
@@ -1724,12 +1724,13 @@ @Article{crosslayer-energy:arxiv24
 }

 @InProceedings{fedtrans:mlsys24,
-  author           = {Yuxuan Zhu and Jiachen Liu and Fan Lai and Mosharaf Chowdhury},
+  author           = {Yuxuan Zhu and Jiachen Liu and Mosharaf Chowdhury and Fan Lai},
   booktitle        = {MLSys},
   title            = {{FedTrans}: Efficient Federated Learning via Multi-Model Transformation},
   year             = {2024},
   publist_confkey  = {MLSys'24},
   publist_topic    = {Systems + AI},
+  publist_link     = {paper || fedtrans-mlsys24.pdf},
   publist_topic    = {Wide-Area Computing},
   publist_abstract = {
   Federated learning (FL) aims to train machine learning (ML) models across potentially millions of edge client devices. Yet, training and customizing models for FL clients is notoriously challenging due to the heterogeneity of client data, device capabilities, and the massive scale of clients, making individualized model exploration prohibitively expensive. State-of-the-art FL solutions personalize a globally trained model or concurrently train multiple models, but they often incur suboptimal model accuracy and huge training costs.
