Monday, May 20, 2019 • 1:50pm - 2:10pm
Towards Taming the Resource and Data Heterogeneity in Federated Learning


Machine learning model training often requires data from multiple parties. However, data owners sometimes cannot, or are unwilling to, share their data due to legal or privacy constraints, yet would still like to benefit from training a model jointly with other parties. To this end, federated learning (FL) has emerged as an alternative way to do collaborative model training without sharing the training data. Such collaboration leads to more accurate and performant models than any single party, owning only a partial set of the data sources, could hope to learn in isolation.

In this paper, we study the impact of resource heterogeneity (e.g., CPU, memory, and network resources) and data heterogeneity (e.g., training dataset sizes) on FL training time. We then discuss the research problems and challenges involved in taming such resource and data heterogeneity in FL systems.
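To see why such heterogeneity matters, consider that in synchronous FL each round waits for the slowest client, so stragglers dominate round time. The sketch below is not from the paper; the client model (local training time proportional to dataset size and inversely proportional to CPU speed) and all parameter values are simplifying assumptions for illustration.

```python
import random

random.seed(0)  # deterministic for reproducibility

def client_time(dataset_size, cpu_speed):
    # Assumed cost model: local training time grows with dataset size
    # and shrinks with available compute.
    return dataset_size / cpu_speed

# Hypothetical heterogeneous clients: varying dataset sizes (100-1000
# samples) and CPU speeds (0.5x-2.0x of a baseline).
clients = [(random.randint(100, 1000), random.uniform(0.5, 2.0))
           for _ in range(10)]

times = [client_time(n, s) for n, s in clients]
round_time = max(times)              # a synchronous round ends only when
mean_time = sum(times) / len(times)  # the slowest client (straggler) finishes

print(f"mean client time: {mean_time:.1f}, "
      f"round time (set by straggler): {round_time:.1f}")
```

Under this toy model, the gap between the mean client time and the round time is the cost of heterogeneity that an FL system would need to tame, e.g., via client selection or asynchronous aggregation.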

Speakers

Zheng Chai, George Mason University
Hannan Fayyaz, York University
Zeshan Fayyaz, Ryerson University
Ali Anwar, IBM Research–Almaden
Yi Zhou, IBM Research–Almaden
Nathalie Baracaldo, IBM Research–Almaden
Heiko Ludwig, IBM Research–Almaden
Yue Cheng, George Mason University


Stevens Creek Room
