Friday, September 25, 2020
A defining characteristic of federated learning is the presence of heterogeneity, i.e., data and compute resources may differ significantly across the network. In this talk I show that the challenge of heterogeneity pervades the machine learning process in federated settings, affecting issues such as optimization, modeling, and fairness. In terms of optimization, I discuss FedProx, a distributed optimization method that offers robustness to both systems heterogeneity and statistical heterogeneity. I then explore the role that heterogeneity plays in delivering models that are accurate and fair to all users/devices in the network. Our work here extends classical ideas in multi-task learning and alpha-fairness to large-scale heterogeneous networks, enabling flexible, accurate, and fair federated learning.
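The core idea behind FedProx can be illustrated with a minimal sketch: each client minimizes its local loss plus a proximal term (mu/2)*||w - w_global||^2 that keeps local updates close to the current global model, and the server averages the resulting local models. The code below is a simplified illustration using a least-squares loss; the function name, client data layout, and all hyperparameter values are assumptions for this example, not the paper's reference implementation.

```python
import numpy as np

def fedprox_round(clients, w_global, mu=0.5, lr=0.01, local_steps=5):
    """One round of FedProx-style training (illustrative sketch only).

    clients: list of (X, y) pairs, one per device, with local data.
    The proximal term mu*(w - w_global) in the gradient limits how far
    each client drifts from the global model, which is what gives
    robustness when local data distributions or local work differ.
    """
    updates = []
    for X, y in clients:
        w = w_global.copy()
        for _ in range(local_steps):
            grad = 2 * X.T @ (X @ w - y) / len(y)  # local least-squares gradient
            grad += mu * (w - w_global)            # gradient of the proximal term
            w -= lr * grad
        updates.append(w)
    return np.mean(updates, axis=0)                # server averages local models
```

Note that because the proximal term only penalizes deviation from the global model, clients may also run different numbers of local steps (a form of systems heterogeneity) without their partial updates destabilizing the aggregate.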
Virginia Smith is an assistant professor in the Machine Learning Department at Carnegie Mellon University, and a courtesy faculty member in the Electrical and Computer Engineering Department. Her research interests lie at the intersection of machine learning, optimization, and computer systems. A unifying theme of her research is to develop machine learning methods and theory that effectively leverage prior knowledge and account for practical constraints (e.g., hardware capabilities, network capacity, statistical structure). Specific topics include: large-scale machine learning, distributed optimization, resource-constrained learning, multi-task learning, transfer learning, and data augmentation.