BEGIN:VCALENDAR
VERSION:2.0
PRODID:researchseminars.org
CALSCALE:GREGORIAN
X-WR-CALNAME:researchseminars.org
BEGIN:VEVENT
SUMMARY:Ahmed Khaled (Cairo University)
DTSTART:20200513T130000Z
DTEND:20200513T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/1
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 1/">On the Convergence of Local SGD on Identical and Heterogeneous Data</a
 >\nby Ahmed Khaled (Cairo University) as part of Federated Learning One Wo
 rld Seminar\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/1/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Blake Woodworth (TTIC)
DTSTART:20200520T130000Z
DTEND:20200520T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/2
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 2/">Is local SGD better than minibatch SGD?</a>\nby Blake Woodworth (TTIC)
  as part of Federated Learning One World Seminar\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/2/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Dimitris Papailiopoulos (University of Wisconsin-Madison)
DTSTART:20200527T130000Z
DTEND:20200527T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/3
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 3/">Robustness in federated learning may be impossible without an all-know
 ing central authority</a>\nby Dimitris Papailiopoulos (University of Wisco
 nsin-Madison) as part of Federated Learning One World Seminar\n\nAbstract:
  TBA\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/3/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Sai Praneeth Karimireddy (EPFL)
DTSTART:20200610T130000Z
DTEND:20200610T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/4
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 4/">Stochastic controlled averaging for federated learning</a>\nby Sai Pra
 neeth Karimireddy (EPFL) as part of Federated Learning One World Seminar\n
 \nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/4/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Filip Hanzely (KAUST)
DTSTART:20200617T130000Z
DTEND:20200617T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/5
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 5/">Federated learning of a mixture of global and local models: Local SGD 
 and optimal algorithms</a>\nby Filip Hanzely (KAUST) as part of Federated 
 Learning One World Seminar\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/5/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Hadrien Hendrikx (École Normale Supérieure & INRIA)
DTSTART:20200624T130000Z
DTEND:20200624T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/6
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 6/">Statistical preconditioning for federated learning</a>\nby Hadrien Hen
 drikx (École Normale Supérieure &amp; INRIA) as part of Federated Lea
 rning One World Seminar\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/6/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Alireza Fallah (MIT)
DTSTART:20200701T130000Z
DTEND:20200701T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/7
DESCRIPTION:by Alireza Fallah (MIT) as part of Federated Learning One Worl
 d Seminar\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/7/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Jakub Konečný (Google)
DTSTART:20200708T130000Z
DTEND:20200708T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/8
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 8/">On the outsized importance of learning rates in local update methods</
 a>\nby Jakub Konečný (Google) as part of Federated Learning One World Se
 minar\n\n\nAbstract\nIn this work\, we study a family of algorithms\, whic
 h we refer to as local update methods\, that generalize many federated lea
 rning and meta-learning algorithms. We prove that for quadratic objectives
 \, local update methods perform stochastic gradient descent on a surrogate
  loss function which we exactly characterize. We show that the choice of c
 lient learning rate controls the condition number of that surrogate loss\,
  as well as the distance between the minimizers of the surrogate and true 
 loss functions. We use this theory to derive novel convergence rates for f
 ederated averaging that showcase this trade-off between the condition numb
 er of the surrogate loss and its alignment with the true loss function. We
  validate our results empirically\, showing that in communication-limited 
 settings\, proper learning rate tuning is often sufficient to reach near-o
 ptimal behavior. We also present a practical method for automatic learning
  rate decay in local update methods that helps reduce the need for learnin
 g rate tuning\, and highlight its empirical performance on a variety of ta
 sks and datasets.\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/8/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Sashank Reddi (Google)
DTSTART:20200715T130000Z
DTEND:20200715T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/9
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 9/">Adaptive federated optimization</a>\nby Sashank Reddi (Google) as part
  of Federated Learning One World Seminar\n\n\nAbstract\nFederated learning
  is a distributed machine learning paradigm in which a large number of cli
 ents coordinate with a central server to learn a model without sharing the
 ir own training data. Due to the heterogeneity of the client datasets\, st
 andard federated optimization methods such as Federated Averaging (FedAvg)
  are often difficult to tune and exhibit unfavorable convergence behavior.
  In non-federated settings\, adaptive optimization methods have had notabl
 e success in combating such issues. In this work\, we propose federated ve
 rsions of adaptive optimizers\, including Adagrad\, Adam\, and Yogi\, and 
 analyze their convergence in the presence of heterogeneous data for genera
 l nonconvex settings. Our results highlight the interplay between client h
 eterogeneity and communication efficiency. We also perform extensive exper
 iments on these methods and show that the use of adaptive optimizers can s
 ignificantly improve the performance of federated learning.\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/9/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Nati Srebro (TTIC)
DTSTART:20200722T130000Z
DTEND:20200722T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/10
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 10/">Heterogeneity and pluralism in distributed learning</a>\nby Nati Sreb
 ro (TTIC) as part of Federated Learning One World Seminar\n\nAbstract: TBA
 \n
LOCATION:https://stable.researchseminars.org/talk/FLOW/10/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Krishna Pillutla (University of Washington)
DTSTART:20200729T130000Z
DTEND:20200729T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/11
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 11/">Robust Aggregation for Federated Learning</a>\nby Krishna Pillutla (U
 niversity of Washington) as part of Federated Learning One World Seminar\n
 \n\nAbstract\nKrishna Pillutla\, Sham M. Kakade\, Zaid Harchaoui. Robust A
 ggregation for Federated Learning\, arXiv:1912.13445\, 2019.\n\nYassine La
 guel\, Krishna Pillutla\, Jérôme Malick\, Zaid Harchaoui. Device Heterog
 eneity in Federated Learning: A Superquantile Approach\, arXiv:2002.11223\
 , 2020.\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/11/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Konstantin Mishchenko (KAUST)
DTSTART:20200805T130000Z
DTEND:20200805T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/12
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 12/">Local decentralized gradient descent with fast convergence</a>\nby Ko
 nstantin Mishchenko (KAUST) as part of Federated Learning One World Semina
 r\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/12/
END:VEVENT
BEGIN:VEVENT
SUMMARY:Peter Kairouz (Google)
DTSTART:20200812T130000Z
DTEND:20200812T140000Z
DTSTAMP:20260404T111101Z
UID:FLOW/13
DESCRIPTION:Title: <a href="https://stable.researchseminars.org/talk/FLOW/
 13/">Federated analytics</a>\nby Peter Kairouz (Google) as part of Federat
 ed Learning One World Seminar\n\nAbstract: TBA\n
LOCATION:https://stable.researchseminars.org/talk/FLOW/13/
END:VEVENT
END:VCALENDAR
