Lecture: What We Talk About When We Talk About Distributed Optimization (Published: 2025-12-11)


Title: What We Talk About When We Talk About Distributed Optimization

Speaker: Junhui Zhang, Ph.D. Candidate, MIT Operations Research Center

Host: Tang Zhuodong (唐卓栋), Assistant Professor, Antai College of Economics and Management, Shanghai Jiao Tong University

Time: Thursday, December 18, 2025, 14:00–15:30

Venue: Room A507, Antai Building

Abstract:

We propose two first-order methods for convex, non-smooth, distributed optimization problems, hereafter called Multi-Timescale Gradient Sliding (MT-GS) and its accelerated variant (AMT-GS). MT-GS and AMT-GS can exploit similarities between the (local) objectives to reduce the number of communication rounds, are flexible in that different subsets of agents can communicate at different, user-chosen rates, and are fully deterministic. These three desirable features are achieved through a block-decomposable primal-dual formulation and a multi-timescale variant of the sliding method introduced in Lan et al. (2020) and Lan (2016), in which different dual blocks are updated at potentially different rates.

To find an ϵ-suboptimal solution, the complexities of our algorithms achieve optimal dependency on ϵ: MT-GS needs O(rA/ϵ) communication rounds and O(r/ϵ²) subgradient steps for Lipschitz objectives, and AMT-GS needs O(rA/√(ϵµ)) communication rounds and O(r/(ϵµ)) subgradient steps if the objectives are also µ-strongly convex. Here, r measures the "average rate of updates" for dual blocks, and A measures similarities between (subgradients of) local functions. In addition, the linear dependency of communication rounds on A is optimal (Arjevani and Shamir 2015), thereby providing a positive answer to the open question of whether such dependency is achievable for non-smooth objectives (Arjevani and Shamir 2015).
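To give a feel for the setting the abstract describes, here is a minimal toy sketch (NOT the authors' MT-GS, and not using their primal-dual sliding machinery): distributed subgradient descent on a non-smooth, Lipschitz objective, where agents take cheap local subgradient steps and only communicate (average their iterates) every K rounds. All function choices and parameters below are illustrative assumptions.

```python
import math

# Toy example: each agent i holds the 1-Lipschitz, non-smooth local
# objective f_i(x) = |x - a_i|; the global objective sum_i f_i(x) is
# minimized at the median of the a_i. Agents run local subgradient
# steps and average their iterates only every `comm_every` rounds,
# mimicking the reduced-communication regime the talk is about.
def distributed_subgradient(targets, rounds=2000, comm_every=10, c=0.5):
    xs = [0.0] * len(targets)           # one local iterate per agent
    for t in range(rounds):
        eta = c / math.sqrt(t + 1)      # diminishing stepsize (non-smooth case)
        # subgradient of |x - a| is sign(x - a), taken as 0 at the kink
        xs = [x - eta * ((x > a) - (x < a)) for x, a in zip(xs, targets)]
        if (t + 1) % comm_every == 0:   # communication round: full averaging
            avg = sum(xs) / len(xs)
            xs = [avg] * len(xs)
    return sum(xs) / len(xs)

# The minimizer of sum |x - a_i| for a = [0, 1, 2, 3, 10] is the median, 2.
x_hat = distributed_subgradient([0.0, 1.0, 2.0, 3.0, 10.0])
```

Between communication rounds each agent drifts toward its own minimizer, so less frequent averaging trades communication for consensus error; the MT-GS methods in the talk manage this trade-off far more carefully, letting different agent subsets communicate at different rates with provably optimal round complexity.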

Speaker Bio:

Junhui Zhang is a fifth-year PhD student at the MIT Operations Research Center, advised by Prof. Patrick Jaillet. She is broadly interested in optimization, with a focus on developing efficient algorithms (in communication, oracle calls, memory, etc.) for large-scale problems and for sequential decision making. Previously, she graduated from Columbia University in 2021 with a B.S. degree in Applied Math.

 

All faculty and students are welcome!

