DISH: A Distributed Hybrid Optimization Method Leveraging System Heterogeneity
IEEE Transactions on Signal Processing (IF 4.6), Pub Date: 2024-08-26, DOI: 10.1109/tsp.2024.3450351
Xiaochun Niu, Ermin Wei

We study distributed optimization problems over multi-agent networks, including consensus and network flow problems. Existing distributed methods neglect the heterogeneity among agents' computational capabilities, limiting their effectiveness. To address this, we propose DISH, a distributed hybrid method that leverages system heterogeneity. DISH allows agents with higher computational capabilities or lower computational costs to perform local Newton-type updates while others adopt simpler gradient-type updates. Notably, DISH covers existing methods such as EXTRA, DIGing, and ESOM-0 as special cases. To analyze DISH's performance with general update directions, we formulate distributed problems as minimax problems and introduce GRAND (gradient-related ascent and descent) and its alternating version, Alt-GRAND, for solving them. GRAND generalizes DISH to centralized minimax settings and accommodates various descent and ascent update directions, including gradient-type, Newton-type, scaled-gradient, and other general directions that lie within acute angles of the partial gradients. Theoretical analysis establishes global sublinear and linear convergence rates for GRAND and Alt-GRAND in strongly-convex-nonconcave and strongly-convex-PL settings, which yield linear rates for DISH. In addition, we derive the local superlinear convergence of Newton-based variants of GRAND in centralized settings to show the potential and limitations of Newton's method in distributed settings. Numerical experiments validate the effectiveness of our methods.
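To make the descent-ascent template concrete, below is a minimal sketch of a GRAND-style loop on a toy saddle-point problem (the Lagrangian of an equality-constrained quadratic program). It is written for exposition under assumed problem data and step sizes, not the paper's exact algorithm or step-size rules: the x-player takes a Newton-type (Hessian-scaled) direction while the y-player takes a plain gradient direction, illustrating the hybrid of update types that DISH/GRAND permit, since any positive-definite scaling keeps the direction within an acute angle of the partial gradient.

```python
# Illustrative sketch (hypothetical data and step sizes) of a GRAND-style
# simultaneous descent-ascent loop on the saddle problem
#     min_x max_y  L(x, y) = (1/2) x' P x + y' (A x - b),
# i.e., the Lagrangian of  min (1/2) x' P x  s.t.  A x = b.
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3
P = np.diag(rng.uniform(1.0, 3.0, n))     # strongly convex quadratic in x
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, 2)                 # normalize so the steps below are stable
b = rng.standard_normal(m)

x, y = np.zeros(n), np.zeros(m)
P_inv = np.linalg.inv(P)                  # Newton-type scaling for the x-player
alpha, beta = 1.0, 0.5                    # hypothetical step sizes

for _ in range(5000):
    dx = P_inv @ (P @ x + A.T @ y)        # Newton-type descent direction in x
    dy = A @ x - b                        # plain gradient ascent direction in y
    x, y = x - alpha * dx, y + beta * dy  # simultaneous descent-ascent step

# Compare against the exact KKT solution of the constrained problem.
kkt = np.block([[P, A.T], [A, np.zeros((m, m))]])
sol = np.linalg.solve(kkt, np.concatenate([np.zeros(n), b]))
print("||x - x*|| =", np.linalg.norm(x - sol[:n]))
print("||Ax - b|| =", np.linalg.norm(A @ x - b))
```

Swapping `P_inv` for the identity recovers a plain gradient-type x-update; any other positive-definite preconditioner would also satisfy the acute-angle condition that the GRAND analysis requires.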

Updated: 2024-08-26