Title: Distributed Sub-gradient Algorithms with Limited Communications
Authors: Rini, Stefano
Rao, Milind
Goldsmith, Andrea
Institutional Affiliation: National Chiao Tung University
Keywords: Distributed convex optimization;Decentralized methods;Gradient descent methods;Dual averaging;Nesterov gradient acceleration;Random projections
Issue Date: 1-Jan-2019
Abstract: We consider the distributed convex optimization scenario in which nodes in a network collectively find the minimum of a function using only local communications and computations. Various sub-gradient algorithms have been developed for this optimization setting for the case in which the global function factorizes as the sum of local functions distributed to the nodes in the network, including the distributed (i) online gradient descent, (ii) Nesterov gradient descent, and (iii) dual averaging algorithms. Generally speaking, these sub-gradient algorithms assume that, in each communication round, nodes can exchange messages of size comparable to that of the optimization variable. For many high-dimensional optimization problems, this communication requirement exceeds the capabilities of the communication network supporting the distributed optimization. For this reason, we propose a dimensionality reduction technique based on random projections to adapt these distributed optimization algorithms to networks with communication links of limited capacity. In particular, we analyze the performance of the proposed dimensionality reduction technique for the three algorithms above, i.e., (i)-(iii). Numerical simulations are presented that illustrate the performance of the proposed approach for these three algorithms.
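The core idea of the abstract, compressing each per-round message with a random projection and lifting it back at the receivers, can be sketched in a few lines of NumPy. The sketch below is only illustrative and is not the authors' algorithm: the ring topology, mixing matrix W, least-squares local objectives, shared projection P, and step-size rule are all assumptions made for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    n_nodes, dim, k = 5, 50, 10     # k < dim: per-message size after projection
    T = 300                          # number of communication rounds

    # Local least-squares objectives f_i(x) = 0.5 * ||A_i x - b_i||^2, so the
    # global objective is the sum of local functions, as in the abstract.
    A = [rng.standard_normal((10, dim)) for _ in range(n_nodes)]
    b = [rng.standard_normal(10) for _ in range(n_nodes)]

    def local_grad(i, x):
        return A[i].T @ (A[i] @ x - b[i])

    # Step size from the largest local smoothness constant (illustrative).
    L = max(np.linalg.norm(Ai, 2) ** 2 for Ai in A)
    step = 0.5 / L

    # Doubly stochastic mixing weights on a ring topology (assumed network).
    W = np.zeros((n_nodes, n_nodes))
    for i in range(n_nodes):
        W[i, i] = 0.5
        W[i, (i - 1) % n_nodes] = 0.25
        W[i, (i + 1) % n_nodes] = 0.25

    # Shared random projection: each node transmits P @ x (k values instead
    # of dim); receivers lift with P.T. With this scaling E[P.T @ P] = I, so
    # the lifted message is an unbiased but noisy proxy for the true iterate.
    P = rng.standard_normal((k, dim)) / np.sqrt(k)

    x = np.zeros((n_nodes, dim))     # row i holds node i's iterate
    for t in range(T):
        sent = x @ P.T               # k-dimensional messages on every link
        lifted = sent @ P            # coarse reconstruction at the receivers
        x = W @ lifted - step * np.array(
            [local_grad(i, x[i]) for i in range(n_nodes)]
        )

    x_bar = x.mean(axis=0)           # network-average iterate

Note that this naive variant also passes a node's own state through P, discarding information the node already holds exactly; compressed-consensus schemes typically keep the exact local iterate and project only the transmitted messages, trading the k/dim communication saving against projection noise.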
URI: http://hdl.handle.net/11536/155066
ISBN: 978-1-7281-4300-2
ISSN: 1058-6393
Journal: CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS
Start Page: 2171
End Page: 2175
Appears in Collections: Conference Papers