### Jorge Cortés

#### Professor

Distributed subgradient methods for saddle-point problems

D. Mateos-Núñez, J. Cortés

Proceedings of the IEEE Conference on Decision and Control, Osaka, Japan, 2015, pp. 5462-5467

### Abstract

We present provably correct distributed subgradient methods for
general min-max problems with agreement constraints on a subset of
the arguments of both the convex and concave parts. Applications
include separable constrained minimization problems where each
constraint is a sum of convex functions of the local variables. The
proposed algorithm then reduces to primal-dual updates using local
subgradients and Laplacian averaging on local copies of the
multipliers associated with the global constraints. The framework
also encodes minimization problems with semidefinite constraints,
which results in novel distributed strategies that are scalable if
the order of the matrix inequalities is independent of the size of
the network. As a consequence of our analysis, and for the case of general
convex-concave functions, we establish the convergence of the
running time-averages of the local estimates to a saddle point under
periodic connectivity of the communication graphs. Specifically,
choosing the subgradient step-sizes in a suitable way, we show that the
evaluation error is proportional to 1/√t, where t is the
iteration step.
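The primal-dual update with Laplacian averaging can be sketched on a toy separable instance. The problem data, ring topology, and step-size choices below are illustrative assumptions for exposition, not the paper's exact setup: each agent holds a primal variable and a local copy of the single coupling multiplier, runs local subgradient steps, and averages its multiplier copy with its neighbors.

```python
import numpy as np

# Illustrative instance: n agents minimize sum_i (x_i - a_i)^2 subject to
# the coupling constraint sum_i x_i <= c.  Each agent keeps its primal
# variable x_i and a local copy z_i of the single multiplier; the copies
# are driven toward agreement by Laplacian averaging over a ring graph.
n = 4
a = np.array([1.0, 2.0, 3.0, 4.0])   # local targets (assumed data)
c = 6.0                              # global resource budget

# Graph Laplacian of the undirected ring 0-1-2-3-0
L = np.array([[ 2., -1.,  0., -1.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [-1.,  0., -1.,  2.]])

x = np.zeros(n)       # primal variables
z = np.zeros(n)       # local multiplier copies
x_avg = np.zeros(n)   # running time-averages of the primal iterates

sigma = 0.25          # consensus gain; needs sigma * lambda_max(L) <= 1
T = 50000
for t in range(1, T + 1):
    eta = 1.0 / np.sqrt(t)              # diminishing step size
    gx = 2.0 * (x - a) + z              # local subgradient in x of the Lagrangian
    gz = x - c / n                      # local subgradient in z (constraint split)
    x = x - eta * gx                    # primal subgradient descent
    z = z - sigma * (L @ z) + eta * gz  # Laplacian averaging + dual ascent
    z = np.maximum(z, 0.0)              # multipliers remain nonnegative
    x_avg += (x - x_avg) / t            # running time-average

# The time-averages approach the saddle point x* = a - z*/2 with z* = 2,
# i.e. roughly [0, 1, 2, 3], at an evaluation-error rate of order 1/sqrt(t).
print(np.round(x_avg, 2))
```

Only neighbor-to-neighbor exchanges of the multiplier copies are required, which reflects the scalability point above: the communicated quantity scales with the number of coupling constraints, not with the network size.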


Mechanical and Aerospace Engineering,
University of California, San Diego

9500 Gilman Dr,
La Jolla, California, 92093-0411

Ph: 1-858-822-7930

Fax: 1-858-822-3107

cortes at ucsd.edu

Skype id:
jorgilliyo