Research Group Numerical Mathematics (Partial Differential Equations)

Workshop on Optimization on Manifolds

Topic

The workshop aims to bring together researchers working in theoretical or computational aspects of optimization on manifolds.

Time and Venue

The workshop will be held on Friday, August 9th, 10:45 – 17:15, in room 705 of the building at Reichenhainer Straße 41, and will conclude with a joint dinner the same day.

Organizers

The workshop is organized by Ronny Bergmann, Glaydston de Carvalho Bento, Roland Herzog, and José Vidal Núñez, with support from Anne-Kristin Glanzberg.

Scientific Program

Time Speaker Title & Abstract
10:45 – 11:00 Workshop Opening
11:00 – 11:30 R. Bergmann

Based on a notion of Fenchel duality on Riemannian manifolds, we investigate the saddle-point problem related to a nonsmooth convex optimization problem. We derive a primal-dual hybrid gradient algorithm to compute the saddle point, using either an exact or a linearized approach for the nonlinear operator involved. We give a sufficient condition for convergence of the linearized algorithm on Hadamard manifolds. Numerical examples illustrate that on Hadamard manifolds we are on par with state-of-the-art algorithms, while on general manifolds we outperform existing approaches.
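The Riemannian construction is the subject of the talk; its Euclidean special case is the Chambolle-Pock primal-dual hybrid gradient method. A minimal sketch for 1D total-variation denoising, with illustrative parameter values (the Riemannian linearization does not appear here):

```python
import numpy as np

def pdhg_tv1d(b, lam=1.0, sigma=0.25, tau=0.25, iters=500):
    """Chambolle-Pock PDHG for min_x 0.5*||x - b||^2 + lam*||D x||_1,
    where D is the 1D forward-difference operator (||D||^2 <= 4, so
    sigma * tau * ||D||^2 < 1 holds for the default step sizes)."""
    D = lambda v: np.diff(v)                                        # forward differences
    Dt = lambda w: np.concatenate(([-w[0]], -np.diff(w), [w[-1]]))  # adjoint of D
    x = b.copy()
    xbar = x.copy()
    y = np.zeros(b.size - 1)                                        # dual variable
    for _ in range(iters):
        # dual step: prox of the conjugate of lam*||.||_1 is a clip
        y = np.clip(y + sigma * D(xbar), -lam, lam)
        # primal step: prox of tau * 0.5*||. - b||^2 in closed form
        x_new = (x - tau * Dt(y) + tau * b) / (1.0 + tau)
        xbar = 2.0 * x_new - x                                      # extrapolation
        x = x_new
    return x
```

With lam = 0 the iteration leaves the data unchanged; larger lam flattens the signal, reducing its total variation.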

11:30 – 12:00 O. P. Ferreira

In this talk, we propose nonlinear conjugate gradient methods for finding critical points of vector-valued functions on Riemannian manifolds with respect to the partial order induced by a closed, convex, and pointed cone with non-empty interior. No convexity assumption is made on the objectives. The Wolfe and Zoutendijk conditions are extended to vector-valued optimization; in particular, we show that there exist intervals of step-sizes satisfying the Wolfe-type conditions. As an application, we use the conjugate gradient method to solve a vectorial version of the facility location problem.
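For orientation, a scalar-valued Euclidean sketch of nonlinear conjugate gradients (Fletcher-Reeves) with a weak Wolfe line search; the vector-valued and Riemannian ingredients of the talk are not reproduced, and all names are illustrative:

```python
import numpy as np

def wolfe_step(f, g, x, d, c1=1e-4, c2=0.4, maxit=50):
    """Bisection search for a step satisfying the weak Wolfe conditions."""
    fx, slope = f(x), g(x) @ d
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(maxit):
        if f(x + t * d) > fx + c1 * t * slope:     # Armijo fails: shrink
            hi = t
            t = 0.5 * (lo + hi)
        elif g(x + t * d) @ d < c2 * slope:        # curvature fails: grow
            lo = t
            t = 2.0 * t if hi == np.inf else 0.5 * (lo + hi)
        else:
            return t
    return t

def cg_fletcher_reeves(f, g, x0, iters=100, tol=1e-8):
    """Nonlinear CG (Fletcher-Reeves) with a Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    r = -g(x)
    d = r.copy()
    for _ in range(iters):
        if np.linalg.norm(r) < tol:
            break
        t = wolfe_step(f, g, x, d)
        x = x + t * d
        r_new = -g(x)
        beta = (r_new @ r_new) / (r @ r)           # Fletcher-Reeves coefficient
        d = r_new + beta * d
        if r_new @ d <= 0.0:                       # safeguard: restart on non-descent
            d = r_new.copy()
        r = r_new
    return x
```

On a strictly convex quadratic the iteration converges to the unique critical point.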

12:00 – 12:30 J. X. da Cruz Neto

In this talk, we apply the steepest descent method to compute the Riemannian center of mass on Hadamard manifolds. To this end, we extend the convergence theory of the method to the Hadamard setting for continuously differentiable (possibly nonconvex) functions which satisfy the Kurdyka-Łojasiewicz property. Numerical experiments computing the \(L^1\) and \(L^2\) centers of mass of symmetric positive definite matrices are presented, using two different step-size rules.
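A minimal sketch of the \(L^2\) case for symmetric positive definite matrices, assuming the affine-invariant metric and a constant step-size (the talk's step-size rules and the \(L^1\) case are not reproduced):

```python
import numpy as np

def _sym_fun(S, fun):
    """Apply a scalar function to a symmetric matrix via its eigenvalues."""
    w, Q = np.linalg.eigh(S)
    return (Q * fun(w)) @ Q.T

def karcher_mean(mats, step=1.0, iters=50, tol=1e-12):
    """Steepest descent for the L2 Riemannian center of mass of SPD
    matrices under the affine-invariant metric, with a constant step."""
    X = sum(mats) / len(mats)            # arithmetic mean as starting point
    for _ in range(iters):
        R = _sym_fun(X, np.sqrt)         # X^{1/2}
        Rinv = np.linalg.inv(R)
        # negative Riemannian gradient, expressed at the identity:
        # the mean of log(X^{-1/2} A_i X^{-1/2})
        V = sum(_sym_fun(Rinv @ A @ Rinv, np.log) for A in mats) / len(mats)
        if np.linalg.norm(V) < tol:
            break
        X = R @ _sym_fun(step * V, np.exp) @ R   # Exp_X of the scaled direction
    return X
```

For commuting matrices the iteration reproduces the entrywise geometric mean in a single step.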

12:30 – 14:00 Lunch
@Nomad
14:00 – 14:30 S. Z. Németh

This talk is about spherical quasi-convexity of quadratic functions on spherically self-dual sets. Sufficient conditions for spherical quasi-convexity on self-dual sets are presented. A partial characterization of spherical quasi-convexity on spherical Lorentz sets is given and some examples are provided.

14:30 – 15:00 J. C. de Oliveira Souza

In this work, we consider a proximal-type method for solving minimization problems involving smooth DC (difference of convex) functions on Hadamard manifolds. The method accelerates the convergence of the classical proximal point method for DC functions and, in particular, for convex functions. We prove that the point which solves the proximal subproblem can be used to define a descent direction of the objective function. Our method uses this direction together with a line search to find critical points of the objective function. We illustrate our results with some numerical experiments.

The talk is based on joint work with Yldenilson Almeida, Paulo Roberto Oliveira, and João Xavier da Cruz Neto.
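A Euclidean sketch of the idea, assuming the proximal operator of the convex part is available in closed form; the function and parameters below are illustrative, not the talk's examples:

```python
import numpy as np

def prox_dc(f, grad_h, prox_g, x0, lam=1.0, iters=100, tol=1e-10):
    """Proximal-type method for f = g - h (g, h convex, h smooth):
    the proximal subproblem yields a descent direction, followed by
    a simple line search that may extrapolate past the prox point."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = prox_g(x + lam * grad_h(x), lam)   # proximal subproblem
        d = y - x                              # descent direction for f
        if np.linalg.norm(d) < tol:
            break
        t = 2.0                                # try stepping beyond y
        while t > 1.0 and f(x + t * d) > f(y):
            t *= 0.5
        x = x + max(t, 1.0) * d                # never worse than y itself
    return x
```

On the smooth DC function f(x) = 0.5||x||^2 - 0.5 sqrt(1 + ||x||^2), minimized at the origin, the iterates contract toward zero.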

15:00 – 15:30 G. de Carvalho Bento

Iteration-complexity analyses of several optimization methods in the Riemannian setting have appeared in the literature in recent years. In this talk we discuss the iteration complexity of the Douglas-Rachford method (DRM) applied to minimization problems whose objective function is \(H(x):=f(x)+g(x)\), \(x\in \mathcal M\), where \(\mathcal M\) is a Hadamard manifold and \(f,g\colon\mathcal M\to\mathbb{R}\cup\{+\infty\}\) are convex functions. A convergence proof was recently presented by Bergmann, Persch, and Steidl (SIAM J. Imaging Sciences 9, 2016, pp. 901-937) for minimizing ROF-like functionals on images with values in symmetric Hadamard manifolds. We base our analysis on their work.
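In the Euclidean special case the Douglas-Rachford iteration takes the familiar prox-reflection form; a minimal sketch on a toy problem with explicit proximal maps (illustrative only; the Hadamard-manifold version is the subject of the talk):

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the proximal map of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford(b, mu, lam=1.0, iters=200):
    """Douglas-Rachford splitting for min_x 0.5*||x - b||^2 + mu*||x||_1.
    Both proximal maps are explicit, so each iteration is closed-form."""
    z = np.zeros_like(b)
    x = z
    for _ in range(iters):
        x = soft(z, lam * mu)                    # x = prox_{lam*f}(z)
        v = (2 * x - z + lam * b) / (1 + lam)    # v = prox_{lam*g}(2x - z)
        z = z + v - x                            # governing sequence update
    return x
```

For this toy problem the minimizer is simply the soft-thresholded data, which the iteration recovers.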

15:30 – 16:00 Coffee break
16:00 – 16:30 M. Silva Louzeiro

This talk analyses the subgradient method for convex optimization problems on complete Riemannian manifolds whose sectional curvature is bounded from below. Iteration-complexity bounds for the subgradient method with exogenous step-size and with Polyak's step-size are established, completing and improving recent results on the subject.
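The Euclidean template of the method with Polyak's step-size \(t_k = (f(x_k)-f^*)/\|g_k\|^2\) reads as follows (a sketch; the curvature-dependent constants of the Riemannian analysis do not appear):

```python
import numpy as np

def subgradient_polyak(f, subgrad, x0, f_star, iters=500):
    """Subgradient method with Polyak's step-size
    t_k = (f(x_k) - f*) / ||g_k||^2, returning the best iterate seen."""
    x = np.asarray(x0, dtype=float)
    best = x.copy()
    for _ in range(iters):
        g = subgrad(x)
        gg = g @ g
        if gg == 0.0:                  # 0 is a subgradient: x is optimal
            return x
        x = x - (f(x) - f_star) / gg * g
        if f(x) < f(best):
            best = x.copy()
    return best
```

Returning the best iterate matters because the subgradient method is not a descent method in general.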

16:30 – 17:00 R. Herzog

We formulate Karush-Kuhn-Tucker (KKT) conditions for equality- and inequality-constrained optimization problems on smooth manifolds. Under the Guignard constraint qualification, local minimizers are shown to admit Lagrange multipliers. We also investigate other constraint qualifications and provide results parallel to those in Euclidean space. Illustrative numerical examples will be presented.
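In the embedded Euclidean picture, the multiplier and the stationarity residual can be checked numerically; a toy sketch with the unit sphere written as an equality constraint (illustrative, not the talk's framework):

```python
import numpy as np

def kkt_residual(grad_f, grad_h, x):
    """Least-squares Lagrange multiplier and stationarity residual
    for min f(x) subject to a single equality constraint h(x) = 0."""
    gf, gh = grad_f(x), grad_h(x)
    lam = -(gf @ gh) / (gh @ gh)       # multiplier minimizing ||gf + lam*gh||
    return lam, np.linalg.norm(gf + lam * gh)

# toy problem: minimize c^T x over the unit sphere ||x||^2 = 1;
# the minimizer is x* = -c/||c|| with multiplier ||c||/2
c = np.array([3.0, 0.0, 4.0])
x_star = -c / np.linalg.norm(c)
lam, res = kkt_residual(lambda x: c, lambda x: 2.0 * x, x_star)
```

Here lam comes out as ||c||/2 and the stationarity residual vanishes up to rounding, confirming that the KKT conditions hold at the minimizer.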

17:00 – 17:15 Closing
20:00 Walk from the Hotel to the City Center
20:30 Workshop Dinner
@Restaurant Brazil