The Douglas-Rachford method is a method to solve a monotone inclusion $0 \in A(x) + B(x)$ with two maximally monotone operators $A$ and $B$ defined on a Hilbert space $H$. The method uses the resolvents $J_A = (I + A)^{-1}$ and $J_B = (I + B)^{-1}$ and produces two sequences of iterates

$$x^{k+1} = J_A(v^k),\qquad v^{k+1} = v^k + J_B(2x^{k+1} - v^k) - x^{k+1}.$$
Looks pretty opaque to me, and I did not have a good intuition for where this method comes from and why it should work. Here’s a way I can remember it (which is copied from “Preconditioned Douglas-Rachford Splitting Methods for Convex-Concave Saddle-Point Problems” by Hongpeng Sun and Kristian Bredies):
Substituting $y \in B(x)$, i.e. $x \in B^{-1}(y)$, into $0 \in A(x) + B(x)$ gives the optimality system

$$0 \in A(x) + y,\qquad 0 \in B^{-1}(y) - x,$$
or, written differently,

$$0 \in \begin{pmatrix} A & 0 \\ 0 & B^{-1} \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} + \begin{pmatrix} 0 & I \\ -I & 0 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix}.$$
This is again a monotone inclusion, but now on $H \times H$. We introduce the positive definite operator

$$M = \begin{pmatrix} I & -I \\ -I & I \end{pmatrix}$$
and perform the iteration

$$0 \in \begin{pmatrix} A & 0 \\ 0 & B^{-1} \end{pmatrix}\begin{pmatrix} x^{k+1} \\ y^{k+1} \end{pmatrix} + \begin{pmatrix} 0 & I \\ -I & 0 \end{pmatrix}\begin{pmatrix} x^{k+1} \\ y^{k+1} \end{pmatrix} + M\begin{pmatrix} x^{k+1} - x^k \\ y^{k+1} - y^k \end{pmatrix}.$$
(This is basically the same as applying the proximal point method to the preconditioned inclusion

$$0 \in M^{-1}\left[\begin{pmatrix} A & 0 \\ 0 & B^{-1} \end{pmatrix} + \begin{pmatrix} 0 & I \\ -I & 0 \end{pmatrix}\right]\begin{pmatrix} x \\ y \end{pmatrix}.)$$
Writing out the iteration gives: the first row reads $0 \in A(x^{k+1}) + x^{k+1} - x^k + y^k$ (the $y^{k+1}$ terms cancel), and the second row reads $0 \in B^{-1}(y^{k+1}) + y^{k+1} - (2x^{k+1} - x^k + y^k)$, i.e.

$$x^{k+1} = J_A(x^k - y^k),\qquad y^{k+1} = J_{B^{-1}}(2x^{k+1} - x^k + y^k).$$
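To convince yourself that the implicit step really solves to these resolvent formulas, here is a quick numerical check (my own toy setup, not from the paper): for scalar linear operators $A = aI$ and $B = bI$ with $a, b > 0$, the implicit iteration is just a $2 \times 2$ linear system.

```python
import numpy as np

# Toy check (my own example): for A(x) = a*x and B(x) = b*x with a, b > 0,
# one implicit step of the preconditioned iteration should solve to
#   x+ = J_A(x^k - y^k),   y+ = J_{B^{-1}}(2 x+ - x^k + y^k).
a, b = 2.0, 0.5          # scalar linear operators
xk, yk = 1.3, -0.4       # arbitrary current iterate

# Writing out the two rows of the implicit iteration gives the linear system
#   (a + 1) x+                 = xk - yk
#   -2 x+   + (1/b + 1) y+     = yk - xk
lhs = np.array([[a + 1.0, 0.0],
                [-2.0, 1.0 / b + 1.0]])
rhs = np.array([xk - yk, yk - xk])
x_new, y_new = np.linalg.solve(lhs, rhs)

# Resolvents for this linear case: J_A(z) = z/(1+a), J_{B^{-1}}(z) = z/(1+1/b)
x_res = (xk - yk) / (1.0 + a)
y_res = (2.0 * x_new - xk + yk) / (1.0 + 1.0 / b)

assert abs(x_new - x_res) < 1e-12 and abs(y_new - y_res) < 1e-12
print("preconditioned step matches resolvent formulas")
```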
Now, applying the Moreau identity for monotone operators ($J_{B^{-1}}(z) = z - J_B(z)$) gives

$$y^{k+1} = 2x^{k+1} - x^k + y^k - J_B(2x^{k+1} - x^k + y^k),$$
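The Moreau identity is easy to sanity-check on a concrete pair (my own example): for $B = \partial|\cdot|$ on the real line, the resolvent $J_B$ is soft-thresholding, and $J_{B^{-1}}$ is the projection onto $[-1, 1]$, since $B^{-1}$ is the subdifferential of the conjugate of $|\cdot|$.

```python
# Numerical sanity check of the Moreau identity J_{B^{-1}}(z) = z - J_B(z)
# for the maximally monotone operator B = ∂|.| on R (my own example).

def J_B(z):
    """Resolvent of B = ∂|.|: soft-thresholding with threshold 1."""
    return max(abs(z) - 1.0, 0.0) * (1.0 if z >= 0 else -1.0)

def J_Binv(z):
    """Resolvent of B^{-1} = ∂(|.|)*: projection onto [-1, 1]."""
    return min(max(z, -1.0), 1.0)

for z in [-3.0, -0.7, 0.0, 0.4, 2.5]:
    assert abs(J_Binv(z) - (z - J_B(z))) < 1e-12
print("Moreau identity verified")
```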
and substituting $v^k = x^k - y^k$ finally gives Douglas-Rachford:

$$x^{k+1} = J_A(v^k),\qquad v^{k+1} = x^{k+1} - y^{k+1} = v^k + J_B(2x^{k+1} - v^k) - x^{k+1}$$
(besides the stepsize $t$, which we would get by starting with the equivalent inclusion $0 \in tA(x) + tB(x)$ in the first place).
Probably the shortest derivation of Douglas-Rachford I have seen. Oh, and also the (weak) convergence proof comes for free: It’s a proximal point iteration and you just use the result by Rockafellar from “Monotone operators and the proximal point algorithm”, SIAM J. Control and Optimization 14(5), 1976.
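For what it’s worth, the resulting iteration is easy to try on a toy problem (my own example, not from the post): minimize $|x| + (x - 3)^2/2$, whose minimizer is $x^* = 2$, with $A = \partial|\cdot|$ and $B = \nabla g$ for $g(x) = (x-3)^2/2$.

```python
# Douglas-Rachford on a toy problem (my own example):
# minimize |x| + (x - 3)^2 / 2, whose minimizer is x* = 2.

def J_A(z):
    """Resolvent of A = ∂|.|: soft-thresholding with threshold 1."""
    return max(abs(z) - 1.0, 0.0) * (1.0 if z >= 0 else -1.0)

def J_B(z):
    """Resolvent of B = ∇g, g(x) = (x-3)^2/2: solves y + y - 3 = z."""
    return (z + 3.0) / 2.0

v = 0.0
for _ in range(200):
    x = J_A(v)
    v = v + J_B(2.0 * x - v) - x

print(x)  # converges to the minimizer 2.0
```

Here $x^k = J_A(v^k)$ is the sequence that converges to the solution, while $v^k = x^k - y^k$ converges to $x^* - y^*$ (with $y^* = B(x^*) = -1$ in this example, so $v^k \to 3$).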
August 20, 2014 at 7:01 am
Hi Dirk,
This is a great way to remember DRS. However, there is a technical detail missing: The matrix $M$ is not positive definite. Indeed, for all $x \in X$, we have $M(x, x)^T = (0, 0)^T$. Thus, convergence of DRS cannot be deduced from the paper by Rockafellar.
If $\sigma, \tau > 0$ and $\sigma\tau < 1$, then the metric

$$\begin{pmatrix} \frac{1}{\tau} I & -I \\ -I & \frac{1}{\sigma} I \end{pmatrix}$$

can be used in place of $M$. This results in the primal-dual algorithm of Chambolle and Pock (http://www.optimization-online.org/DB_FILE/2010/06/2646.pdf), and weak convergence follows by the results of Rockafellar. Thus, DRS is really a limiting case of a class of primal-dual algorithms. I believe the first paper to notice this connection is by He and Yuan: http://www.math.hkbu.edu.hk/~xmyuan/Paper/HeYuan-SIIMS-2nd.pdf.
Since the paper by He and Yuan, several other metrics have been developed. A few are summarized by Pesquet and Repetti in http://arxiv.org/pdf/1406.6404.pdf, and by myself in http://arxiv.org/pdf/1408.4419v1.pdf
Damek
August 20, 2014 at 8:35 am
Right, one needs extra work to prove convergence of DRS this way.
February 24, 2017 at 7:18 pm
[…] blogged about the Douglas-Rachford method before and in this post I’d like to dig a bit into the history of the […]