Software


I am migrating to a new laptop and, as usual, this is a bit more work than I expected…

One issue I could not resolve to my satisfaction was the migration of my RSS archives in Akregator (especially since I plan to switch to a different feed reader). Hence, I just collect in this post the posts and articles I planned to read or wanted to keep for some reason:

From nuit-blanche we have:

From the blog of Clément Mouhot I have:

And there are some articles:

And some preprints from the arXiv:

And finally there is a review of a book I planned to buy: A Mathematician Comes of Age by Steven G. Krantz.

Moreover, there is a list of posts which I marked with “Important”:

 

In my previous post I announced the draft of the paper “Infeasible-Point Subgradient Algorithm and Computational Solver Comparison for l1-Minimization”, which is now available as a preprint at Optimization Online.

1. Fixed bugs; different results

Basically, not much has changed from the draft to the preprint; however, we had to fix some bugs in our computational comparison of solvers, and this changed the results. For example, {\ell^1}-magic now performs a little better, especially when combined with the heuristic support evaluation (HSE) we propose in the paper. Most notably, though, {\ell^1}-Homotopy is not the winner anymore. This is due to a conceptual error in our setup. Recall that {\ell^1}-Homotopy solves the Basis Pursuit denoising problem

\displaystyle  \min_x \frac12\|Ax-b\|_2^2 + \lambda\|x\|_1

starting with {\lambda = \|A^Tb\|_\infty} (which results in {x=0}) and decreasing {\lambda} while tracking the (piecewise linear) solution path. Provably, this reaches the Basis Pursuit solution for {\lambda=0} after crossing a finite number of breakpoints in the solution path. However, in our first experiments we used a final parameter of {\lambda = 10^{-9}}, and that was against our rules: we only considered solvers which (in theory) calculate the exact Basis Pursuit solution. Now we have rerun the calculations with {\lambda=0} and, surprisingly, the results were worse in terms of reconstruction accuracy (and, of course, also in terms of speed). We did not find out precisely which part of the solver is responsible for this effect, but it probably has something to do with the accuracy of the inverse of the submatrix of {A^TA} which is maintained throughout the iterations.
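(As a side remark on this starting value, by a standard optimality argument and not something specific to the paper: {x=0} solves the denoising problem if and only if {0} lies in the subdifferential of the objective at {0}, i.e.

\displaystyle  0 \in A^T(A\cdot 0 - b) + \lambda\,\partial\|\cdot\|_1(0) = -A^Tb + \lambda[-1,1]^n,

which holds precisely if {\|A^Tb\|_\infty \leq \lambda}.)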

Another surprise was that the runs with {\lambda=10^{-9}} always ended with approximately the same solution accuracy (about {10^{-8}}) on all test instances (no matter what size, matrix type or number of nonzeros we used). That is a surprise because there is no formula which tells you in advance how accurate the Basis Pursuit denoising solution for a particular {\lambda} will be (compared to the Basis Pursuit solution). Maybe an explanation lies in the common features all our test instances share: all matrix columns are normalized to unit Euclidean norm, and all non-zero entries in the solutions follow the same distribution.

If you want to have a closer look at our results, you can find all the data (i.e. all the running times and solution accuracies for all solvers and all instances) on our SPEAR project website.

By the way: the overall winner is now CPLEX (using the dual simplex method)! So, please stop spreading the message that standard LP solvers are not good for Basis Pursuit…
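If you want to try this with an LP solver yourself: Basis Pursuit turns into a linear program via the standard splitting {x = u - v} with {u,v \geq 0}. Here is a minimal Matlab sketch (using linprog as a stand-in; our experiments used CPLEX, and the variable names are just for illustration):

% Basis Pursuit  min ||x||_1 s.t. Ax = b  as a linear program:
% split x = u - v with u, v >= 0 and minimize sum(u) + sum(v).
[m, n] = size(A);
f   = ones(2*n, 1);               % objective: ||x||_1 = sum(u + v) at optimum
Aeq = [A, -A];                    % equality constraint A*(u - v) = b
lb  = zeros(2*n, 1);              % u, v >= 0
uv  = linprog(f, [], [], Aeq, b, lb, []);
x   = uv(1:n) - uv(n+1:end);      % recover x from the split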

2. Testset online!

With the submission of the paper, we also made our testset publicly available. You can download all our test instances from the website of our SPEAR project, both as Matlab .mat files and as ASCII data (if you would like to use another language). Remember: each instance comes with a matrix {A}, a vector {b} and a vector {x} which is guaranteed to be the unique solution of {Ax=b} with minimal one-norm. Moreover, there are instances for which the support of the solution is so large that the minimal-one-norm solution is not necessarily the sparsest solution anymore, which is an interesting borderline case for most Basis Pursuit solvers.
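For the .mat files, checking an instance takes only a few lines of Matlab (the file name below is made up; the variables {A}, {b} and {x} are as described above):

load('instance.mat');                            % hypothetical file name; yields A, b, x
fprintf('||Ax - b||_2 = %g\n', norm(A*x - b));   % feasibility check
fprintf('||x||_1      = %g\n', norm(x, 1));      % objective value
fprintf('nnz(x)       = %d\n', nnz(x));          % sparsity of the solution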

3. ISAL1 online

Also, the Matlab code of ISAL1 (infeasible-point subgradient algorithm for {\ell^1} minimization) is online at the website of our SPEAR project. Check it out if you like.
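To convey the flavor of subgradient methods for this problem, here is a plain projected subgradient sketch; this is not the actual ISAL1 code (ISAL1's point is precisely that the projection onto {\{x : Ax=b\}} only needs to be computed approximately, so iterates may be infeasible — see the paper for the real method):

% Plain subgradient iteration for  min ||x||_1 s.t. Ax = b,
% with *exact* projection onto the affine set {x : Ax = b}.
P = @(z) z - A' * ((A*A') \ (A*z - b));   % orthogonal projection onto Ax = b
x = P(zeros(size(A, 2), 1));              % feasible start (min-norm solution)
for k = 1:500
    g = sign(x);                          % a subgradient of ||.||_1 at x
    x = P(x - (1/k) * g);                 % diminishing step size, project back
end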

L1TestPack has just been updated to version 1.1. With the help of Andreas Tillmann, I have enhanced this small gadget for issues related to {\ell^1} minimization. The new functions are:

  • Routines to directly calculate a source element for a given matrix {A} and a vector {x^\dagger}, that is, calculate a vector {y} such that

    \displaystyle  A^* y \in\partial\|x^\dagger\|_1.

    The existence of such a vector {y} ensures that the minimization problem (the Basis Pursuit problem)

    \displaystyle  \min_x \|x\|_1\ \text{ s.t. }\ Ax = Ax^\dagger

    has the unique solution {x^\dagger} (in other words: {x^\dagger} is recovered exactly). This is particularly helpful if you are interested in unique solutions for Basis Pursuit without posing strong conditions which even imply {\ell^0}-{\ell^1}-equivalence. A minimal numerical sketch of this condition follows after this list.

  • Routines related to RIP constants, the ERC coefficient of Joel Tropp and the mutual coherence.
  • An implementation of the heuristic support evaluation (HSE), also described in my previous post. (By the way: we were tempted to call this device “support evaluation routine” with the acronym SuppER, but abandoned this idea.)
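As a rough illustration of the source-element condition (a minimal sketch and not the actual L1TestPack routine: it simply takes the minimum-norm solution of the on-support equations and checks the off-support part):

% Given A and xd (= x†), look for y with A'*y in the subdifferential of
% the one-norm at xd: (A'*y)_i = sign(xd_i) on the support of xd and
% |(A'*y)_i| <= 1 off the support.
S   = find(xd ~= 0);                    % support of x†
y   = pinv(A(:, S)') * sign(xd(S));     % min-norm solution of A_S'*y = sign(xd_S)
off = setdiff(1:size(A, 2), S);         % off-support indices
fprintf('max off-support |A''*y| = %g (should be < 1)\n', ...
        max(abs(A(:, off)' * y)));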
