This is the fourth post in our series on Krylov subspaces. The previous ones (i.e. Arnoldi Iterations and the Lanczos Algorithm) were mostly focused on eigenvalue and eigenvector computations. In this post we will have a look at solution strategies for linear systems of equations.
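As a small preview of what such a solver looks like in use, here is a minimal sketch with SciPy's GMRES, one Krylov method among several; the tridiagonal test matrix is an illustrative choice of mine, not taken from the post:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres

n = 100
# Simple sparse test problem: a tridiagonal (1D Laplacian-like) matrix.
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# GMRES builds its iterates from a Krylov subspace generated by A and b.
x, info = gmres(A, b)               # info == 0 signals convergence
print(np.linalg.norm(b - A @ x))    # residual norm
```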
This is the third post in my series on Krylov subspaces. The first post is here and the second one is here.
The Lanczos Algorithm
In this post we cover the Lanczos algorithm, which gives you eigenvalues and eigenvectors of symmetric matrices.
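For readers who want something concrete right away, here is a minimal NumPy sketch of the plain Lanczos iteration, without re-orthogonalization or breakdown handling; the function name `lanczos` and the random test matrix are illustrative assumptions, not code from the post:

```python
import numpy as np

def lanczos(A, v0, k):
    """Plain Lanczos: orthonormal basis V of the Krylov subspace
    K_k(A, v0) and a symmetric tridiagonal T = V^T A V."""
    n = A.shape[0]
    V = np.zeros((n, k))
    alpha = np.zeros(k)
    beta = np.zeros(k - 1)
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(k):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        w = w - alpha[j] * V[:, j]
        if j > 0:
            w = w - beta[j - 1] * V[:, j - 1]
        if j < k - 1:
            beta[j] = np.linalg.norm(w)   # breakdown if this is 0
            V[:, j + 1] = w / beta[j]
    return V, np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)

rng = np.random.default_rng(0)
M = rng.standard_normal((200, 200))
A = M + M.T                                # symmetric test matrix
V, T = lanczos(A, rng.standard_normal(200), 30)
print(np.linalg.eigvalsh(T)[-3:])          # ~ np.linalg.eigvalsh(A)[-3:]
```

The eigenvalues of the small tridiagonal matrix T approximate the extremal eigenvalues of A, which is the core idea the algorithm exploits.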
This is the second post in my series on Krylov subspaces. The first post is here and the third one is here.
Arnoldi Iterations
The Arnoldi iteration is an algorithm to find eigenvalues and eigenvectors of general matrices.
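As with the Lanczos post, a minimal NumPy sketch may help; this shows the plain Arnoldi iteration with modified Gram-Schmidt orthogonalization, and the test setup is again an illustrative assumption of mine:

```python
import numpy as np

def arnoldi(A, v0, k):
    """Plain Arnoldi: orthonormal basis Q of K_k(A, v0) and an
    upper Hessenberg H with A @ Q[:, :k] = Q @ H."""
    n = A.shape[0]
    Q = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    Q[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(k):
        w = A @ Q[:, j]
        for i in range(j + 1):              # modified Gram-Schmidt step
            H[i, j] = Q[:, i] @ w
            w = w - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)     # breakdown if this is 0
        Q[:, j + 1] = w / H[j + 1, j]
    return Q, H

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 200))
Q, H = arnoldi(A, rng.standard_normal(200), 40)
# Ritz values: eigenvalues of the leading k x k block of H
# approximate (extremal) eigenvalues of A.
ritz = np.linalg.eigvals(H[:40, :40])
print(ritz[np.argsort(-np.abs(ritz))][:3])
```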
This is the first post in a (planned) series on Krylov subspaces, projection processes, and related algorithms. We already discussed projection processes when talking about the Bregman algorithm, and we will see that Krylov (sub-)spaces are generated by a set of vectors that are not necessarily orthogonal.
In the previous post in our series on the Bregman algorithm we discussed how to solve convex optimization problems. In this post we want to point to some variations and extensions of the Bregman algorithm.
In a previous post we discussed how to solve constrained optimization problems by using the Bregman algorithm. Here we want to extend the approach to unconstrained problems.
Let’s start simple. Assume we want to minimize a convex and smooth function $f\colon\mathbb{R}^{n}\to\mathbb{R}$.
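To make this setup concrete, here is a minimal sketch of the simplest possible approach, plain gradient descent on an illustrative quadratic; the step size and test problem are assumptions of mine, not from the post:

```python
import numpy as np

# Minimize the convex, smooth f(x) = 0.5 * x^T A x - b^T x
# with A positive definite, so grad f(x) = A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
step = 0.3                        # must stay below 2 / lambda_max(A)
for _ in range(200):
    x -= step * grad(x)
print(x, np.linalg.solve(A, b))   # both should agree
```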
In a previous post we discussed how to find a common point in a family of convex sets by using the Bregman algorithm. Actually the algorithm is capable of more. We can use it to solve constrained optimization problems.
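To make the common-point problem concrete, here is a hedged sketch of the Euclidean special case: cyclic projections onto two convex sets. With the Euclidean distance, Bregman's projection scheme reduces to this classical alternating-projection method; the sets and the starting point are illustrative choices, not from the post:

```python
import numpy as np

def proj_ball(x):        # projection onto the unit ball {||x|| <= 1}
    nx = np.linalg.norm(x)
    return x if nx <= 1.0 else x / nx

def proj_halfspace(x):   # projection onto {x : x[0] + x[1] >= 1}
    s = x[0] + x[1]
    return x if s >= 1.0 else x - (s - 1.0) / 2.0 * np.ones(2)

x = np.array([3.0, -2.0])
for _ in range(100):     # cycle through the two sets
    x = proj_halfspace(proj_ball(x))
print(x)                 # approaches a point in the intersection
```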
In the 1960s Lev Meerovich Bregman developed an optimization algorithm [1] which became rather popular at the beginning of the 2000s. It’s not my intention to present the proofs for all the algorithmic finesse, but rather the general ideas of why it is so appealing.