Mathematics

Conjugate Gradients

This is the fourth post in our series on Krylov subspaces. The previous ones (i.e., Arnoldi Iterations and the Lanczos Algorithm) were mostly focused on eigenvalue and eigenvector computations. In this post we take a look at strategies for solving linear systems of equations.
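As a quick orientation (this is the textbook conjugate gradient recurrence for a symmetric positive definite matrix $A$ and the system $Ax=b$; the post itself may derive it differently): starting from $r_0 = p_0 = b - Ax_0$, one iterates

$$
\begin{aligned}
\alpha_k &= \frac{r_k^\top r_k}{p_k^\top A p_k}, \qquad x_{k+1} = x_k + \alpha_k p_k, \qquad r_{k+1} = r_k - \alpha_k A p_k, \\
\beta_k &= \frac{r_{k+1}^\top r_{k+1}}{r_k^\top r_k}, \qquad p_{k+1} = r_{k+1} + \beta_k p_k.
\end{aligned}
$$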

Numerical Linear Algebra and Optimization

A collection of notes and investigations on topics related to numerical linear algebra and optimization.

Lanczos Algorithm

This is the third post in my series on Krylov subspaces. The first post is here and the second one is here. In this post we cover the Lanczos algorithm, which gives you the eigenvalues and eigenvectors of symmetric matrices.
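For reference (the standard three-term recurrence; the post may use different notation): given a symmetric matrix $A$ and a unit-norm starting vector $q_1$, the Lanczos algorithm builds orthonormal vectors via

$$
\alpha_j = q_j^\top A q_j, \qquad \beta_j q_{j+1} = A q_j - \alpha_j q_j - \beta_{j-1} q_{j-1},
$$

where $\beta_j$ normalizes $q_{j+1}$ (with $\beta_0 q_0 := 0$). The eigenvalues of the resulting tridiagonal matrix with diagonal entries $\alpha_j$ and off-diagonal entries $\beta_j$ approximate those of $A$.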

Arnoldi Iterations

This is the second post in my series on Krylov subspaces. The first post is here and the third one is here. The Arnoldi iteration is an algorithm for finding eigenvalues and eigenvectors of general matrices.
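In a nutshell (the standard formulation, stated here only for orientation): with orthonormal vectors $q_1, q_2, \dots$, one Arnoldi step computes

$$
h_{ij} = q_i^{*} A q_j \ (i \le j), \qquad w_j = A q_j - \sum_{i=1}^{j} h_{ij} q_i, \qquad h_{j+1,j} = \lVert w_j \rVert_2, \qquad q_{j+1} = \frac{w_j}{h_{j+1,j}},
$$

which yields $A Q_j = Q_j H_j + h_{j+1,j} q_{j+1} e_j^\top$ with an upper Hessenberg matrix $H_j$ whose eigenvalues approximate those of $A$.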

Krylov Subspaces

This is the first post in a (planned) series on Krylov subspaces, projection processes, and related algorithms. We already discussed projection processes when talking about the Bregman algorithm, and we will see that the Krylov (sub-)spaces are generated by a set of vectors that are not necessarily orthogonal.
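For completeness, the standard definition: given a matrix $A$ and a vector $v$, the $k$-th Krylov subspace is

$$
\mathcal{K}_k(A, v) = \operatorname{span}\{v, Av, A^2 v, \dots, A^{k-1} v\},
$$

and the generating vectors $A^j v$ are in general neither orthogonal nor normalized.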

Designing Neural Networks in Mathematica

Implementing LeNet for MNIST in Wolfram Mathematica

Variations of the Bregman Algorithm (4/4)

In the previous post in our series on the Bregman algorithm we discussed how to solve convex optimization problems. In this post we point to some variations and extensions of the Bregman algorithm.

The Bregman Algorithm (3/4)

In a previous post we discussed how to solve constrained optimization problems by using the Bregman algorithm. Here we want to extend the approach to unconstrained problems. Let’s start simple. Assume we want to minimize a convex and smooth function $f\colon\mathbb{R}^{n}\to\mathbb{R}$.
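As background (the standard definition; the notation may differ from the post): the central object is the Bregman divergence of a convex function $f$, which for a subgradient $p \in \partial f(y)$ reads

$$
D_f^{p}(x, y) = f(x) - f(y) - \langle p, x - y \rangle \ge 0.
$$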

The Bregman Algorithm (2/4)

In a previous post we discussed how to find a common point in a family of convex sets by using the Bregman algorithm. Actually, the algorithm is capable of more: we can use it to solve constrained optimization problems.
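A typical instance of such a constrained problem (the concrete setting in the post may differ) is

$$
\min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad Ax = b,
$$

with a convex objective $f$; roughly, the idea is to replace Euclidean projections onto the constraint set by projections measured in the Bregman divergence.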

The Bregman Algorithm (1/4)

In the 1960s, Lev Meerovich Bregman developed an optimization algorithm [1] that became rather popular in the early 2000s. It’s not my intention to present proofs of all the algorithmic finesse, but rather the general ideas behind why it is so appealing.