Algorithm

Lanczos Algorithm

This is the third post in my series on Krylov subspaces. The first post is here and the second one is here. In this post we cover the Lanczos algorithm, which computes eigenvalues and eigenvectors of symmetric matrices.
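As a teaser, here is a minimal sketch of the Lanczos iteration in Julia (the function name `lanczos` and the random test matrix are my own illustration, not taken from the post): it builds an orthonormal Krylov basis $Q$ and a symmetric tridiagonal matrix $T$ whose eigenvalues, the Ritz values, approximate eigenvalues of $A$.

```julia
using LinearAlgebra

# Minimal Lanczos sketch: for symmetric A, build an orthonormal Krylov
# basis Q and a symmetric tridiagonal T with Q' * A * Q ≈ T.
function lanczos(A, q1, m)
    n = length(q1)
    Q = zeros(n, m)
    α = zeros(m)        # diagonal of T
    β = zeros(m - 1)    # off-diagonal of T
    Q[:, 1] = q1 / norm(q1)
    for k in 1:m
        w = A * Q[:, k]                 # expand the Krylov subspace
        α[k] = dot(Q[:, k], w)
        w -= α[k] * Q[:, k]             # orthogonalise against q_k ...
        if k > 1
            w -= β[k-1] * Q[:, k-1]     # ... and against q_{k-1}
        end
        if k < m
            β[k] = norm(w)
            Q[:, k+1] = w / β[k]        # next basis vector (assumes no breakdown)
        end
    end
    return Q, SymTridiagonal(α, β)
end

A = Symmetric(randn(100, 100))
Q, T = lanczos(A, randn(100), 20)
ritz = eigvals(T)                       # approximations to eigenvalues of A
```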

Arnoldi Iterations

This is the second post in my series on Krylov subspaces. The first post is here and the third one is here. The Arnoldi iteration is an algorithm to find eigenvalues and eigenvectors of general matrices.
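A minimal sketch of the Arnoldi iteration in Julia, assuming the standard textbook formulation (the name `arnoldi` and the test matrix are mine): it produces an orthonormal Krylov basis $Q$ and an upper Hessenberg matrix $H$, whose leading $m \times m$ block yields the Ritz values.

```julia
using LinearAlgebra

# Minimal Arnoldi sketch: build an orthonormal Krylov basis Q and an
# upper Hessenberg H with A * Q[:, 1:m] = Q * H (in exact arithmetic).
function arnoldi(A, q1, m)
    n = length(q1)
    Q = zeros(n, m + 1)
    H = zeros(m + 1, m)
    Q[:, 1] = q1 / norm(q1)
    for k in 1:m
        w = A * Q[:, k]
        for j in 1:k                    # orthogonalise against all previous vectors
            H[j, k] = dot(Q[:, j], w)
            w -= H[j, k] * Q[:, j]
        end
        H[k+1, k] = norm(w)
        Q[:, k+1] = w / H[k+1, k]       # assumes no breakdown
    end
    return Q, H
end

A = randn(200, 200)
Q, H = arnoldi(A, randn(200), 30)
ritz = eigvals(H[1:30, 1:30])           # Ritz values approximate eigenvalues of A
```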

Designing Neural Networks in Mathematica

Implementing LeNet for MNIST in Wolfram Mathematica

Variations of the Bregman Algorithm (4/4)

In the previous post in our series on the Bregman algorithm we discussed how to solve convex optimization problems. In this post we want to give an overview of some variations and extensions of the Bregman algorithm.

The Bregman Algorithm (3/4)

In a previous post we discussed how to solve constrained optimization problems by using the Bregman algorithm. Here we want to extend the approach to unconstrained problems. Let’s start simple. Assume we want to minimize a convex and smooth function $f\colon\mathbb{R}^{n}\to\mathbb{R}$.
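For reference, and assuming the series uses the standard definition, the central object in all of these posts is the Bregman distance of a convex, differentiable function $f$:

$$
D_f(x, y) = f(x) - f(y) - \langle \nabla f(y),\, x - y \rangle,
$$

i.e. the gap between $f(x)$ and the linearisation of $f$ around $y$, which is non-negative by convexity.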

The Bregman Algorithm (2/4)

In a previous post we discussed how to find a common point in a family of convex sets by using the Bregman algorithm. Actually, the algorithm is capable of more. We can use it to solve constrained optimization problems.

The Bregman Algorithm (1/4)

In the 1960s Lev Meerovich Bregman developed an optimization algorithm [1] which became rather popular at the beginning of the 2000s. It’s not my intention to present the proofs for all the algorithmic finesse, but rather the general ideas behind why it is so appealing.

QR Decompositions and Rank Deficient Matrices

We discuss the necessary changes to our QR decomposition algorithms to handle matrices which do not have full rank.

QR Comparison with other Implementations

We developed a QR decomposition algorithm based on the Gram-Schmidt orthogonalisation process in a series of posts here, here, here, and here. Let’s have a look at how well this algorithm performs against the built-in implementations of Julia and other programming languages.
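The kind of accuracy check used for such a comparison can be sketched as follows (this is my own illustrative snippet, not the benchmark from the post; Julia's built-in `qr` from `LinearAlgebra` is a Householder-based, LAPACK-backed factorisation):

```julia
using LinearAlgebra

A = randn(500, 300)
F = qr(A)                          # built-in Householder QR
Q, R = Matrix(F.Q), F.R            # thin Q (500×300) and square R (300×300)

# Typical accuracy measures: reconstruction error and loss of orthogonality
@show norm(Q * R - A) / norm(A)
@show norm(Q' * Q - I)
```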

QR Decompositions with Reorthogonalisation

We already discussed QR decompositions and showed that using the modified formulation of Gram-Schmidt significantly improves the accuracy of the results. However, there is still an error of about $10^3 M_\varepsilon$ (where $M_\varepsilon$ is the machine epsilon) when using modified Gram-Schmidt as the base algorithm for the orthogonalisation.
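A minimal sketch of what reorthogonalisation looks like on top of modified Gram-Schmidt, following the common “twice is enough” heuristic (the function name `mgs_reorth` and the test sizes are mine; the post's actual implementation may differ):

```julia
using LinearAlgebra

# Modified Gram-Schmidt with one reorthogonalisation pass: each new
# column is orthogonalised against the previous columns twice, and the
# projection coefficients are accumulated into R.
function mgs_reorth(A::AbstractMatrix)
    m, n = size(A)
    Q = float.(A)                        # work on a floating-point copy of A
    R = zeros(n, n)
    for k in 1:n
        for _ in 1:2                     # second pass = reorthogonalisation
            for j in 1:k-1
                r = dot(Q[:, j], Q[:, k])
                R[j, k] += r
                Q[:, k] -= r * Q[:, j]
            end
        end
        R[k, k] = norm(Q[:, k])
        Q[:, k] /= R[k, k]
    end
    return Q, R
end

A = randn(400, 200)
Q, R = mgs_reorth(A)
@show norm(Q' * Q - I)                   # loss of orthogonality
@show norm(Q * R - A) / norm(A)          # reconstruction error
```

Running each projection loop a second time typically brings $\lVert Q^{T}Q - I\rVert$ back down to the order of $M_\varepsilon$.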