# QR decomposition

The QR decomposition or QR factorization is a concept from linear algebra and numerical analysis. It refers to the decomposition of a matrix A into the product

    A = Q R

of two matrices, where Q is an orthogonal matrix (Q^T Q = I) or, in the complex case, a unitary matrix (Q^H Q = I) and R is an upper triangular matrix.

Such a decomposition always exists and can be calculated using different algorithms. The best known of these are

- Householder transformations
- Givens rotations
- Gram-Schmidt orthogonalization.

The latter is commonly taught in linear algebra but is numerically unstable. The method can, however, be extended and stabilized (modified Gram-Schmidt).
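As a sketch of the stabilized variant mentioned above (assuming NumPy; the function name is illustrative), modified Gram-Schmidt can be written as:

```python
import numpy as np

def modified_gram_schmidt(A):
    """Reduced QR factorization A = Q @ R via modified Gram-Schmidt.

    A is m x n with m >= n and full column rank; Q gets orthonormal
    columns, R is upper triangular.
    """
    A = np.array(A, dtype=float)
    m, n = A.shape
    Q = A.copy()
    R = np.zeros((n, n))
    for k in range(n):
        R[k, k] = np.linalg.norm(Q[:, k])
        Q[:, k] /= R[k, k]
        # Orthogonalize all remaining columns against q_k immediately;
        # this reordering is what stabilizes the classical method.
        for j in range(k + 1, n):
            R[k, j] = Q[:, k] @ Q[:, j]
            Q[:, j] -= R[k, j] * Q[:, k]
    return Q, R
```

The difference from the classical method is only the order of operations: each new q_k is subtracted from the remaining columns right away, instead of projecting each column against all previous ones at once, which reduces the loss of orthogonality in floating-point arithmetic.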

## Definition

A matrix A with m rows, n columns and m ≥ n has an (almost, see below) unique reduced QR decomposition

    A = Q1 R1

as the product of an m×n matrix Q1 with orthonormal columns and an n×n upper triangular matrix R1.

This decomposition can be extended to a complete QR decomposition

    A = Q R

by extending Q1 with additional orthonormal columns to a square m×m orthogonal matrix Q, and appending rows of zeros to R1, so that an m×n matrix

    R = ( R1 )
        (  0 )

is formed.

The QR decomposition is unique if A has full rank and the signs of the diagonal elements of R1 are fixed; usually they are all chosen positive.
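The reduced and complete decompositions, and the sign convention for uniqueness, can be illustrated with NumPy's `numpy.linalg.qr` (the example matrix is arbitrary):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])   # m = 3, n = 2

# Reduced decomposition: Q is 3x2 with orthonormal columns, R is 2x2.
Q, R = np.linalg.qr(A, mode="reduced")

# Complete decomposition: Qc is 3x3 orthogonal, Rc is 3x2 with a zero row.
Qc, Rc = np.linalg.qr(A, mode="complete")

# Enforce the usual uniqueness convention of a positive diagonal of R
# (A has full rank here, so no diagonal entry is zero).
signs = np.sign(np.diag(R))
Q, R = Q * signs, signs[:, None] * R
```

Multiplying the columns of Q and the rows of R by the same signs leaves the product Q @ R unchanged, so the result is still a valid QR decomposition of A.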

## Application

The QR decomposition plays an important role in many methods of numerical mathematics, for example in determining an orthogonal or unitary basis or in treating linear least squares problems. It is an integral part of the QR algorithm for computing all the eigenvalues of a matrix.
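The idea behind the QR algorithm for eigenvalues can be sketched in a few lines (a bare, unshifted iteration; practical implementations add a Hessenberg reduction and shifts, and the function name here is illustrative):

```python
import numpy as np

def qr_eigenvalues(A, iterations=500):
    """Unshifted QR iteration: factor A_k = Q_k R_k, set A_{k+1} = R_k Q_k.

    Each step is a similarity transform (R Q = Q^T A Q), so the
    eigenvalues are preserved; for symmetric matrices with distinct
    eigenvalue magnitudes, A_k tends to a diagonal matrix whose
    entries are the eigenvalues.
    """
    Ak = np.array(A, dtype=float)
    for _ in range(iterations):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q
    return np.sort(np.diag(Ak))
```

Because A_{k+1} = Q_k^T A_k Q_k, every iterate has the same spectrum as A, which is why the diagonal of the limit reveals the eigenvalues.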

### Solution of regular or overdetermined systems of equations

To solve a linear system of equations Ax = b with an m×n matrix A of full rank, m ≥ n, one computes a QR decomposition of A and solves the triangular system

    R1 x = Q1^T b

by back substitution. For m = n this is an alternative to the LU decomposition; it requires about twice as many operations as the LU decomposition, but may be numerically more stable. If there are more equations than variables (m > n), the system generally has no exact solution, and x is instead the solution of the least squares problem min ||Ax - b|| (see also regression analysis).

Writing A^+ for the Moore-Penrose pseudoinverse of A, the computed least-squares solution x satisfies the relationship

    x = A^+ b,

which generalizes the usual representation x = A^(-1) b of the regular case.
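The least-squares computation can be carried out directly with a reduced QR decomposition (the data below is hypothetical, chosen only to give an overdetermined system):

```python
import numpy as np

# Overdetermined system: 4 equations, 2 unknowns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.9, 5.1, 7.0])

# Reduced QR decomposition, then solve the triangular system R x = Q^T b.
Q, R = np.linalg.qr(A)
x = np.linalg.solve(R, Q.T @ b)   # R is small, square and triangular
```

The vector x computed this way minimizes the Euclidean norm of the residual A x - b, i.e. it agrees with the least-squares solution x = A^+ b.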

### Solution of underdetermined systems of equations

For m < n the matrix A has a nontrivial kernel. If A has full rank, the solutions of the system of equations Ax = b therefore form an affine subspace. The unique solution of smallest norm lies in the orthogonal complement of the kernel, and one obtains it with the help of a QR decomposition of the transpose, A^T = Q R:

    x = Q (R^T)^(-1) b

Again writing A^+ for the Moore-Penrose pseudoinverse of A, the computed solution of smallest norm satisfies the relationship x = A^+ b.
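Under the assumption that A has full row rank, the minimum-norm solution via a QR decomposition of A^T can be sketched as follows (the example system is hypothetical):

```python
import numpy as np

# Underdetermined system: 2 equations, 3 unknowns, full row rank.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([3.0, 2.0])

# Decompose the transpose: A^T = Q R, so A = R^T Q^T.
Q, R = np.linalg.qr(A.T)

# Solve R^T y = b by forward substitution, then set x = Q y.
y = np.linalg.solve(R.T, b)
x = Q @ y   # minimum-norm solution, equal to A^+ b
```

Since x = Q y lies in the column space of Q, which spans the row space of A, it is orthogonal to the kernel of A; among all solutions of Ax = b this is exactly the one of smallest norm.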