Orthogonal matrix

In linear algebra, an orthogonal matrix is a square real matrix whose row and column vectors are pairwise orthonormal with respect to the standard scalar product. Consequently, the inverse of an orthogonal matrix is also its transpose. The set of orthogonal matrices of fixed size, with matrix multiplication as the operation, forms the orthogonal group. Orthogonal matrices represent congruence transformations of Euclidean space, that is, rotations, reflections, and combinations thereof, and are used, for example, in the numerical solution of systems of linear equations or eigenvalue problems. The analogous concept for complex matrices is the unitary matrix.

  • 1 Definition
  • 2 Examples
  • 2.1 Concrete examples
  • 2.2 General examples
  • 3 Properties
  • 3.1 Inverse
  • 3.2 Length and angle preservation
  • 3.3 Determinant
  • 3.4 Eigenvalues
  • 3.5 Diagonalizability
  • 3.6 Norms
  • 4 Orthogonal matrices as a group
  • 5 Use
  • 5.1 Systems of linear equations
  • 5.2 Matrix decompositions

Definition

A real square matrix Q \in \mathbb{R}^{n \times n} is called orthogonal if the product with its transpose Q^T yields the identity matrix I, that is, if

Q^T Q = I

holds. If the row vectors of the matrix Q are denoted by q_1, \ldots, q_n, then this condition is equivalent to requiring that the standard scalar product of any two row vectors satisfies

\langle q_i, q_j \rangle = \delta_{ij},

where \delta_{ij} is the Kronecker delta. The row vectors of an orthogonal matrix thus form an orthonormal basis of the coordinate space \mathbb{R}^n. The same applies to the column vectors of an orthogonal matrix, because with Q its transpose Q^T is also orthogonal, that is,

Q Q^T = I.

Even though the name "orthogonal matrix" might suggest it, it is not sufficient for the row or column vectors to be merely pairwise orthogonal; they must in addition be normalized, i.e. have length one.
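
The defining condition is easy to check numerically. The following minimal sketch, which assumes NumPy is available and uses the QR factorization only as a convenient way to produce some orthogonal matrix, verifies both Q^T Q = I and Q Q^T = I:

    import numpy as np

    rng = np.random.default_rng(0)
    # Produce a generic 4x4 orthogonal matrix as the Q factor of a random matrix.
    Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

    # Definition: Q is orthogonal iff Q^T Q = I (and then Q Q^T = I as well).
    print(np.allclose(Q.T @ Q, np.eye(4)))  # True
    print(np.allclose(Q @ Q.T, np.eye(4)))  # True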

Examples

Concrete examples

  • The matrix
    Q = \frac{1}{\sqrt{2}} \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}
    is orthogonal, because its row and column vectors are pairwise orthonormal; it describes a rotation of the plane by 45°.
  • The matrix
    Q = \frac{1}{3} \begin{pmatrix} 2 & -2 & 1 \\ 1 & 2 & 2 \\ 2 & 1 & -2 \end{pmatrix}
    is also orthogonal, as can be verified by computing Q^T Q = I.

General examples

  • Permutation matrices, i.e. matrices in which exactly one entry per row and column is equal to one and all other entries are equal to zero, are orthogonal. If P_\sigma denotes the permutation matrix associated with the permutation \sigma, then
    P_\sigma^T = P_\sigma^{-1} = P_{\sigma^{-1}},
    since the transpose of a permutation matrix is the matrix of the inverse permutation.
  • Rotation matrices, i.e. matrices that describe a rotation about the origin in the Euclidean plane, are orthogonal. If
    R_\alpha = \begin{pmatrix} \cos\alpha & -\sin\alpha \\ \sin\alpha & \cos\alpha \end{pmatrix}
    denotes the rotation matrix for the angle \alpha, then R_\alpha^T R_\alpha = I follows from the identity \cos^2\alpha + \sin^2\alpha = 1.
  • Reflection matrices, i.e. matrices that describe an (orthogonal) reflection in a line through the origin in the Euclidean plane, are orthogonal. If
    S_\alpha = \begin{pmatrix} \cos\alpha & \sin\alpha \\ \sin\alpha & -\cos\alpha \end{pmatrix}
    denotes the matrix of the reflection in the line that encloses the angle \alpha/2 with the x-axis, then S_\alpha^T S_\alpha = I holds as well; all three families are verified numerically in the sketch below.
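
As an illustration of these three families, the following sketch (assuming NumPy; the permutation and the angle are arbitrary choices) builds one instance of each and checks the defining condition:

    import numpy as np

    a = 0.7  # arbitrary angle

    # A 3x3 permutation matrix (rows of the identity in a different order).
    P = np.eye(3)[[2, 0, 1]]
    # Rotation matrix for the angle a.
    R = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])
    # Reflection matrix for the line at angle a/2 to the x-axis.
    S = np.array([[np.cos(a),  np.sin(a)],
                  [np.sin(a), -np.cos(a)]])

    for M in (P, R, S):
        print(np.allclose(M.T @ M, np.eye(M.shape[0])))  # True for all three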

Properties

Inverse

An orthogonal matrix Q is always invertible, and its inverse equals its transpose, that is,

Q^{-1} = Q^T.

The inverse of a matrix A is in fact precisely the matrix A^{-1} for which

A A^{-1} = A^{-1} A = I

holds, and for an orthogonal matrix both equations are satisfied by Q^T. Furthermore, it follows from the second equation, Q Q^T = I, that the transpose of an orthogonal matrix is itself orthogonal. The converse also holds: every matrix whose transpose equals its inverse is orthogonal, because then

Q^T Q = Q^{-1} Q = I.
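
This identity can be observed numerically; a minimal sketch, assuming NumPy and using a rotation matrix as the example:

    import numpy as np

    a = 0.7  # arbitrary angle
    Q = np.array([[np.cos(a), -np.sin(a)],
                  [np.sin(a),  np.cos(a)]])

    # The explicitly computed inverse coincides with the transpose.
    print(np.allclose(np.linalg.inv(Q), Q.T))  # True
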
Length and angle preservation

If an orthogonal matrix Q is multiplied with a vector x, the length (Euclidean norm) of the vector does not change, that is,

\| Q x \|_2 = \| x \|_2.

Furthermore, the standard scalar product of two vectors x and y is invariant under multiplication by an orthogonal matrix, i.e.

\langle Q x, Q y \rangle = \langle x, y \rangle.

Both properties follow directly from the shift property \langle A x, y \rangle = \langle x, A^T y \rangle of the standard scalar product, since \langle Q x, Q y \rangle = \langle Q^T Q x, y \rangle = \langle x, y \rangle. Because of this preservation of lengths and angles, the linear map

f \colon \mathbb{R}^n \to \mathbb{R}^n, \quad f(x) = Q x

represents a congruence of Euclidean space. Conversely, the transformation matrix of every length-preserving (and hence angle-preserving) linear map between two finite-dimensional real inner product spaces is orthogonal, because its column vectors Q e_i satisfy

\langle Q e_i, Q e_j \rangle = \langle e_i, e_j \rangle = \delta_{ij},

where e_i denotes the i-th standard basis vector. Such a linear map is accordingly called an orthogonal map.
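
A quick numerical confirmation of both invariances, assuming NumPy (the orthogonal matrix and the vectors are random choices):

    import numpy as np

    rng = np.random.default_rng(1)
    Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # some orthogonal matrix
    x, y = rng.standard_normal(3), rng.standard_normal(3)

    # Lengths and standard scalar products are preserved under x -> Q x.
    print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # True
    print(np.isclose((Q @ x) @ (Q @ y), x @ y))                  # True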

Determinant

For the determinant of an orthogonal matrix Q,

|\det Q| = 1, \quad \text{i.e.} \quad \det Q \in \{ +1, -1 \},

which follows with the product rule for determinants from

1 = \det I = \det(Q^T Q) = \det(Q^T) \det(Q) = (\det Q)^2.

Thus the determinant of an orthogonal matrix can only take the values plus one or minus one. However, there are also non-orthogonal matrices whose determinant is plus or minus one, for example unimodular matrices. Orthogonal matrices whose determinant is one correspond to rotations, while orthogonal matrices whose determinant is minus one represent rotoreflections.
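
For instance, the rotation and reflection matrices from the examples above have determinants +1 and -1, respectively; a minimal check assuming NumPy:

    import numpy as np

    a = 0.7  # arbitrary angle
    R = np.array([[np.cos(a), -np.sin(a)],   # rotation: determinant +1
                  [np.sin(a),  np.cos(a)]])
    S = np.array([[np.cos(a),  np.sin(a)],   # reflection: determinant -1
                  [np.sin(a), -np.cos(a)]])

    print(np.isclose(np.linalg.det(R), 1.0))   # True
    print(np.isclose(np.linalg.det(S), -1.0))  # True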

Eigenvalues

The eigenvalues of an orthogonal matrix Q are not necessarily all real. However, they have complex absolute value one, i.e. they are of the form

\lambda = e^{it} = \cos t + i \sin t

with t \in \mathbb{R}. Indeed, if x is an eigenvector associated with \lambda, then by the length preservation and the absolute homogeneity of a norm,

\| x \| = \| Q x \| = \| \lambda x \| = |\lambda| \, \| x \|,

and therefore |\lambda| = 1. An orthogonal matrix therefore has at most the real eigenvalues \pm 1. The complex eigenvalues always occur in complex-conjugate pairs, that is, with \lambda, its conjugate \bar{\lambda} is also an eigenvalue.
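
The following sketch, assuming NumPy, samples an orthogonal matrix and confirms that its spectrum lies on the complex unit circle and is closed under conjugation:

    import numpy as np

    rng = np.random.default_rng(2)
    Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # some orthogonal matrix

    lam = np.linalg.eigvals(Q)
    print(np.allclose(np.abs(lam), 1.0))  # all eigenvalues have absolute value 1
    # The multiset of eigenvalues equals that of their conjugates.
    print(np.allclose(np.sort_complex(lam), np.sort_complex(lam.conj())))  # True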

Diagonalizability

An orthogonal matrix Q is normal, that is to say,

Q^T Q = Q Q^T

holds, and it is therefore unitarily diagonalizable over the complex numbers. By the spectral theorem there is namely a unitary matrix U such that

U^H Q U = D

holds, where U^H is the conjugate transpose of U and D is a diagonal matrix containing the eigenvalues of Q. The column vectors of U are then pairwise orthonormal eigenvectors of Q. In particular, the eigenspaces of an orthogonal matrix are pairwise orthogonal.

However, an orthogonal matrix is in general not diagonalizable over the real numbers. There is, however, an orthogonal matrix S such that

S^T Q S = R

is a block diagonal matrix in which each block is either a rotation matrix of size 2 \times 2 or consists of the single number 1 or -1.
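
A numerical illustration of the unitary diagonalization, assuming NumPy; for a randomly sampled orthogonal matrix the eigenvalues are generically distinct, so the eigenvector matrix returned by eig is (numerically) unitary:

    import numpy as np

    rng = np.random.default_rng(3)
    Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # some orthogonal matrix

    lam, U = np.linalg.eig(Q)  # Q U = U diag(lam), computed over the complex numbers
    # With distinct eigenvalues, the eigenvectors of a normal matrix are orthonormal.
    print(np.allclose(U.conj().T @ U, np.eye(4)))         # True
    # Unitary diagonalization: U^H Q U equals the diagonal matrix of eigenvalues.
    print(np.allclose(U.conj().T @ Q @ U, np.diag(lam)))  # True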

Norms

The spectral norm of an orthogonal matrix Q is

\| Q \|_2 = \max_{x \neq 0} \frac{\| Q x \|_2}{\| x \|_2} = 1.

For the Frobenius norm, with the Frobenius inner product \langle \cdot , \cdot \rangle_F, correspondingly

\| Q \|_F = \sqrt{\langle Q, Q \rangle_F} = \sqrt{\operatorname{tr}(Q^T Q)} = \sqrt{\operatorname{tr}(I)} = \sqrt{n}.

Multiplication by an orthogonal matrix preserves both the spectral norm and the Frobenius norm of a given matrix A, because

\| Q A \|_2 = \| A \|_2

and

\| Q A \|_F = \sqrt{\operatorname{tr}((Q A)^T (Q A))} = \sqrt{\operatorname{tr}(A^T A)} = \| A \|_F.

This means that the condition number of a matrix with respect to these norms is preserved under multiplication by an orthogonal matrix.
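
Both norm identities and the invariance under orthogonal multiplication can be checked directly, assuming NumPy:

    import numpy as np

    rng = np.random.default_rng(4)
    n = 4
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # some orthogonal matrix
    A = rng.standard_normal((n, n))

    print(np.isclose(np.linalg.norm(Q, 2), 1.0))             # spectral norm 1
    print(np.isclose(np.linalg.norm(Q, 'fro'), np.sqrt(n)))  # Frobenius norm sqrt(n)
    # Multiplication by Q changes neither norm of A.
    print(np.isclose(np.linalg.norm(Q @ A, 2), np.linalg.norm(A, 2)))          # True
    print(np.isclose(np.linalg.norm(Q @ A, 'fro'), np.linalg.norm(A, 'fro')))  # True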

Orthogonal matrices as a group

The set of invertible real matrices of fixed size n forms, with matrix multiplication as the operation, a group, the general linear group, whose neutral element is the identity matrix I. The orthogonal matrices form a subgroup of the general linear group, the orthogonal group. The product of two orthogonal matrices Q_1 and Q_2 is namely again orthogonal, because

(Q_1 Q_2)^T (Q_1 Q_2) = Q_2^T Q_1^T Q_1 Q_2 = Q_2^T Q_2 = I.

Furthermore, the inverse of an orthogonal matrix Q is also orthogonal, since

(Q^{-1})^T Q^{-1} = (Q^T)^T Q^T = Q Q^T = I.

The orthogonal matrices with determinant one, that is, the rotation matrices, in turn form a subgroup of the orthogonal group, the rotation group (or special orthogonal group). The orthogonal matrices with determinant minus one, that is, the rotoreflections, do not form a subgroup of the orthogonal group, because they lack the neutral element; they merely form a coset.
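
The two subgroup properties, closure under products and under inverses, are easy to verify numerically; a sketch assuming NumPy:

    import numpy as np

    rng = np.random.default_rng(5)
    Q1, _ = np.linalg.qr(rng.standard_normal((3, 3)))
    Q2, _ = np.linalg.qr(rng.standard_normal((3, 3)))

    def is_orthogonal(M):
        return np.allclose(M.T @ M, np.eye(M.shape[0]))

    print(is_orthogonal(Q1 @ Q2))            # products stay orthogonal
    print(is_orthogonal(np.linalg.inv(Q1)))  # inverses stay orthogonal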

Use

Systems of linear equations

The solution of linear systems of equations of the form

Q x = b

with an orthogonal matrix Q and a right-hand side b can be computed numerically efficiently as

x = Q^T b.

Determining the solution thus requires only a matrix-vector multiplication, which can be performed with an effort of order O(n^2). In comparison, solving a general linear system, for example by means of Gaussian elimination, requires an effort of order O(n^3). This advantage is exploited, for example, in the (real) discrete Fourier transform and the discrete cosine transform.
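
A minimal sketch of this shortcut, assuming NumPy; the transpose-based solution agrees with the general-purpose solver:

    import numpy as np

    rng = np.random.default_rng(6)
    n = 4
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthogonal coefficient matrix
    b = rng.standard_normal(n)

    x = Q.T @ b  # O(n^2) matrix-vector product instead of O(n^3) elimination
    print(np.allclose(Q @ x, b))                  # x indeed solves Q x = b
    print(np.allclose(x, np.linalg.solve(Q, b)))  # matches the general solver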

Matrix decompositions

A further application of orthogonal matrices is the QR decomposition, which represents a given matrix A as the product

A = Q R

of an orthogonal matrix Q and an upper triangular matrix R. The matrix Q can be constructed by means of Givens rotations, which correspond to rotations, or Householder transformations, which correspond to reflections. QR decompositions are used in numerical analysis for solving ill-conditioned, overdetermined, or underdetermined systems of linear equations. Another field of application is the solution of eigenvalue problems with the QR algorithm.
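
As an illustration, assuming NumPy, the following sketch computes a (reduced) QR decomposition of an overdetermined matrix and uses it to solve a least-squares problem:

    import numpy as np

    rng = np.random.default_rng(7)
    A = rng.standard_normal((5, 3))  # overdetermined: 5 equations, 3 unknowns
    b = rng.standard_normal(5)

    Q, R = np.linalg.qr(A)  # Q has orthonormal columns, R is upper triangular
    print(np.allclose(A, Q @ R))  # True

    # Least-squares solution of A x ~ b from the triangular system R x = Q^T b.
    x = np.linalg.solve(R, Q.T @ b)
    print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # True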

Using the singular value decomposition, any real matrix A can be represented as a product

A = U \Sigma V^T

of an orthogonal matrix U, a diagonal matrix \Sigma, and the transpose of another orthogonal matrix V. The diagonal entries of \Sigma are then the singular values of A. The singular value decomposition is used, for example, in geometry for the principal axis transformation of quadrics and in statistics for the principal component analysis of multivariate data.
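
A short sketch, assuming NumPy, that computes the decomposition of a rectangular matrix and checks the orthogonality of both factors:

    import numpy as np

    rng = np.random.default_rng(8)
    A = rng.standard_normal((4, 3))

    U, s, Vt = np.linalg.svd(A)  # full SVD: A = U Sigma V^T
    Sigma = np.zeros((4, 3))
    Sigma[:3, :3] = np.diag(s)   # embed the singular values in a 4x3 diagonal matrix

    print(np.allclose(A, U @ Sigma @ Vt))     # True
    print(np.allclose(U.T @ U, np.eye(4)))    # U is orthogonal
    print(np.allclose(Vt @ Vt.T, np.eye(3)))  # V is orthogonal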
