Orthogonal matrix

In linear algebra, an orthogonal matrix is a square matrix G whose transpose is its inverse, i.e.,

GG^T = G^T G = I_n.

This definition can be given for matrices with entries from any field, but the most common case is the one of matrices with real entries, and only that case will be considered in the rest of this article.

A real square matrix is orthogonal if and only if its columns form an orthonormal basis of R^n with respect to the ordinary Euclidean dot product, which is the case if and only if its rows form an orthonormal basis of R^n.
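As an illustrative sketch (using NumPy, which is not part of the article), the defining property GG^T = I and the orthonormality of the columns can be checked numerically on a standard example, a rotation about the z-axis:

```python
import numpy as np

# A 3-by-3 rotation about the z-axis: a standard example of an orthogonal matrix.
theta = 0.4
G = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

# Defining property: G times its transpose is the identity (and vice versa).
assert np.allclose(G @ G.T, np.eye(3))
assert np.allclose(G.T @ G, np.eye(3))

# Equivalently, the columns are pairwise orthogonal unit vectors.
for i in range(3):
    for j in range(3):
        dot = G[:, i] @ G[:, j]
        assert np.isclose(dot, 1.0 if i == j else 0.0)
```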

Geometrically, orthogonal matrices describe linear transformations of R^n which preserve angles and lengths, such as rotations and reflections. They are compatible with the Euclidean inner product in the following sense: if G is orthogonal and x and y are vectors in R^n, then

<Gx, Gy> = <x, y>.
Conversely, if V is any finite-dimensional real inner product space and f : V → V is a linear map with
<f(x), f(y)> = <x, y>
for all elements x, y of V, then f is described by an orthogonal matrix with respect to any orthonormal basis of V.
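A minimal numerical sketch of this compatibility (NumPy assumed, not part of the article): applying an orthogonal G to both arguments leaves the dot product unchanged.

```python
import numpy as np

# A 2-by-2 rotation matrix, the simplest nontrivial orthogonal matrix.
theta = 1.1
G = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Two arbitrary test vectors.
rng = np.random.default_rng(0)
x = rng.standard_normal(2)
y = rng.standard_normal(2)

# <Gx, Gy> = <x, y>: lengths and angles are preserved.
assert np.isclose((G @ x) @ (G @ y), x @ y)
```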

The inverse of every orthogonal matrix is again orthogonal, as is the matrix product of two orthogonal matrices. This shows that the set of all n-by-n orthogonal matrices forms a group. It is a Lie group of dimension n(n-1)/2 and is called the orthogonal group, denoted by O(n).
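The two closure properties can be spot-checked on a rotation and a reflection (a sketch with NumPy, assumed for illustration):

```python
import numpy as np

def is_orthogonal(M):
    """Check M^T M = I up to floating-point tolerance."""
    return np.allclose(M.T @ M, np.eye(M.shape[0]))

c, s = np.cos(0.3), np.sin(0.3)
rot = np.array([[c, -s], [s, c]])          # a rotation
ref = np.array([[1.0, 0.0], [0.0, -1.0]])  # a reflection

# The group operations stay inside the set of orthogonal matrices.
assert is_orthogonal(rot @ ref)  # closed under products
assert is_orthogonal(rot.T)      # the inverse (= transpose) is orthogonal
```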

The determinant of any orthogonal matrix is 1 or -1. That can be shown as follows:

1 = det(I) = det(GG^T) = det(G) det(G^T) = (det(G))^2.
In three dimensions, the orthogonal matrices with determinant 1 correspond to proper rotations and those with determinant -1 to improper rotations. The set of all orthogonal matrices whose determinant is 1 is a subgroup of O(n) of index 2, the special orthogonal group SO(n).
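The determinant dichotomy can be illustrated with one matrix of each kind (NumPy sketch, not from the article):

```python
import numpy as np

c, s = np.cos(0.6), np.sin(0.6)
rotation   = np.array([[c, -s], [s,  c]])  # proper: determinant +1
reflection = np.array([[c,  s], [s, -c]])  # reflection across a line: determinant -1

# Both are orthogonal, but only the rotation lies in SO(2).
assert np.allclose(rotation.T @ rotation, np.eye(2))
assert np.allclose(reflection.T @ reflection, np.eye(2))
assert np.isclose(np.linalg.det(rotation), 1.0)
assert np.isclose(np.linalg.det(reflection), -1.0)
```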

All eigenvalues of an orthogonal matrix, even the complex ones, have absolute value 1. Eigenvectors for different eigenvalues are orthogonal.
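For a plane rotation the eigenvalues are the complex conjugate pair e^{±iθ}, which indeed lie on the unit circle; a quick check (NumPy assumed for illustration):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# The eigenvalues are the non-real pair e^{i theta}, e^{-i theta}.
eigvals = np.linalg.eigvals(Q)

# Every eigenvalue of an orthogonal matrix has absolute value 1.
assert np.allclose(np.abs(eigvals), 1.0)
```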

If Q is orthogonal, then one can always find an orthogonal matrix P such that

              /R1                  \
              |   R2               |
              |      ...           |
    P^-1QP  = |         Rk         |
              |            ±1      |
              |               ...  |
              \                 ±1 /

where the matrices R1,...,Rk are 2-by-2 rotation matrices. Intuitively, this result means that every orthogonal matrix describes a combination of rotations and reflections. The blocks R1,...,Rk correspond to the pairs of conjugate non-real eigenvalues of Q.
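The statement can be illustrated in the other direction: starting from such a block-diagonal matrix and conjugating by any orthogonal P produces an orthogonal Q with exactly those rotation blocks and ±1 entries as its canonical form (a NumPy sketch, assumed for illustration):

```python
import numpy as np

# Block-diagonal form: one 2-by-2 rotation block R1 and a -1 entry.
theta = 0.9
D = np.zeros((3, 3))
D[:2, :2] = [[np.cos(theta), -np.sin(theta)],
             [np.sin(theta),  np.cos(theta)]]
D[2, 2] = -1.0

# A random orthogonal P, obtained from the QR decomposition of a random matrix.
rng = np.random.default_rng(1)
P, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Conjugation by P (note P^-1 = P^T) yields an orthogonal Q.
Q = P @ D @ P.T

assert np.allclose(Q.T @ Q, np.eye(3))
# Q inherits the eigenvalues of D: e^{±i theta} and -1, all of absolute value 1.
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)
assert np.isclose(np.linalg.det(Q), -1.0)  # the -1 block makes Q improper
```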

If A is an arbitrary m-by-n matrix of rank n (so m ≥ n), we can always write

          / R \
  A  = Q  \ 0 /

where Q is an orthogonal m-by-m matrix and R is an upper triangular n-by-n matrix with positive main diagonal entries. This is known as a QR decomposition of A and can be proved by applying the Gram-Schmidt process to the columns of A. It is useful for numerically solving systems of linear equations and least-squares problems.
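In practice the decomposition is computed by a library routine rather than by Gram-Schmidt; a sketch with NumPy's `numpy.linalg.qr` (note that this routine does not guarantee positive diagonal entries in R, only the factorization itself):

```python
import numpy as np

# A generic 5-by-3 matrix; with probability 1 it has full column rank 3.
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))

# mode='complete' returns the full m-by-m orthogonal Q and the stacked (R; 0) factor.
Q, R = np.linalg.qr(A, mode='complete')

assert np.allclose(Q.T @ Q, np.eye(5))  # Q is orthogonal
assert np.allclose(Q @ R, A)            # the factorization reproduces A
assert np.allclose(R[3:], 0.0)          # the bottom (m - n) rows form the zero block
```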

The complex analogs of orthogonal matrices are the unitary matrices.



All Wikipedia text is available under the terms of the GNU Free Documentation License

 