In mathematics and especially linear algebra, an n-by-n matrix A is called invertible or non-singular if there exists another n-by-n matrix B such that
- AB = BA = I_{n},
where I_{n} denotes the n-by-n identity matrix and the multiplication used is ordinary matrix multiplication. If this is the case, then the matrix B is uniquely determined by A and is called the inverse of A, denoted by A^{−1}. A square matrix that is not invertible is called singular. While the most common case is that of matrices over the real or complex numbers, all these definitions can be given for matrices over any ring.
Invertible Matrix Theorem
Let A be a square n by n matrix over a field K (for example the field R of real numbers). The following statements are equivalent and must all be true for A to be invertible:
- A is row equivalent to the n by n identity matrix I_{n}
- A has n pivot positions
- det A ≠ 0
- rank A = n
- The equation Ax = 0 has only the trivial solution x = 0 (i.e. Nul A = {0}).
- The equation Ax = b has at most one solution for each b in K^{n}
- The equation Ax = b has at least one solution for each b in K^{n}
- The equation Ax = b has exactly one solution for each b in K^{n}
- The columns of A are linearly independent.
- The columns of A span K^{n} (i.e. Col A = K^{n})
- The columns of A form a basis of K^{n}
- The linear transformation x |-> Ax from K^{n} to K^{n} is one-to-one
- The linear transformation x |-> Ax from K^{n} to K^{n} is onto
- The linear transformation x |-> Ax from K^{n} to K^{n} is bijective
- There is an n by n matrix B such that BA = I_{n}
- There is an n by n matrix B such that AB = I_{n}
- The transpose A^{T} is an invertible matrix.
- The number 0 is not an eigenvalue of A
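Several of the conditions above can be checked numerically for a real matrix. The following is a minimal sketch using NumPy; the helper name and the tolerance are assumptions for illustration, not part of the theorem, and floating-point rank and determinant tests are only approximate:

```python
import numpy as np

def check_invertibility_conditions(A, tol=1e-10):
    # Hypothetical helper: test a few of the equivalent conditions from
    # the invertible matrix theorem, up to a numerical tolerance.
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    assert A.shape == (n, n), "A must be square"
    return {
        "det A != 0": abs(np.linalg.det(A)) > tol,
        "rank A = n": np.linalg.matrix_rank(A, tol=tol) == n,
        "0 not an eigenvalue": np.min(np.abs(np.linalg.eigvals(A))) > tol,
    }

A = np.array([[2.0, 1.0], [1.0, 1.0]])   # det = 1: invertible
S = np.array([[1.0, 2.0], [2.0, 4.0]])   # det = 0: singular
print(check_invertibility_conditions(A))  # every condition True
print(check_invertibility_conditions(S))  # every condition False
```

For an invertible matrix all the conditions hold together, and for a singular one they all fail together, which is exactly what the equivalence asserts.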
More generally, let A be a square n by n matrix over a commutative ring. Then A is invertible if and only if det A is a unit in that ring.
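As a concrete illustration of the ring condition (my example, not from the text): over the ring Z of integers the units are ±1, so an integer matrix has an integer inverse exactly when its determinant is ±1:

```python
import numpy as np

A = np.array([[2, 1], [1, 1]])   # det = 1, a unit in Z
B = np.array([[2, 0], [0, 1]])   # det = 2, invertible over Q but not over Z

print(round(np.linalg.det(A)))   # 1
print(np.linalg.inv(A))          # entries are integers: [[1, -1], [-1, 2]]

print(round(np.linalg.det(B)))   # 2
print(np.linalg.inv(B))          # contains 1/2, which is not an integer
```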
Further properties and facts
To check whether a given matrix is invertible, and to compute the inverse in small examples, one typically uses the Gauss-Jordan elimination algorithm. Other methods are explained under matrix inversion.
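The Gauss-Jordan approach can be sketched as follows: augment A with the identity, row-reduce the left half to I_{n}, and read the inverse off the right half. This is an assumed minimal implementation (with partial pivoting), not an optimized one:

```python
import numpy as np

def gauss_jordan_inverse(A, tol=1e-12):
    # Row-reduce the augmented matrix [A | I] until the left half is I;
    # the right half is then A^{-1}. Raises if a pivot is numerically zero.
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])
    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))  # partial pivoting
        if abs(M[pivot, col]) < tol:
            raise ValueError("matrix is singular")
        M[[col, pivot]] = M[[pivot, col]]   # swap the pivot row into place
        M[col] /= M[col, col]               # scale the pivot to 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]  # clear the rest of the column
    return M[:, n:]

A = np.array([[2.0, 1.0], [1.0, 1.0]])
print(gauss_jordan_inverse(A))  # [[ 1. -1.], [-1.  2.]]
```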
The inverse of an invertible matrix A is itself invertible, with
- (A^{−1})^{−1} = A.
The product of two invertible matrices A and B of the same size is again invertible, with the inverse given by
- (AB)^{−1} = B^{−1}A^{−1}
(note that the order of the factors is reversed). As a consequence, the set of invertible n-by-n matrices forms a group, known as the general linear group GL(n).
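The reversed-order rule is easy to verify numerically (a sketch; the matrices here are arbitrary random examples):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))  # random real matrices are almost surely
B = rng.standard_normal((3, 3))  # invertible (see below)

lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)  # note: B^{-1} first
print(np.allclose(lhs, rhs))  # True
```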
As a rule of thumb, "almost all" matrices are invertible. Over the field of real numbers, this can be made precise as follows: the set of singular n-by-n matrices, considered as a subset of R^{n×n}, is a null set, i.e. has Lebesgue measure zero. Intuitively, this means that if you pick a random square matrix over the reals, the probability that it is singular is zero. The reason is that the singular matrices form the zero set of the polynomial function given by the determinant, and the zero set of a nonzero polynomial has measure zero.
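This can be illustrated empirically (a sketch only; floating point cannot detect exact measure-zero events, so a small tolerance stands in for "determinant equals zero"):

```python
import numpy as np

# Sample many random real matrices and check that none is numerically
# singular; the seed and sample size are arbitrary choices.
rng = np.random.default_rng(42)
dets = [np.linalg.det(rng.standard_normal((3, 3))) for _ in range(1000)]
print(min(abs(d) for d in dets) > 1e-12)  # virtually always True
```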
A matrix with entries from some commutative ring is invertible if and only if its determinant is invertible as an element of that ring.
Generalizations
Some of the properties of inverse matrices are shared by pseudoinverses, which can be defined for every matrix, even non-square ones.
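For instance, the Moore-Penrose pseudoinverse (available in NumPy as np.linalg.pinv) exists for any matrix and agrees with the ordinary inverse when the matrix happens to be square and invertible:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # 3x2, not square
A_pinv = np.linalg.pinv(A)              # shape (2, 3)
print(np.allclose(A @ A_pinv @ A, A))   # True: a defining Penrose identity

B = np.array([[2.0, 1.0], [1.0, 1.0]])  # square and invertible
print(np.allclose(np.linalg.pinv(B), np.linalg.inv(B)))  # True
```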