Linear algebra

Linear algebra is the branch of mathematics concerned with the study of vectors, vector spaces, linear transformations, and systems of linear equations. Vector spaces are a central theme in modern mathematics; thus, linear algebra is widely used in both abstract algebra and functional analysis. Linear algebra also has a concrete representation in analytic geometry. It has extensive applications in the natural sciences and the social sciences.

Linear algebra had its beginnings in the study of vectors in Cartesian 2-space and 3-space. A vector, here, is a directed line segment, characterized by both its length (or magnitude) and its direction. Vectors can then be used to represent certain physical entities such as forces; they can be added together and multiplied by scalars, thus forming the first example of a real vector space.
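The two vector-space operations just mentioned, addition and scalar multiplication, can be sketched in a few lines of Python; the function names and the sample force vectors below are illustrative choices, not part of the article:

```python
# Vectors in 3-space represented as tuples of components.
def add(u, v):
    """Componentwise sum of two vectors of equal dimension."""
    return tuple(a + b for a, b in zip(u, v))

def scale(c, u):
    """Multiply every component of vector u by the scalar c."""
    return tuple(c * a for a in u)

# Two hypothetical forces acting on a body:
force1 = (1.0, 2.0, 0.0)
force2 = (3.0, -1.0, 4.0)

print(add(force1, force2))   # resultant force: (4.0, 1.0, 4.0)
print(scale(2.0, force1))    # doubled force: (2.0, 4.0, 0.0)
```

These two operations, together with their usual algebraic laws, are exactly what makes the set of such tuples a real vector space.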

Linear algebra has since been extended to consider n-space, since most of the useful results from 2- and 3-space carry over to n-dimensional space. Although vectors in n-space cannot easily be visualized, such vectors, or n-tuples, are useful for representing data. Since an n-tuple is an ordered list of n components, data can be summarized and manipulated efficiently in this framework. For example, in economics one can use 8-dimensional vectors, or 8-tuples, to represent the Gross National Product of 8 countries. One can display the GNP of 8 countries for a particular year, where the countries' order is fixed, for example as (United States, United Kingdom, France, Germany, Spain, India, Japan, Australia), by using a vector (v1, v2, v3, v4, v5, v6, v7, v8) in which each country's GNP appears in its respective position.
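The GNP example above can be made concrete in Python. The GNP figures used here are placeholder numbers chosen purely for illustration, not real data; only the country ordering comes from the article:

```python
# The fixed ordering determines which component holds which GNP.
countries = ("United States", "United Kingdom", "France", "Germany",
             "Spain", "India", "Japan", "Australia")

# Hypothetical GNP figures (in trillions), one per country,
# listed in the same order as `countries`:
gnp = (21.4, 2.8, 2.7, 3.8, 1.4, 2.9, 5.1, 1.4)

# Component i of the vector is the GNP of countries[i]:
print(dict(zip(countries, gnp))["Japan"])  # -> 5.1
```

Because the order is fixed once and for all, the 8-tuple alone carries all the information; the country names need not be repeated with each data point.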

A vector space (or linear space), as a purely abstract concept about which we prove theorems, is part of abstract algebra, and well integrated into this field. Some striking examples of this are the group of invertible linear maps or matrices, and the ring of linear maps of a vector space. Linear algebra also plays an important part in analysis, notably, in the description of higher order derivatives in vector analysis and the study of tensor products and alternating maps.

A vector space is defined over a field, such as the field of real numbers or the field of complex numbers. Linear operators map elements of one vector space to another (or to itself) in a manner compatible with the addition and scalar multiplication defined on the vector space(s). The set of all such transformations is itself a vector space. If a basis for a vector space is fixed, every linear transformation can be represented by a table of numbers called a matrix. The detailed study of the properties of matrices and of algorithms acting on them, including determinants and eigenvectors, is considered part of linear algebra.
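As a small illustration of these ideas, the following sketch uses NumPy (an assumption of this example, not something the article prescribes) to represent a linear map on R^2 as a matrix in the standard basis, then computes its determinant and eigenvalues:

```python
import numpy as np

# The linear map (x, y) -> (2x + y, x + 2y) on R^2, written as a
# matrix once the standard basis is fixed.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

v = np.array([1.0, 1.0])
print(A @ v)             # applying the transformation to (1, 1)

print(np.linalg.det(A))  # determinant: 2*2 - 1*1 = 3
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)       # 3 and 1 for this matrix (order may vary)
```

Each column of `eigenvectors` is an eigenvector of `A`: a direction that the transformation merely stretches by the corresponding eigenvalue.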

But, to begin at the beginning, one has to define some "elementary" objects and properties on which linear algebra is built, such as vectors, scalars, vector spaces, and linear transformations, and look at some examples.

Importance of linear algebra

One can say quite simply that the linear problems of mathematics, those that exhibit linearity in their behaviour, are the ones most likely to be solved. For example, differential calculus does a great deal with linear approximation to functions. The difference from non-linear problems is very important in practice.

The general method of finding a linear way to look at a problem, expressing it in terms of linear algebra, and solving it, if need be by matrix calculations, is one of the most generally applicable methods in mathematics.
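The endpoint of that method is often the solution of a system of linear equations by matrix calculation. A minimal sketch, again using NumPy as an assumed tool and a small made-up system:

```python
import numpy as np

# The system:  x + 2y = 5
#             3x + 4y = 11
# written in matrix form A @ x = b.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 11.0])

x = np.linalg.solve(A, b)
print(x)  # solution: x = 1, y = 2
```

Once a problem has been cast in the form A x = b, standard matrix algorithms (Gaussian elimination and its refinements) solve it mechanically.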

Where next?

Since linear algebra is a successful theory, its methods have been developed in other parts of mathematics. In module theory one replaces the field of scalars by a ring. In multilinear algebra one deals with the 'several variables' problem of mappings linear in each of a number of different variables, inevitably leading to the tensor concept. In the spectral theory of operators, one gains control of infinite-dimensional matrices by applying mathematical analysis in a theory that is not purely algebraic. In all these cases the technical difficulties are much greater.



All Wikipedia text is available under the terms of the GNU Free Documentation License

 