A dynamical system is a deterministic process in which a variable's value changes over time according to a well-defined rule that involves only the variable's current value. Dynamical systems are studied in the branch of mathematics known as dynamical systems and chaos theory.
A dynamical system is called discrete if time is measured in discrete steps; such systems are modeled as recurrence relations, as for instance in the logistic map
- x[n+1] = 4 x[n] (1 - x[n])
where n denotes the discrete time step and x is the variable changing over time. If time is measured continuously, the resulting continuous dynamical systems are expressed as ordinary differential equations, for instance
- dx/dt = 4 x (1-x)
where x is the variable that changes with time t.
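Both example rules can be simulated in a few lines. The following is a minimal sketch: the starting value 0.2, the end time, and the forward-Euler step size h are illustrative choices, not part of the definitions.

```python
def logistic_map(x0, n_steps):
    """Iterate the discrete rule x[n+1] = 4 x[n] (1 - x[n])."""
    xs = [x0]
    for _ in range(n_steps):
        xs.append(4 * xs[-1] * (1 - xs[-1]))
    return xs

def logistic_ode(x0, t_end, h=0.001):
    """Approximate dx/dt = 4 x (1 - x) with the forward Euler method."""
    x, t = x0, 0.0
    while t < t_end:
        x += h * 4 * x * (1 - x)
        t += h
    return x

print(logistic_map(0.2, 5))   # first few values of the discrete trajectory
print(logistic_ode(0.2, 1.0)) # approximate state of the continuous system at t = 1
```

The discrete trajectory is exact (up to floating-point rounding), whereas the Euler integration only approximates the continuous solution; a smaller step size h gives a better approximation.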
The changing variable x is often a real number, but it can also be a vector in R^k.
We distinguish between linear dynamical systems and nonlinear dynamical systems. In a linear system, the right-hand side of the equation is an expression that depends linearly on x, as in
- x[n+1] = 3 x[n].
If two solutions to a linear system are given, then their sum is also a solution (the "superposition principle"). In general, the solutions form a vector space, which allows the use of linear algebra and simplifies the analysis significantly. For linear continuous systems, the Laplace transform can also be used to convert the differential equation into an algebraic equation.
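The superposition principle can be checked numerically for the linear example x[n+1] = 3 x[n], and seen to fail for the nonlinear logistic map. The initial values below are arbitrary illustrative choices.

```python
def iterate(rule, x0, n_steps):
    """Return the trajectory x[0], x[1], ..., x[n_steps] of x[n+1] = rule(x[n])."""
    xs = [x0]
    for _ in range(n_steps):
        xs.append(rule(xs[-1]))
    return xs

linear = lambda x: 3 * x
a = iterate(linear, 1.0, 5)
b = iterate(linear, 2.5, 5)
s = iterate(linear, 1.0 + 2.5, 5)
# For the linear rule, the sum of two solutions is again a solution.
assert all(abs((ai + bi) - si) < 1e-9 for ai, bi, si in zip(a, b, s))

logistic = lambda x: 4 * x * (1 - x)
a2 = iterate(logistic, 0.1, 5)
b2 = iterate(logistic, 0.2, 5)
s2 = iterate(logistic, 0.3, 5)
# For the nonlinear rule, superposition fails already at the first step.
assert any(abs((ai + bi) - si) > 1e-9 for ai, bi, si in zip(a2, b2, s2))
```

Here the first step already shows the failure for the logistic map: 4(0.1)(0.9) + 4(0.2)(0.8) = 1.0, while 4(0.3)(0.7) = 0.84.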
The two examples given earlier are nonlinear systems. They are much harder to analyze and often exhibit a phenomenon known as chaos, an extreme sensitivity to initial conditions that makes long-term behavior effectively unpredictable; see also nonlinearity.
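This sensitivity can be observed directly in the logistic map: two trajectories started a tiny distance apart separate rapidly. A minimal sketch, where the starting value 0.3 and the perturbation 1e-10 are arbitrary:

```python
def logistic(x):
    """One step of the logistic map x[n+1] = 4 x[n] (1 - x[n])."""
    return 4 * x * (1 - x)

# Two trajectories whose starting points differ by only 1e-10.
x, y = 0.3, 0.3 + 1e-10
gaps = []
for _ in range(60):
    x, y = logistic(x), logistic(y)
    gaps.append(abs(x - y))

# The gap starts negligibly small but grows to order one within a few
# dozen iterations, after which the two trajectories are unrelated.
assert gaps[0] < 1e-9
assert max(gaps) > 0.1
```

The system is still deterministic: rerunning with exactly the same initial value reproduces exactly the same trajectory. The unpredictability arises because no physical measurement pins down the initial value with infinite precision.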