An important goal is to describe the fixed points, or steady states, of a given dynamical system; these are values of the variable that do not change over time. Some of these fixed points are attractive, meaning that if the system starts out in a nearby state, it will converge towards the fixed point.
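As a minimal sketch of an attractive fixed point (the specific map is my choice, not from the text): repeatedly applying x → cos(x) drives any starting value toward the point where cos(x) = x, the so-called Dottie number.

```python
import math

def iterate(f, x0, n):
    """Apply the map f to x0, n times, and return the final state."""
    x = x0
    for _ in range(n):
        x = f(x)
    return x

# x -> cos(x) has an attractive fixed point near 0.739 (the Dottie number):
# from a nearby start, iteration converges to it.
x = iterate(math.cos, 0.5, 100)
print(x)  # ≈ 0.739085
print(abs(math.cos(x) - x) < 1e-9)  # at a fixed point, f(x) = x
```

Starting from other values in the basin of attraction (here, any real number) gives the same limit, which is what makes the fixed point attractive rather than merely stationary.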
Similarly, one is interested in periodic points, states of the system which repeat themselves after several time steps. Periodic points can also be attractive. Sarkovskii's theorem is an interesting statement about the number of periodic points of a one-dimensional discrete dynamical system.
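A sketch of an attractive periodic point, using the logistic map x → r·x·(1 − x) (a standard one-dimensional example, not named in the text): for r = 3.2 the orbit settles onto a 2-cycle, a pair of states the system visits alternately.

```python
def logistic(x, r=3.2):
    return r * x * (1 - x)

# Burn in: after many steps the orbit settles onto the attracting 2-cycle.
x = 0.3
for _ in range(1000):
    x = logistic(x)

a, b = x, logistic(x)
print(a, b)  # the two points of the cycle (≈ 0.513 and ≈ 0.799, in some order)
print(abs(logistic(b) - a) < 1e-9)  # period 2: f(f(a)) = a
print(abs(a - b) > 0.1)             # but f(a) != a, so a is not a fixed point
```

The cycle is attractive in the same sense as an attractive fixed point: nearby initial states converge to it, which is why a long "burn-in" suffices to find it numerically.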
Even simple nonlinear dynamical systems often exhibit seemingly random, practically unpredictable behavior that has been called chaos. The branch of dynamical systems which deals with the precise definition and investigation of chaos is called chaos theory.
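The unpredictability can be sketched as sensitive dependence on initial conditions, again using the logistic map (with r = 4, a standard chaotic case assumed here, not taken from the text): two initial states differing by 10⁻¹⁰ soon follow completely different trajectories.

```python
def logistic4(x):
    return 4 * x * (1 - x)

# Two nearly identical initial states.
x, y = 0.2, 0.2 + 1e-10

diffs = []
for _ in range(60):
    x, y = logistic4(x), logistic4(y)
    diffs.append(abs(x - y))

print(diffs[0])    # still tiny: the gap starts near 1e-10
print(max(diffs))  # grows to order 1 within a few dozen steps
```

The gap roughly doubles each step, so even a measurement error at the limit of floating-point precision makes long-term prediction impossible; this is the hallmark of chaos.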
Informally, observing such a system often passes through stages:
1: A simple rule is at work.
2: More complicated behavior is added.
3: The complicated parts repeat many times, forming a new simplicity.
4: That regularity breaks down again into a complicated path.
5: Which then once more looks simple.
And perhaps so on, ad infinitum.
Examples