Recursion is a way of specifying a process by means of itself. More precisely (and to dispel the appearance of circularity in the definition), "complicated" instances of the process are defined in terms of "simpler" instances, and the "simplest" instances are given explicitly.
Examples of mathematical objects often defined recursively are functions and sets.
The canonical example of a recursively defined function is the following definition of the factorial function f(n):
f(0) = 1
f(n) = n · f(n-1)   for any natural number n > 0
Given this definition, also called a recurrence relation, we work out f(3) as follows:
f(3) = 3 · f(3-1) = 3 · f(2) = 3 · 2 · f(2-1) = 3 · 2 · f(1) = 3 · 2 · 1 · f(1-1) = 3 · 2 · 1 · f(0) = 3 · 2 · 1 · 1 = 6
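A brief sketch of this definition in Python (the function name factorial is illustrative, not part of the definition above):

  def factorial(n):
      # Simplest instance, given explicitly: f(0) = 1.
      if n == 0:
          return 1
      # A "complicated" instance defined via a simpler one: f(n) = n * f(n - 1).
      return n * factorial(n - 1)

Calling factorial(3) unwinds exactly as in the calculation above and returns 6.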
Another example is the recursive definition of the Fibonacci numbers, in which each number is the sum of the two preceding ones.
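Sketched the same way in Python, assuming the usual convention F(0) = 0 and F(1) = 1 (the convention is not stated above):

  def fibonacci(n):
      # Two simplest instances given explicitly; every other value is
      # defined in terms of the two preceding, simpler instances.
      if n < 2:
          return n
      return fibonacci(n - 1) + fibonacci(n - 2)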
Here is another, perhaps simpler way to understand recursive processes:
1. Are we done yet? If so, return the result.
2. If not, simplify the problem, solve the simpler problem(s), and assemble the results into a solution for the original problem. Then return that solution.
A common method of simplification is to divide the problem into subproblems of the same type. Such a programming technique is called divide and conquer and is key to the design of many important algorithms, as well as being a fundamental part of dynamic programming.
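As an illustration (not taken from the text above), merge sort is a divide-and-conquer algorithm: it splits a list into two halves, sorts each half recursively, and merges the sorted halves. A Python sketch:

  def merge_sort(xs):
      # Base case: a list of zero or one elements is already sorted.
      if len(xs) <= 1:
          return xs
      mid = len(xs) // 2
      # Divide: solve two subproblems of the same type.
      left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
      # Combine: merge the two sorted halves.
      merged, i, j = [], 0, 0
      while i < len(left) and j < len(right):
          if left[i] <= right[j]:
              merged.append(left[i])
              i += 1
          else:
              merged.append(right[j])
              j += 1
      return merged + left[i:] + right[j:]

For example, merge_sort([3, 1, 2]) returns [1, 2, 3].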
Virtually all programming languages in use today allow the direct specification of recursive functions and procedures. When such a function is called, the computer keeps track of the various instances of the function by using a stack. Conversely, every recursive function can be transformed into an iterative function by using a stack.
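A sketch of that transformation in Python (the names are hypothetical): the recursive factorial rewritten as an iterative function that keeps its own explicit stack instead of relying on the call stack.

  def factorial_iterative(n):
      stack = []
      # Push the work that the recursive calls would have left pending.
      while n > 0:
          stack.append(n)
          n -= 1
      # Unwind the stack, combining results as the recursion would on return.
      result = 1
      while stack:
          result *= stack.pop()
      return result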
Any function that can be evaluated by a computer can be expressed in terms of recursive functions, without use of iteration.
Indeed, some languages designed for logic programming and functional programming provide recursion as the only means of repetition directly available to the programmer. Such languages generally make tail recursion as efficient as iteration, letting programmers express other repetition structures (such as Scheme's map and for) in terms of recursion.
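For comparison, a tail-recursive factorial sketched in Python (the accumulator parameter acc is an assumption of this sketch, and Python itself does not perform the tail-call optimization described above, but the shape of the definition is the same as in languages that do):

  def factorial_tail(n, acc=1):
      # The recursive call is the last action taken ("tail position"),
      # so no pending multiplication must be remembered across the call.
      return acc if n == 0 else factorial_tail(n - 1, acc * n)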
Recursion is deeply embedded in the theory of computation, with the theoretical equivalence of recursive functions and Turing machines at the foundation of ideas about the universality of the modern computer.
John McCarthy's 91 function is another example of a recursively defined function.
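The 91 function returns n - 10 for inputs greater than 100 and otherwise calls itself on its own result; for every n <= 100 the value is 91. A Python sketch (the function name is illustrative):

  def mccarthy_91(n):
      # Nested recursion: the argument of the outer recursive call is
      # itself produced by a recursive call.
      return n - 10 if n > 100 else mccarthy_91(mccarthy_91(n + 11))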
In set theory, the recursion theorem guarantees that recursively defined functions exist. Given a set X, an element a of X and a function f : X -> X, the theorem states that there is a unique function F : N -> X (where N denotes the set of natural numbers including zero) such that
F(0) = a
F(n + 1) = f(F(n)) for any natural number n.
Proof of uniqueness
Take two functions F and G from N to X such that:
F(0) = a
G(0) = a
F(n + 1) = f(F(n))
G(n + 1) = f(G(n))
where a is an element of X. We want to prove that F = G. Two functions with the same domain and codomain are equal if they agree on every argument, so it suffices to show that F(n) = G(n) for every natural number n. This follows by induction on n: F(0) = a = G(0), and if F(n) = G(n), then F(n + 1) = f(F(n)) = f(G(n)) = G(n + 1).
Proof of existence
We define F on 0, then on 1, and so on. Define F(0) = a. Now, assuming that F has been defined on all numbers less than or equal to n, define F(n + 1) = f(F(n)). Since F(n) has already been defined and f is a function from X to X, the right-hand side is indeed a member of X, so this is a valid definition. Informally, F(n) is the result of applying f to a, n times.
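A Python sketch of the function whose existence the theorem guarantees (the names recursively_defined and F are illustrative): F(n) is the result of applying f to a, n times.

  def recursively_defined(f, a):
      # F(0) = a; F(n + 1) = f(F(n)).
      def F(n):
          return a if n == 0 else f(F(n - 1))
      return F

For example, with f doubling its argument and a = 1, recursively_defined(lambda x: 2 * x, 1) yields the function n -> 2**n.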
The canonical example of a recursively defined set is the natural numbers:
0 is in N
if n is in N, then n + 1 is in N
The natural numbers can be defined as the smallest set satisfying these two properties.
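A minimal sketch of this style of definition in Python (the class names Zero and Succ are illustrative): every natural number is either zero or the successor of a natural number.

  class Zero:
      # The explicitly given, simplest element of the set.
      pass

  class Succ:
      # If n is in the set, then so is its successor.
      def __init__(self, pred):
          self.pred = pred

  def to_int(n):
      # Convert a recursively built numeral back to an ordinary integer.
      return 0 if isinstance(n, Zero) else 1 + to_int(n.pred)

  three = Succ(Succ(Succ(Zero())))   # represents 3; to_int(three) == 3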
Another interesting example is the set of all provable propositions in an axiomatic system: the axioms are in the set, and any proposition that can be derived from members of the set by a rule of inference is also in the set.
(Note that determining whether a given object belongs to a recursively defined set is not, in general, an algorithmic task.)