## Encyclopedia > Summation


Addition is one of the basic operations of arithmetic. Addition combines two or more numbers, the summands, into a single number, the sum. (If there are only two terms, these are the augend and addend respectively.) For a definition of addition in the natural numbers, see Addition in N.

When adding finitely many numbers, it doesn't matter how you group the numbers and in which order you add them; you will always get the same result. (See Associativity and Commutativity.) If you add zero to any number, the quantity won't change; zero is the identity element for addition. The sum of any number and its additive inverse (in contexts where such a thing exists) is zero.

If the terms are all written out individually, then addition is written using the plus sign ("+"). Thus, the sum of 1, 2, and 4 is 1 + 2 + 4 = 7.

If the terms are not written out individually, then the sum may be written with an ellipsis to mark out the missing terms. Thus, the sum of all the natural numbers from 1 to 100 is 1 + 2 + ... + 99 + 100.
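This ellipsis sum translates directly into code; a minimal sketch in Python (note that `range` excludes its upper bound, so the endpoint is written as 101):

```python
# Sum of the natural numbers 1 + 2 + ... + 99 + 100.
# range(1, 101) generates 1 through 100 inclusive.
total = sum(range(1, 101))
print(total)  # 5050, matching n(n+1)/2 for n = 100
```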

Alternatively, the sum can be written with the summation symbol, a capital sigma (Σ) from the Greek alphabet. This is defined as:

$\sum_{i=m}^{n} x_{i} = x_{m} + x_{m+1} + x_{m+2} + ... + x_{n-1} + x_{n}.$
The subscript gives the symbol for a dummy variable (i in our case) and its lower value (m); the superscript gives its upper value (n). So for example:
$\sum_{i=2}^{6} i^{2} = 2^{2} + 3^{2} + 4^{2} + 5^{2} + 6^{2} = 90.$
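The sigma expression above maps almost literally onto code; a sketch in Python, where the dummy variable i runs from the lower value 2 through the upper value 6:

```python
# sum_{i=2}^{6} i^2: the dummy variable i runs over 2, 3, 4, 5, 6.
# range(2, 7) yields 2..6, mirroring the lower and upper values of the sigma.
total = sum(i**2 for i in range(2, 7))
print(total)  # 90
```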

One may also consider sums of infinitely many terms; these are called infinite series. Notationally, we would replace n above by the infinity symbol (∞). The sum of such a series is defined as the limit of the sum of the first n terms, as n grows without bound. That is:

$\sum_{i=m}^{\infty} x_{i} := \lim_{n\to\infty} \sum_{i=m}^{n} x_{i}.$
One can similarly replace m with negative infinity, and
$\sum_{i=-\infty}^\infty x_i := \lim_{n\to\infty}\sum_{i=-n}^m x_i + \lim_{n\to\infty}\sum_{i=m+1}^n x_i,$
for some integer m, provided both limits exist.
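A numerical sketch of this definition: the partial sums of a convergent series approach its limit as n grows without bound. The geometric series ∑ (1/2)^i, whose limit is 2, is chosen here purely for illustration:

```python
def partial_sum(n):
    """Sum of the first n + 1 terms of the geometric series sum_{i=0}^{inf} (1/2)**i."""
    return sum(0.5**i for i in range(n + 1))

# The partial sums approach the series' limit, 2, as n grows without bound.
for n in (1, 5, 20, 50):
    print(n, partial_sum(n))
```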

It's possible to add fewer than 2 numbers. If you add the single term x, then the sum is x.

If you add zero terms, then the sum is zero, because zero is the identity for addition. This is known as the empty sum. These degenerate cases are usually only used when the summation notation gives a degenerate result in a special case. For example, if m = n in the definition above, then there is only one term in the sum; if m = n + 1, then there is none.

Many other operations can be thought of as generalised sums. If a single term x appears in a sum n times, then the sum is nx, the result of a multiplication. If n is not a natural number, then the multiplication may still make sense, so that we have a sort of notion of adding a term, say, two and a half times.

A special case is multiplication by -1, which leads to the concept of the additive inverse, and to subtraction, the inverse operation to addition.

The most general version of these ideas is the linear combination, where any number of terms are included in the generalised sum any number of times.

The following are useful identities:

$\sum_{i=1}^{n} i = \frac {n(n+1)}{2};$
$\sum_{i=1}^{n} (2i - 1) = n^2;$
$\sum_{i=0}^{n} i^{2} = \frac{n(n+1)(2n+1)}{6};$
$\sum_{i=0}^{n} i^{3} = \big(\frac{n(n+1)}{2}\big)^{2};$
$\sum_{i=0}^{n} x^{i} = \frac{x^{n+1}-1}{x-1}$ for x ≠ 1 (see geometric series);
$\sum_{i=0}^{\infty} x^{i} = \frac{1}{1-x}$ for |x| < 1;
$\sum_{i=0}^{n} {n \choose i} = 2^{n}$ (see binomial coefficient);
$\sum_{i=0}^{n-1} {i \choose k} = {n \choose k+1}.$
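Each of these identities can be checked numerically. The sketch below verifies them for the sample values n = 12, x = 3, k = 4 (arbitrary choices for the check, not part of the identities); `//` keeps the arithmetic in exact integers:

```python
from math import comb

n, x, k = 12, 3, 4  # sample values chosen for the check

assert sum(range(1, n + 1)) == n * (n + 1) // 2                       # triangular numbers
assert sum(2 * i - 1 for i in range(1, n + 1)) == n**2                # sum of odd numbers
assert sum(i**2 for i in range(n + 1)) == n * (n + 1) * (2 * n + 1) // 6
assert sum(i**3 for i in range(n + 1)) == (n * (n + 1) // 2) ** 2
assert sum(x**i for i in range(n + 1)) == (x**(n + 1) - 1) // (x - 1) # geometric series
assert sum(comb(n, i) for i in range(n + 1)) == 2**n                  # binomial row sum
assert sum(comb(i, k) for i in range(n)) == comb(n, k + 1)            # hockey-stick identity
print("all identities hold for n =", n)
```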

In general, the sum of the first n mth powers is

$\sum_{i=0}^n i^m = \frac{(n+1)^{m+1}}{m+1} + \sum_{k=1}^m\frac{B_k}{m-k+1}{m\choose k}(n+1)^{m-k+1},$
where $B_k$ is the kth Bernoulli number.
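A sketch of this general formula in exact rational arithmetic: the Bernoulli numbers are generated by the standard recurrence (in the convention B₁ = -1/2, which the formula above requires), and the power sum is evaluated term by term. Function names here are illustrative:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Bernoulli numbers B_0..B_n (convention B_1 = -1/2), via the standard recurrence."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        B.append(-sum(comb(m + 1, j) * B[j] for j in range(m)) / (m + 1))
    return B

def power_sum(n, m):
    """sum_{i=0}^{n} i**m, evaluated with the Bernoulli-number formula above."""
    B = bernoulli(m)
    total = Fraction((n + 1) ** (m + 1), m + 1)
    for k in range(1, m + 1):
        total += B[k] * comb(m, k) * Fraction((n + 1) ** (m - k + 1), m - k + 1)
    return total

# Agrees with direct summation, e.g. the square and cube identities above.
print(power_sum(10, 2))  # 385
print(power_sum(10, 3))  # 3025
```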

The following are useful approximations (using Big Theta notation):

$\sum_{i=1}^{n} i^{c} = \Theta(n^{c+1})$ for every real constant c greater than -1;
$\sum_{i=1}^{n} \frac{1}{i} = \Theta(\log{n});$
$\sum_{i=1}^{n} c^{i} = \Theta(c^{n})$ for every real constant c greater than 1;
$\sum_{i=1}^{n} \log(i)^{c} = \Theta(n \cdot \log(n)^{c})$ for every nonnegative real constant c;
$\sum_{i=1}^{n} \log(i)^{c} \cdot i^{d} = \Theta(n^{d+1} \cdot \log(n)^{c})$ for all nonnegative real constants c and d;
$\sum_{i=1}^{n} \log(i)^{c} \cdot i^{d} \cdot b^{i} = \Theta(n^{d} \cdot \log(n)^{c} \cdot b^{n})$ for all nonnegative real constants c and d and every real constant b > 1.
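The harmonic-sum approximation above can be sketched numerically: the difference H_n − ln n converges to the Euler–Mascheroni constant (about 0.5772), so the harmonic sum indeed grows like log n up to an additive constant:

```python
from math import log

def harmonic(n):
    """The harmonic sum H_n = sum_{i=1}^{n} 1/i."""
    return sum(1.0 / i for i in range(1, n + 1))

# H_n - log(n) tends to the Euler-Mascheroni constant (about 0.5772),
# so the harmonic sum grows like log(n) up to an additive constant.
for n in (10**2, 10**4, 10**6):
    print(n, harmonic(n) - log(n))
```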

All Wikipedia text is available under the terms of the GNU Free Documentation License
