In computer science, in the field of numerical analysis, Simpson's rule is a way to approximate a definite integral:
- <math> \int_{a}^{b} f(x) dx</math>
using an interpolating polynomial of higher degree. Simpson's rule belongs to the family of rules derived from the Newton–Cotes formulas. The most common version uses a quadratic polynomial interpolating f at a, (a+b)/2, and b, which gives the polynomial:
- <math> P(x) = f(a) + (x-a)f\left[a,\left( \frac{a+b}{2} \right)\right] + \left(x-\left( \frac{a+b}{2} \right)\right)(x-a)f\left[a,\left( \frac{a+b}{2} \right), b\right] </math>
Integrating this polynomial over [a, b] gives Simpson's rule:
- <math> \int_{a}^{b} f(x) dx \approx S(f) = \frac{b-a}{6}\left[f(a) + 4f\left(\frac{a+b}{2}\right)+f(b)\right]
</math>
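As a quick illustration, the rule is simple to evaluate directly. The following is a minimal Python sketch (the helper name `simpson` is our own choice, not part of the rule itself):

```python
import math

def simpson(f, a, b):
    """Simpson's rule approximation S(f) of the integral of f over [a, b]."""
    c = (a + b) / 2.0
    return (b - a) / 6.0 * (f(a) + 4.0 * f(c) + f(b))

# Example: the integral of sin over [0, pi] is exactly 2.
print(simpson(math.sin, 0.0, math.pi))  # about 2.0944
```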
To derive the rule, we want to write the interpolating polynomial in the form:
- <math>P(x) = \alpha x^2 + \beta x + \gamma</math>
Write the nodes as <math>x_0=a</math>, <math>x_1=\frac{a+b}{2}</math> and <math>x_2=b</math>, with sampled function values <math>f(a)</math>, <math>f\left(\frac{a+b}{2}\right)</math> and <math>f(b)</math>.
Since Simpson's rule uses equidistant points, we know that <math>x_0 < x_1 < x_2</math> and that <math>x_1-x_0 = x_2-x_1</math>. This means we may translate the problem to the interval <math>[-h, h]</math> with nodes <math>-h, 0, h</math>, where
- <math>h \equiv \frac{b-a}{2}</math>
Interpolating the function values at these nodes with the polynomial <math>P</math> gives the equations:
- <math>f(-h) = \alpha h^2 - \beta h + \gamma</math>
- <math>f(0) = \gamma</math>
- <math>f(h) = \alpha h^2 + \beta h + \gamma</math>
Solving for the coefficients yields:
- <math>\alpha = \frac{f(-h) - 2f(0) + f(h)}{2h^2}</math>
- <math>\beta = \frac{f(h) - f(-h)}{2h}</math>
- <math>\gamma = f(0)</math>
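These coefficient formulas are easy to check numerically. The sketch below (the helper name `quadratic_coefficients` and the sample values are ours, chosen only for illustration) reconstructs the sampled values from the computed coefficients:

```python
def quadratic_coefficients(fm, f0, fp, h):
    """Coefficients (alpha, beta, gamma) of P(x) = alpha*x^2 + beta*x + gamma
    interpolating fm = f(-h), f0 = f(0) and fp = f(h)."""
    alpha = (fm - 2.0 * f0 + fp) / (2.0 * h * h)
    beta = (fp - fm) / (2.0 * h)
    gamma = f0
    return alpha, beta, gamma

# Arbitrary sample values at -h, 0, h with h = 0.5.
alpha, beta, gamma = quadratic_coefficients(fm=1.0, f0=4.0, fp=9.0, h=0.5)
P = lambda x: alpha * x * x + beta * x + gamma
print(P(-0.5), P(0.0), P(0.5))  # 1.0 4.0 9.0 -- the original samples
```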
We then integrate the polynomial over <math>[-h, h]</math>:
- <math>\int_{-h}^h P(x) dx =</math>
- <math>\int_{-h}^h \frac{f(-h) - 2f(0) + f(h)}{2h^2}x^2 +
\frac{f(h) - f(-h)}{2h}x +
f(0) dx =</math>
- <math>\left[\frac{f(-h) - 2f(0) + f(h)}{6h^2}x^3 +
\frac{f(h) - f(-h)}{4h}x^2 +
f(0)x \right]_{-h}^h =</math>
- <math>\frac{f(-h) - 2f(0) + f(h)}{6h^2}h^3 +
\frac{f(h) - f(-h)}{4h}h^2 +
f(0)h +</math>
- <math>\frac{f(-h) - 2f(0) + f(h)}{6h^2}h^3 -
\frac{f(h) - f(-h)}{4h}h^2 +
f(0)h =</math>
- <math>\frac{f(-h) - 2f(0) + f(h)}{3h^2}h^3 +
2f(0)h</math>
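Before substituting back, this closed form can be sanity-checked against a brute-force numerical integration of <math>P</math> itself (a rough sketch; the helper names and the midpoint-sum check are our own):

```python
def integral_closed_form(fm, f0, fp, h):
    """The expression derived above for the integral of P over [-h, h]."""
    return (fm - 2.0 * f0 + fp) / (3.0 * h * h) * h ** 3 + 2.0 * f0 * h

def integral_midpoint(fm, f0, fp, h, n=100_000):
    """Crude midpoint-rule integration of P over [-h, h] for comparison."""
    alpha = (fm - 2.0 * f0 + fp) / (2.0 * h * h)
    beta = (fp - fm) / (2.0 * h)
    gamma = f0
    dx = 2.0 * h / n
    total = 0.0
    for i in range(n):
        x = -h + (i + 0.5) * dx
        total += (alpha * x * x + beta * x + gamma) * dx
    return total

print(integral_closed_form(1.0, 4.0, 9.0, 0.5))  # 4.3333...
print(integral_midpoint(1.0, 4.0, 9.0, 0.5))     # about 4.3333
```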
Substituting back the original nodes, with <math>h = \frac{b-a}{2}</math>:
- <math>\frac{f(a) - 2f\left(\frac{a+b}{2}\right) + f(b)}{3\left(\frac{b-a}{2}\right)^2}\left(\frac{b-a}{2}\right)^3 +
2f\left(\frac{a+b}{2}\right) \left(\frac{b-a}{2}\right) =</math>
- <math>\frac{f(a) - 2f\left(\frac{a+b}{2}\right) + f(b)}{3}\left(\frac{b-a}{2}\right) +
2f\left(\frac{a+b}{2}\right) \left(\frac{b-a}{2}\right) = </math>
- <math>\frac{b-a}{6} \left(f(a) - 2f\left(\frac{a+b}{2}\right) + f(b) +
6f\left(\frac{a+b}{2}\right) \right) = </math>
- <math>\frac{b-a}{6} \left(f(a) + 4f\left(\frac{a+b}{2}\right) + f(b) \right)</math>
Q.E.D.
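Because <math>S(f)</math> equals the exact integral of the quadratic interpolant, it reproduces the exact integral whenever f itself is a quadratic (the interpolant is then f). A short sketch illustrating this (the test polynomial is our own choice):

```python
def simpson(f, a, b):
    """Simpson's rule S(f) on [a, b]."""
    c = (a + b) / 2.0
    return (b - a) / 6.0 * (f(a) + 4.0 * f(c) + f(b))

# f(x) = 3x^2 - 2x + 1 has antiderivative F(x) = x^3 - x^2 + x.
f = lambda x: 3.0 * x * x - 2.0 * x + 1.0
F = lambda x: x ** 3 - x ** 2 + x
a, b = 1.0, 4.0
print(simpson(f, a, b))  # 51.0
print(F(b) - F(a))       # 51.0 -- exact agreement
```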
To examine the accuracy of the rule, take <math>c = \frac{a+b}{2}</math>, so
- <math> \int_{a}^{b} f(x) dx = \int_{a}^{c} f(x) dx + \int_{c}^{b} f(x) dx</math>
Using integration by parts we get:
- <math> \int_{a}^{c} f(x) dx = f(x)(x-\alpha)|^c_a - \int_{a}^{c} f'(x)(x-\alpha) dx </math>
and
- <math>\int_{c}^{b} f(x) dx = f(x)(x-\beta)|^b_c - \int_{c}^{b} f'(x)(x-\beta) dx</math>
where α and β are constants that we can choose. Adding these expressions, we get:
- <math> \int_{a}^{b} f(x) dx = f(b)(b-\beta)+f(c)(\beta-\alpha)+f(a)(\alpha-a) - \int_{a}^{c} f'(x)(x-\alpha) dx - \int_{c}^{b} f'(x)(x-\beta) dx</math>
Let us choose α and β so as to recover Simpson's rule:
- <math>\alpha = \frac{b+5a}{6}, \beta = \frac{5b+a}{6}</math>
and defining the function <math>P_y(x)</math> by:
- <math>P_y(x)=\begin{cases} x-\alpha, & \mbox{if }a\le x \le c \\ x-\beta, & \mbox{if }c < x \le b \end{cases}</math>
we have
- <math> \int_{a}^{b} f(x) dx = S(f) - \int_{a}^{b}f'(x)P_y(x)dx</math>
where
- <math>\int_{a}^{b}f'(x)P_y(x)dx</math>
is the error term of the approximation.
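This identity can be checked numerically. In the sketch below (the helper names are ours, and the error integral is evaluated with a crude midpoint sum), both sides agree for f = sin on [0, π], whose exact integral is 2:

```python
import math

def simpson(f, a, b):
    """Simpson's rule S(f) on [a, b]."""
    c = (a + b) / 2.0
    return (b - a) / 6.0 * (f(a) + 4.0 * f(c) + f(b))

def error_integral(fprime, a, b, n=100_000):
    """Midpoint-sum evaluation of the error term: integral of f'(x) * P_y(x) over [a, b]."""
    c = (a + b) / 2.0
    alpha = (b + 5.0 * a) / 6.0
    beta = (5.0 * b + a) / 6.0
    dx = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + 0.5) * dx
        p_y = (x - alpha) if x <= c else (x - beta)
        total += fprime(x) * p_y * dx
    return total

a, b = 0.0, math.pi
print(simpson(math.sin, a, b) - error_integral(math.cos, a, b))  # about 2.0
print(2.0)  # exact integral of sin over [0, pi]
```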