In mathematics, we often find ourselves wanting to add up more terms than we're willing to write down. For instance, if we wanted to add up all the numbers from 1 to 100, we could write:
1 + 2 + 3 + 4 + 5 + 6 + 7 + 8 + 9 + 10 + 11 + 12 + 13 + 14 + 15 + 16 + 17 + 18 + 19 + 20 + 21 + 22 + 23 + 24 + 25 + 26 + 27 + 28 + 29 + 30 + 31 + 32 + 33 + 34 + 35 + 36 + 37 + 38 + 39 + 40 + 41 + 42 + 43 + 44 + 45 + 46 + 47 + 48 + 49 + 50 + 51 + 52 + 53 + 54 + 55 + 56 + 57 + 58 + 59 + 60 + 61 + 62 + 63 + 64 + 65 + 66 + 67 + 68 + 69 + 70 + 71 + 72 + 73 + 74 + 75 + 76 + 77 + 78 + 79 + 80 + 81 + 82 + 83 + 84 + 85 + 86 + 87 + 88 + 89 + 90 + 91 + 92 + 93 + 94 + 95 + 96 + 97 + 98 + 99 + 100,
but that's tediously long to write down or work with. Or we could write:
1 + 2 + 3 + ... + 98 + 99 + 100,
but expressions like these require a bit of human intuition to determine what goes in place of the ellipsis. For instance, if we wanted to add up numbers of the form .4 + .0048*k for values of k between 0 and 99, we would write:
.4 + .4048 + .4096 + ... + .8656 + .8704 + .8752.
In this case it's not immediately clear what goes in place of the ellipsis. We want something precise and formal so we can do math with it. Our notation needs a formula for the ith term, and somewhere we can write how many terms we're working with.
Standard mathematical notation is $\sum_{i=1}^{n} f(i)$. This notation means to plug each of the values from 1 to n in for i into f(i), and add up the results.
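For example, the sum from before, the numbers .4 + .0048*k for k from 0 to 99, becomes

$$\sum_{k=0}^{99} \left( .4 + .0048\,k \right),$$

with no ellipsis left to interpret: the formula for each term and the range of values are both written down explicitly.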
Play around with the summation interactive below. Start by typing in a specific formula for f(i). You can also change the lower bound (where it says i=). As with most things in math, you can also write summations with a variable other than i.
[Interactive: a summation calculator — enter a formula for f(i) and choose the bounds]
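If you'd rather see the notation as computation, here is a minimal sketch in plain Python (the function name summation and the example calls are my own illustration, not part of the page): a loop that plugs each value of i into f and accumulates the results.

```python
# Direct translation of sigma notation: add up f(i) for i = lower, lower+1, ..., upper.
def summation(f, lower, upper):
    total = 0
    for i in range(lower, upper + 1):  # the upper bound is inclusive, just like the notation
        total += f(i)
    return total

# The sum 1 + 2 + ... + 100:
print(summation(lambda i: i, 1, 100))               # 5050

# The earlier example, .4 + .0048*k for k from 0 to 99:
print(summation(lambda k: .4 + .0048 * k, 0, 99))   # about 63.76 (floating-point rounding aside)
```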
Of course the power of summation notation isn't in its ability to write down sums that we could just evaluate by hand or on a computer. Summation notation shines when we want to evaluate a sum of a grossly enormous number of terms (quadrillions or more), or when we don't know ahead of time how many terms we want to add up. For instance, we can find a nice formula for the sum of the first n squares or the sum of the first n cubes: simple polynomials that we can evaluate quickly without having to actually compute the first n squares or cubes and add them up. We can also use our understanding of polynomials to study the behavior of these sums: What sorts of numbers can these sums give us? How do they grow as n gets larger and larger?
The question of turning a summation into a formula without a summation in it (a "closed formula") is really important to mathematicians because of how much easier it is to evaluate these nice summation-less formulas and study their mathematical properties. But how can we do this?
Our technique will be similar to our technique for evaluating derivatives: we'll learn how to handle some basic pieces, then learn how to handle what happens when those pieces are put together.
Just as with derivatives, let's start by looking at powers of our variable (for derivatives it was probably x, and in this case it's i, but the choice of variable doesn't really make a difference, as long as we keep track of which variable our sum is plugging values in for).
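For the first few powers, the standard closed formulas are:

$$\sum_{i=1}^{n} 1 = n, \qquad \sum_{i=1}^{n} i = \frac{n(n+1)}{2}, \qquad \sum_{i=1}^{n} i^2 = \frac{n(n+1)(2n+1)}{6}, \qquad \sum_{i=1}^{n} i^3 = \left( \frac{n(n+1)}{2} \right)^2.$$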
Note that if we want to calculate the sum of the first [some absurdly giant number] squares or cubes or such, we can just plug that absurdly giant number in for n, turning an absurdly long sum into a single evaluation of a much more manageable formula.
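For example, the sum of the first 1,000,000 whole numbers is

$$\sum_{i=1}^{1000000} i = \frac{1000000 \cdot 1000001}{2} = 500000500000,$$

computed without ever adding up a million terms.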
It's really tempting to try to see a pattern in the summation formulas above, and in fact there is one. You can read more about it on Wikipedia or play around with sums of powers here. Alas, what we'd really want is a nice simple (closed) formula that works for any power, but no such formula exists.
If we switch things around and let i be the exponent instead of the base, we get the following formulas:
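For example, taking bases 2 and 3 (with the sum still running from 1 to n):

$$\sum_{i=1}^{n} 2^i = 2^{n+1} - 2, \qquad \sum_{i=1}^{n} 3^i = \frac{3^{n+1} - 3}{2}.$$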
For these sums there is a nice closed formula that works for any base (except 1, but we already know how to add up powers of 1). You can read more about it on Wikipedia.
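Concretely, for any base r ≠ 1,

$$\sum_{i=1}^{n} r^i = \frac{r^{n+1} - r}{r - 1}.$$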
The formulas above are most of what you need, but below are a few other neat sum formulas:
Just like with derivatives, you can pull out a constant from under a summation. If you unpack the sum, it's just factoring out a common factor. Note that in order to factor it out, the constant must not depend on the variable i; otherwise it won't be the same factor from summand to summand.
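Written out, the rule for a constant c (anything that doesn't involve i) is:

$$\sum_{i=1}^{n} c\,f(i) = c\,f(1) + c\,f(2) + \cdots + c\,f(n) = c\,\bigl(f(1) + f(2) + \cdots + f(n)\bigr) = c \sum_{i=1}^{n} f(i).$$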
[Interactive: pulling a constant multiplier out of a sum]
We can pull expressions that don't depend on i out of our sums (such as a, n, and 3n²), since they're the same in each summand:
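For example, with the multiplier 3n²:

$$\sum_{i=1}^{n} 3n^2\, i = 3n^2 \sum_{i=1}^{n} i = 3n^2 \cdot \frac{n(n+1)}{2} = \frac{3n^3(n+1)}{2}.$$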
Below you can experiment with a slightly more general interactive than the one above. Try entering constant multipliers that contain variables like a, n, or i into the textbox, and see whether they can or cannot be pulled out front of the summation.
[Interactive: test which multipliers can and cannot be pulled out of a summation]
You can also break apart summations of sums. In this case, if we unpack the sum, we're just reordering our summands.
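In symbols:

$$\sum_{i=1}^{n} \bigl( f(i) + g(i) \bigr) = \sum_{i=1}^{n} f(i) + \sum_{i=1}^{n} g(i),$$

since adding f(1) + g(1) + f(2) + g(2) + ⋯ + f(n) + g(n) in that order gives the same total as adding all the f's first and then all the g's.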
[Interactive: splitting a sum of two terms into two separate sums]
We can combine these rules to calculate summations of polynomials in i:
[Interactive: building a closed formula for a polynomial in i, with a selectable degree]
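Here is a worked example, with a degree-2 polynomial chosen just for illustration:

$$\sum_{i=1}^{n} \left( 2i^2 + 3i + 1 \right) = 2 \sum_{i=1}^{n} i^2 + 3 \sum_{i=1}^{n} i + \sum_{i=1}^{n} 1 = 2 \cdot \frac{n(n+1)(2n+1)}{6} + 3 \cdot \frac{n(n+1)}{2} + n = \frac{4n^3 + 15n^2 + 17n}{6}.$$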
The last step follows by some gross but straightforward algebra.
Together, these two properties (being able to pull out constants and to break up sums) show up very frequently throughout mathematics. There is an entire branch of mathematics, called linear algebra, dedicated to studying these properties.
So far, we've been focusing on sums from 1 to n, and manipulating the inside of the sum. But we have lots of options for the bounds. How can we deal with them?
We have a very simple technique for modifying the upper bound of a summation:
If an equation is true for any n (or x, or whatever variable), we can replace n everywhere in the equation by whatever we want and get a new true equation. This does not apply if:
- The equation is true only for some n, such as if we're solving for a particular n. In this case we can only replace n by an expression whose value actually makes the equation true.
- We're trying to replace n with the name of a function, or some other wrong type of thing.
This is true throughout mathematics, but let's see how it can help us with summations. Enter an upper bound below:
[Interactive: substituting a new upper bound into a summation formula]
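For example, the formula $\sum_{i=1}^{n} i = \frac{n(n+1)}{2}$ is true for any n, so replacing n everywhere by 2m gives

$$\sum_{i=1}^{2m} i = \frac{2m(2m+1)}{2} = m(2m+1),$$

a closed formula for the sum of the first 2m numbers.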
There is also a really helpful relationship if we change the upper bound just by 1:
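$$\sum_{i=1}^{n+1} f(i) = \left( \sum_{i=1}^{n} f(i) \right) + f(n+1),$$

since the sum up to n+1 is just the sum up to n with one extra term tacked on the end.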
[Interactive: relating the sum up to n and the sum up to n+1]
Check that this interactive and the previous one agree on the formulas they give you for the sum from i=1 to n+1 for various summands.
Similarly, we can change the lower bound to 0. Some mathematicians like to count starting at 0, and some summation formulas are nicer if we start from 0, so it is very common to see sums starting at 0.
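Going from a lower bound of 1 to a lower bound of 0 just adds the i = 0 term:

$$\sum_{i=0}^{n} f(i) = f(0) + \sum_{i=1}^{n} f(i).$$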
[Interactive: sums with lower bound 0]
If we want to have some other lower bound, the following equation is helpful (for 1 ≤ m ≤ n):

$$\sum_{i=m}^{n} f(i) = \sum_{i=1}^{n} f(i) - \sum_{i=1}^{m-1} f(i).$$

In other words, to add up the terms from the m-th through the n-th, we can add up the first n terms and then subtract off the first m−1 terms.
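For example,

$$\sum_{i=50}^{100} i = \sum_{i=1}^{100} i - \sum_{i=1}^{49} i = 5050 - 1225 = 3825.$$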
I currently don't have the technology to evaluate summations locally, but you can use the input box below to ask the Sage Cell Server if it can produce a closed formula for a given sum. Simply replace the "binomial(n,i)" with whatever you like and click the "send" button. Sage supports:
⚠ Caution: Sage is very picky about multiplication. Write "x y" or "x*y" instead of "xy" to mean x times y.
[Interactive: Sage Cell Server input box for requesting closed formulas]
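If you'd rather experiment on your own machine, here is a small sketch using the SymPy library (my suggested stand-in, not something this page is built on) that produces closed formulas for many sums:

```python
from sympy import symbols, summation, binomial, factor

i, n = symbols("i n", integer=True, positive=True)

# Sum of the first n squares; factoring gives the familiar n*(n + 1)*(2*n + 1)/6.
print(factor(summation(i**2, (i, 1, n))))

# A geometric sum; the closed form is 2**(n + 1) - 2.
print(summation(2**i, (i, 1, n)))

# The default example from the Sage cell above: the binomial coefficients in row n sum to 2**n.
print(summation(binomial(n, i), (i, 0, n)))
```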
Or, if you want to really get into the underlying theory of closed-form formulas for summations, the book A=B by Petkovšek, Wilf, and Zeilberger goes into nearly as much detail as one could possibly hope for.