# Lecture 03


## Jakob Bernoulli & Convergent Series

An infinite series is the sum of an ordered sequence of numbers, usually given by a formula for the n^{th} term.

Infinite series were the subject of mathematical study by the 1660s.

## Convergence/Divergence

When we take the sum (or product) of infinitely many terms, the result is found to *converge* (approach a limiting value) or *diverge* (fail to do so, e.g. grow without bound).

The mathematician Cauchy stressed the importance of considering the convergent or divergent behaviour of these sums and products of series.

### Numerical Series

A *numerical series* is a sum of numbers:

$$\sum_{n=1}^{\infty} a_n = a_1 + a_2 + a_3 + \cdots$$

### Functional Series

And this is a *functional series*, whose terms are functions of a variable $x$:

$$\sum_{n=1}^{\infty} f_n(x) = f_1(x) + f_2(x) + f_3(x) + \cdots$$

### Power Series

A *power series* has the form:

$$\sum_{n=0}^{\infty} a_n x^n = a_0 + a_1 x + a_2 x^2 + \cdots$$
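For concreteness, here is a standard example of each type (these particular examples are illustrative choices, not necessarily the lecture's own):

```latex
% numerical series: the terms are fixed numbers
\sum_{n=1}^{\infty} \frac{1}{2^n} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 1
% functional series: the terms are functions of x
\sum_{n=1}^{\infty} \frac{\sin(nx)}{n}
% power series: coefficients times powers of x
\sum_{n=0}^{\infty} \frac{x^n}{n!} = 1 + x + \frac{x^2}{2!} + \cdots = e^x
```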

## Geometrical Reasoning About Infinite Series

Consider the series:

$$\sum_{n=1}^{\infty} \frac{1}{2^n} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \cdots = 1$$

In this case, we have the surprising situation that inspection of a geometrical figure leads to a result about infinite series. The following figure, depicting recursive subdivision of the unit square, makes clear why this series converges to 1: halving the square, then halving the remaining half, and so on, eventually exhausts the whole unit of area.

TO DO: insert figure (unit-square-subdivision)
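The geometric picture can be checked numerically; a minimal sketch of the partial sums (the function name is mine):

```python
# Partial sums of 1/2 + 1/4 + 1/8 + ..., the pieces carved out of the unit square.
def geometric_partial_sum(n_terms):
    return sum(1 / 2**k for k in range(1, n_terms + 1))

for n in (1, 2, 4, 8, 16):
    print(n, geometric_partial_sum(n))
# The closed form of the partial sum is 1 - 1/2**n,
# so the gap to 1 halves with every additional term.
```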

## Newton's Binomial Theorem

Isaac Newton arrived at this theorem in the 1660s:

$$(1 + x)^r = 1 + rx + \frac{r(r-1)}{2!}x^2 + \frac{r(r-1)(r-2)}{3!}x^3 + \cdots$$

Let $r = \frac{1}{2}$; then this infinite series (an early example of what we now call a Taylor series) gives one the square root of an arbitrary number near 1, written as $1 + x$ with $|x| < 1$.
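As a sketch of the idea (the helper name is mine), the series for $r = \frac{1}{2}$ can be summed numerically and compared with the true square root:

```python
import math

def binomial_sqrt(x, n_terms=40):
    """Approximate sqrt(1 + x) for |x| < 1 using Newton's binomial series."""
    total = 0.0
    coeff = 1.0  # generalized binomial coefficient C(1/2, n), starting at n = 0
    for n in range(n_terms):
        total += coeff * x**n
        coeff *= (0.5 - n) / (n + 1)  # recurrence: C(1/2, n+1) = C(1/2, n)(1/2 - n)/(n + 1)
    return total

print(binomial_sqrt(0.21), math.sqrt(1.21))  # both ≈ 1.1
```

Because the terms shrink geometrically for $|x| < 1$, a few dozen terms already agree with `math.sqrt` to machine precision.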

TO DO: insert figure (unit-circle-axes)

(unit circle)

Newton used the Binomial Theorem, discussed previously, to evaluate $\sqrt{1 - x^2}$ (the height of the unit circle above the $x$-axis), obtaining an infinite series that he could then integrate term by term:

$$\sqrt{1 - x^2} = 1 - \frac{1}{2}x^2 - \frac{1}{8}x^4 - \frac{1}{16}x^6 - \cdots$$

Newton thus obtained an early estimate of the value of π as the area of the unit circle.
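The term-by-term integration can be imitated numerically. This is a simplified sketch (Newton's own computation was cleverer, using the area of a circular segment plus some geometry): integrating each term of the series for $\sqrt{1-x^2}$ over $[0, 1]$ gives the quarter-circle area $\pi/4$.

```python
def pi_from_circle_series(n_terms):
    """Sum the term-by-term integrals over [0, 1] of the series for sqrt(1 - x**2)."""
    total = 0.0
    coeff = 1.0  # generalized binomial coefficient C(1/2, n)
    for n in range(n_terms):
        # term n of sqrt(1 - x^2) is C(1/2, n)(-1)^n x^(2n); integrating divides by 2n + 1
        total += coeff * (-1)**n / (2 * n + 1)
        coeff *= (0.5 - n) / (n + 1)
    return 4 * total  # quarter-circle area times 4

print(pi_from_circle_series(200000))  # ≈ 3.14159...
```

The endpoint $x = 1$ sits on the edge of the series' region of convergence, so convergence here is slow; many terms are needed for a few digits of π.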

## Harmonic Series

The harmonic series, so named because its denominators correspond to the harmonic overtones of a vibrating string (whose wavelengths are $1, \frac{1}{2}, \frac{1}{3}, \ldots$ of the fundamental), looks like:

$$\sum_{n=1}^{\infty} \frac{1}{n} = 1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \cdots$$

This infinite series was studied by Jakob Bernoulli, whose approach was to take partial sums of the series in order to see whether they approached some limiting value.

- adding 83 terms yields ≈ 5
- adding ≈ 12,400 terms yields ≈ 10
- adding ≈ 272,000,000 terms yields ≈ 20
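Such partial sums are easy to reproduce today; a quick sketch:

```python
# Partial sums H_n = 1 + 1/2 + ... + 1/n of the harmonic series.
def harmonic_sum(n):
    return sum(1 / k for k in range(1, n + 1))

print(harmonic_sum(83))     # ≈ 5.002, the first n for which H_n exceeds 5
print(harmonic_sum(12367))  # ≈ 10.00004, the first n for which H_n exceeds 10
# H_n grows like ln(n) + 0.5772..., so each additional unit of the sum
# costs roughly e ≈ 2.718 times as many terms.
```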

Does this series converge or diverge?

Since $\frac{1}{n} \to 0$ as $n \to \infty$, intuition suggests that the series converges.

However, that is not the case. This series actually *diverges*, but very slowly…

If a series converges, then its n^{th} term must approach 0, but the converse is not true: the harmonic series shows that terms tending to 0 are necessary but not sufficient for convergence.

Leibniz, who was living in Paris on a diplomatic mission in the 1670s, summed the reciprocals of the "triangular numbers"; this sum supplies the key step in the divergence argument below:

TO DO: insert figure (triangular numbers)

Note:

$$\frac{1}{n(n+1)} = \frac{1}{n} - \frac{1}{n+1}$$

This is a "telescoping series" that evaluates to 1 because we can cancel adjacent terms, e.g. $-\frac{1}{2}$ and $+\frac{1}{2}$:

$$\sum_{n=1}^{\infty} \frac{1}{n(n+1)} = \left(1 - \frac{1}{2}\right) + \left(\frac{1}{2} - \frac{1}{3}\right) + \left(\frac{1}{3} - \frac{1}{4}\right) + \cdots = 1$$

Since the $n^{th}$ triangular number is $T_n = \frac{n(n+1)}{2}$, Leibniz's sum of reciprocal triangular numbers follows: $\sum_{n=1}^{\infty} \frac{1}{T_n} = 2$.
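The cancellation can be verified exactly with rational arithmetic; a minimal sketch:

```python
from fractions import Fraction

# The N-term partial sum of 1/(1·2) + 1/(2·3) + ... telescopes to 1 - 1/(N + 1).
def telescoping_partial_sum(n_terms):
    return sum(Fraction(1, n * (n + 1)) for n in range(1, n_terms + 1))

print(telescoping_partial_sum(4))    # 4/5, i.e. 1 - 1/5
print(telescoping_partial_sum(100))  # 100/101
```

Using `Fraction` rather than floats makes the telescoping pattern $1 - \frac{1}{N+1}$ visible exactly, with no rounding noise.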

Jakob Bernoulli, working in Basel, Switzerland, published his results in 1684 in the journal *Acta Eruditorum*. He began from the telescoping sum:

$$\frac{1}{2} + \frac{1}{6} + \frac{1}{12} + \frac{1}{20} + \cdots = 1$$

By subtracting $\frac{1}{2}$ from both sides of the equation:

$$\frac{1}{6} + \frac{1}{12} + \frac{1}{20} + \frac{1}{30} + \cdots = \frac{1}{2}$$

By subtracting $\frac{1}{6}$ from both sides of the equation:

$$\frac{1}{12} + \frac{1}{20} + \frac{1}{30} + \cdots = \frac{1}{2} - \frac{1}{6} = \frac{1}{3}$$

By subtracting $\frac{1}{12}$ from both sides of the equation:

$$\frac{1}{20} + \frac{1}{30} + \frac{1}{42} + \cdots = \frac{1}{3} - \frac{1}{12} = \frac{1}{4}$$

etc.

Now stack these equations and add the leftmost columns; the fraction $\frac{1}{n(n+1)}$ appears in $n$ of the rows:

$$\frac{1}{2} + \frac{2}{6} + \frac{3}{12} + \frac{4}{20} + \cdots = \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \frac{1}{5} + \cdots$$

Now add the rightmost columns:

$$1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \cdots$$

This implies that the sum $A = \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \cdots$ satisfies $A = 1 + A$ (adding 1 to $A$ is still $A$), which is impossible for any finite value. We conclude that the harmonic series is divergent.

Today we would prove divergence by examining the partial sums and showing that they grow without bound rather than approaching a limiting value.
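One standard modern argument is Oresme's grouping bound $H_{2^k} \ge 1 + \frac{k}{2}$: each block $\frac{1}{2^j + 1} + \cdots + \frac{1}{2^{j+1}}$ contains $2^j$ terms, each at least $\frac{1}{2^{j+1}}$, so each block contributes at least $\frac{1}{2}$. A numerical check of the bound (sketch):

```python
# Verify Oresme's bound H_(2^k) >= 1 + k/2 for small k; since the right-hand
# side grows without limit, so do the partial sums of the harmonic series.
def harmonic(n):
    return sum(1 / i for i in range(1, n + 1))

for k in range(1, 15):
    assert harmonic(2**k) >= 1 + k / 2
print("Oresme bound holds for k = 1..14")
```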

### The Basel Problem

Here is an example of a convergent series:

$$S = \sum_{n=1}^{\infty} \frac{1}{n^2} = 1 + \frac{1}{4} + \frac{1}{9} + \frac{1}{16} + \cdots$$

Jakob Bernoulli asked the question: what is S? This was a major unsolved problem in early 18^{th} century mathematics.

Leonhard Euler, who received private mathematics lessons from Jakob's younger brother Johann Bernoulli, achieved as his first major result the solution to this problem:

$$S = \sum_{n=1}^{\infty} \frac{1}{n^2} = \frac{\pi^2}{6}$$
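Euler's value $\frac{\pi^2}{6} \approx 1.6449$ can be checked against the partial sums; a short sketch:

```python
import math

# Partial sums of the Basel series 1 + 1/4 + 1/9 + ... versus pi^2 / 6.
def basel_partial_sum(n_terms):
    return sum(1 / k**2 for k in range(1, n_terms + 1))

target = math.pi**2 / 6
for n in (10, 100, 10000):
    print(n, basel_partial_sum(n), target - basel_partial_sum(n))
# The tail beyond n terms is about 1/n, so convergence is slow:
# 10,000 terms give only about four correct decimal places.
```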

The subject of the first course reading is the method by which Euler solved this problem.