
{{Short description|Test for convergence of infinite series of monotonic terms}}

[[File:Integral Test.svg|thumb|Since the area under the curve {{math|y {{=}} 1/x}} for {{math|x ∈ {{closed-open|1, ∞}}}} is infinite, the total area of the rectangles must be infinite as well.]]

{{Calculus|Series}}

In mathematics, the '''integral test for convergence''' is a method used to test infinite series of monotonic terms for convergence. It was developed by [[Colin Maclaurin]] and [[Augustin-Louis Cauchy]] and is sometimes known as the '''Maclaurin–Cauchy test'''.

==Statement of the test==

Consider an integer {{math|N}} and a function {{math|f}} defined on the unbounded interval {{closed-open|N, ∞}}, on which it is monotone decreasing. Then the infinite series

:\sum_{n=N}^\infty f(n)

converges to a real number if and only if the improper integral

:\int_N^\infty f(x)\,dx

is finite. In particular, if the integral diverges, then the series diverges as well.

===Remark===

If the improper integral is finite, then the proof also gives the lower and upper bounds

{{NumBlk|:|\int_N^\infty f(x)\,dx\le\sum_{n=N}^\infty f(n)\le f(N)+\int_N^\infty f(x)\,dx|{{EquationRef|1}}}}

for the infinite series.
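As a quick numerical illustration (not part of the original argument), the bounds in ({{EquationNote|1}}) can be checked for {{math|f(x) {{=}} 1/x<sup>2</sup>}} with {{math|N {{=}} 1}}, where the improper integral has the closed form {{math|1/N}}. The following sketch approximates the infinite series by a large partial sum:

```python
import math

def f(x):
    return 1.0 / x ** 2

N = 1
integral = 1.0 / N          # closed form of the improper integral of x^(-2) from N to infinity

# Large partial sum as a stand-in for the infinite series; the neglected
# tail is below 1/M by the integral test itself.
M = 10 ** 6
partial_sum = sum(f(n) for n in range(N, M + 1))

# Bounds (1): integral <= series <= f(N) + integral.
assert integral <= partial_sum <= f(N) + integral
# The series is ζ(2) = π²/6 ≈ 1.6449, comfortably inside [1, 2].
```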

Note that if the function f(x) is increasing, then the function -f(x) is decreasing and the above theorem applies.

Many textbooks require the function {{math|f}} to be positive,<ref>{{cite book |last1=Stewart |first1=James |last2=Clegg |first2=Daniel |last3=Watson |first3=Saleem |title=Calculus: Metric Version |date=2021 |publisher=Cengage |isbn=9780357113462 |edition=9}}</ref><ref>{{cite book |last1=Wade |first1=William |title=An Introduction to Analysis |date=2004 |publisher=Pearson Education |isbn=9780131246836 |edition=3}}</ref><ref>{{cite book |last1=Thomas |first1=George |last2=Hass |first2=Joel |last3=Heil |first3=Christopher |last4=Weir |first4=Maurice |last5=Zuleta |first5=José Luis |title=Thomas' Calculus: Early Transcendentals |date=2018 |publisher=Pearson Education |isbn=9781292253114 |edition=14}}</ref> but this condition is not actually necessary: if f is negative and decreasing, then both \sum_{n=N}^\infty f(n) and \int_N^\infty f(x)\,dx diverge.<ref>{{cite web |url=https://math.stackexchange.com/q/3577379 |title=Why does it have to be positive and decreasing to apply the integral test? |author=savemycalculus |website=Mathematics Stack Exchange |access-date=2020-03-11}}</ref>{{better source needed|date=August 2024}}

==Proof==

The proof uses the comparison test, comparing the term f(n) with the integral of f over the intervals [n-1,n) and [n,n+1) respectively.

The monotonic function f is continuous almost everywhere. To show this, let

:D=\{ x\in [N,\infty)\mid f\text{ is discontinuous at } x\}

For every x\in D, the density of \mathbb Q yields a rational number c(x)\in\left[\lim_{y\downarrow x} f(y), \lim_{y\uparrow x} f(y)\right].

This interval has non-empty interior precisely when f is discontinuous at x. We can uniquely identify c(x) as the rational number with the least index, in some fixed enumeration \mathbb N\to\mathbb Q, satisfying the above property. Since f is monotone, the jump intervals at distinct discontinuities are disjoint, so this defines an injective mapping c:D\to\mathbb Q, x\mapsto c(x), and thus D is countable. It follows that f is continuous almost everywhere, which is sufficient for Riemann integrability.<ref>{{Cite journal| issn = 0002-9890| volume = 43| issue = 7| pages = 396–398| last = Brown| first = A. B.| title = A Proof of the Lebesgue Condition for Riemann Integrability| journal = The American Mathematical Monthly| date = September 1936| jstor = 2301737| doi = 10.2307/2301737}}</ref>

Since {{math|f}} is a monotone decreasing function, we know that

:f(x)\le f(n)\quad\text{for all }x\in[n,\infty)

and

:f(n)\le f(x)\quad\text{for all }x\in[N,n].

Hence, for every integer {{math|n ≥ N}},

{{NumBlk|:|\int_n^{n+1} f(x)\,dx\le\int_{n}^{n+1} f(n)\,dx=f(n)|{{EquationRef|2}}}}

and, for every integer {{math|n ≥ N + 1}},

{{NumBlk|:|f(n)=\int_{n-1}^{n} f(n)\,dx\le\int_{n-1}^n f(x)\,dx.|{{EquationRef|3}}}}

By summation over all {{math|n}} from {{math|N}} to some larger integer {{math|M}}, we get from ({{EquationNote|2}})

:\int_N^{M+1}f(x)\,dx=\sum_{n=N}^M\underbrace{\int_n^{n+1}f(x)\,dx}_{\le\,f(n)}\le\sum_{n=N}^Mf(n)

and from ({{EquationNote|3}})

:\begin{align}
\sum_{n=N}^Mf(n)&=f(N)+\sum_{n=N+1}^Mf(n)\\
&\leq f(N)+\sum_{n=N+1}^M\underbrace{\int_{n-1}^n f(x)\,dx}_{\ge\,f(n)}\\
&=f(N)+\int_N^M f(x)\,dx.
\end{align}

Combining these two estimates yields

:\int_N^{M+1}f(x)\,dx\le\sum_{n=N}^Mf(n)\le f(N)+\int_N^M f(x)\,dx.

Letting {{math|M}} tend to infinity, the bounds in ({{EquationNote|1}}) and the result follow.
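The finite-{{math|M}} sandwich just derived can also be verified numerically. The sketch below (an illustration, not part of the proof) takes {{math|f(x) {{=}} 1/x}} with {{math|N {{=}} 1}}, so that both integrals have closed forms:

```python
import math

# Check  ln(M+1) = integral from 1 to M+1 of dx/x  <=  H_M  <=  1 + ln M
# for the harmonic partial sums H_M and several values of M.
for M in (10, 1_000, 100_000):
    H = sum(1.0 / n for n in range(1, M + 1))
    assert math.log(M + 1) <= H <= 1.0 + math.log(M)
```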

==Applications==

The harmonic series

:\sum_{n=1}^\infty \frac 1 n

diverges because, using the fundamental theorem of calculus and the natural logarithm as an antiderivative of {{math|1/x}}, we get

:\int_1^M \frac 1 n\,dn = \ln n\Bigr|_1^M = \ln M \to\infty \quad\text{for }M\to\infty.
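Numerically, the partial sums indeed track {{math|ln M}}. The sketch below also uses the known asymptotic {{math|H<sub>M</sub> − ln M → γ ≈ 0.5772}} (the Euler–Mascheroni constant), which is not needed for the divergence argument itself:

```python
import math

GAMMA = 0.5772156649  # Euler–Mascheroni constant (approximate)

for M in (10 ** 3, 10 ** 6):
    H = sum(1.0 / n for n in range(1, M + 1))
    # H_M - ln M approaches γ; the error is about 1/(2M).
    assert abs(H - math.log(M) - GAMMA) < 1e-3
```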

On the other hand, the series

:\zeta(1+\varepsilon)=\sum_{n=1}^\infty \frac1{n^{1+\varepsilon}}

(cf. Riemann zeta function)

converges for every {{math|ε > 0}}, because by the power rule

:\int_1^M\frac1{n^{1+\varepsilon}}\,dn = \left. -\frac 1{\varepsilon n^\varepsilon} \right|_1^M = \frac 1 \varepsilon \left(1-\frac 1 {M^\varepsilon}\right) \le \frac 1 \varepsilon < \infty \quad\text{for all }M\ge1.

From ({{EquationNote|1}}) we get the upper estimate

:\zeta(1+\varepsilon)=\sum_{n=1}^\infty \frac 1 {n^{1+\varepsilon}} \le \frac{1 + \varepsilon}\varepsilon,

which can be compared with particular values of the Riemann zeta function.
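This estimate can be spot-checked numerically. The sketch below approximates {{math|ζ(1 + ε)}} from above by a partial sum plus the integral tail bound:

```python
import math

for eps in (1.0, 0.5, 0.1):
    M = 100_000
    partial = sum(n ** -(1.0 + eps) for n in range(1, M + 1))
    tail = M ** -eps / eps    # integral from M to infinity of x^(-(1+eps)) bounds the tail
    upper = partial + tail    # upper estimate of ζ(1+ε)
    assert upper <= (1.0 + eps) / eps + 1e-9
```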

==Borderline between divergence and convergence==

The above examples involving the harmonic series raise the question of whether there are monotone sequences such that {{math|f(n)}} decreases to 0 faster than {{math|1/n}} but slower than {{math|1/n<sup>1+ε</sup>}} in the sense that

:\lim_{n\to\infty}\frac{f(n)}{1/n}=0 \quad\text{and}\quad \lim_{n\to\infty}\frac{f(n)}{1/n^{1+\varepsilon}}=\infty

for every {{math|ε > 0}}, and whether the corresponding series of the {{math|f(n)}} still diverges. Once such a sequence is found, a similar question can be asked with {{math|f(n)}} taking the role of {{math|1/n}}, and so on. In this way it is possible to investigate the borderline between divergence and convergence of infinite series.

Using the integral test for convergence, one can show (see below) that, for every natural number {{math|k}}, the series

{{NumBlk|:|\sum_{n=N_k}^\infty\frac1{n\ln(n)\ln_2(n)\cdots \ln_{k-1}(n)\ln_k(n)}|{{EquationRef|4}}}}

still diverges (cf. proof that the sum of the reciprocals of the primes diverges for {{math|k {{=}} 1}}) but

{{NumBlk|:|\sum_{n=N_k}^\infty\frac1{n\ln(n)\ln_2(n)\cdots\ln_{k-1}(n)(\ln_k(n))^{1+\varepsilon}}|{{EquationRef|5}}}}

converges for every {{math|ε > 0}}. Here {{math|ln<sub>k</sub>}} denotes the {{math|k}}-fold composition of the natural logarithm, defined recursively by

:\ln_k(x)=\begin{cases}\ln(x)&\text{for }k=1,\\\ln(\ln_{k-1}(x))&\text{for }k\ge2.\end{cases}

Furthermore, {{math|N<sub>k</sub>}} denotes the smallest natural number such that the {{math|k}}-fold composition is well-defined and {{math|ln<sub>k</sub>(N<sub>k</sub>) ≥ 1}}, i.e.

:N_k\ge \underbrace{e^{e^{\cdot^{\cdot^{e}}}}}_{k\ e'\text{s}}=e \uparrow\uparrow k

using tetration or Knuth's up-arrow notation.
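To get a feel for how fast {{math|N<sub>k</sub>}} grows, the following sketch evaluates {{math|e ↑↑ k}} for small {{math|k}}; already at {{math|k {{=}} 4}} the value overflows a double-precision float:

```python
import math

def tetrate_e(k):
    """e tetrated k times: the k-fold tower e^(e^(...^e))."""
    value = 1.0
    for _ in range(k):
        value = math.exp(value)
    return value

# e↑↑1 = e ≈ 2.72,  e↑↑2 = e^e ≈ 15.15,  e↑↑3 ≈ 3.8 million.
print([round(tetrate_e(k), 2) for k in (1, 2, 3)])
```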

To see the divergence of the series ({{EquationNote|4}}) using the integral test, note that by repeated application of the chain rule

:\frac{d}{dx}\ln_{k+1}(x) =\frac{d}{dx}\ln(\ln_k(x)) =\frac1{\ln_k(x)}\frac{d}{dx}\ln_k(x) =\cdots =\frac1{x\ln(x)\cdots\ln_k(x)},

hence

:\int_{N_k}^\infty\frac{dx}{x\ln(x)\cdots\ln_k(x)} =\ln_{k+1}(x)\bigr|_{N_k}^\infty=\infty.

To see the convergence of the series ({{EquationNote|5}}), note that by the power rule, the chain rule and the above result

:-\frac{d}{dx}\frac1{\varepsilon(\ln_k(x))^\varepsilon} =\frac1{(\ln_k(x))^{1+\varepsilon}}\frac{d}{dx}\ln_k(x) =\cdots =\frac{1}{x\ln(x)\cdots\ln_{k-1}(x)(\ln_k(x))^{1+\varepsilon}},

hence

:\int_{N_k}^\infty\frac{dx}{x\ln(x)\cdots\ln_{k-1}(x)(\ln_k(x))^{1+\varepsilon}} =-\frac1{\varepsilon(\ln_k(x))^\varepsilon}\biggr|_{N_k}^\infty<\infty

and ({{EquationNote|1}}) gives bounds for the infinite series in ({{EquationNote|5}}).
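Concretely, for {{math|k {{=}} 1}} and {{math|ε {{=}} 1}} the bounds in ({{EquationNote|1}}) say that the sum of {{math|1/(n (ln n)<sup>2</sup>)}} over {{math|n ≥ 3}} lies between {{math|1/ln 3 ≈ 0.91}} and {{math|1/ln 3 + 1/(3 (ln 3)<sup>2</sup>) ≈ 1.19}}. The sketch below checks this, and shows how slowly the partial sums converge:

```python
import math

f = lambda n: 1.0 / (n * math.log(n) ** 2)
M = 10 ** 6
partial = sum(f(n) for n in range(3, M + 1))

lower = 1.0 / math.log(3) - 1.0 / math.log(M + 1)  # integral from 3 to M+1 of dx/(x (ln x)^2)
upper = f(3) + 1.0 / math.log(3)                   # f(3) plus the full improper integral
assert lower <= partial <= upper
# Even after 10^6 terms the neglected tail is still about 1/ln(M) ≈ 0.07.
```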

==See also==

==References==
{{Reflist}}

{{Calculus topics}}

Category:Augustin-Louis Cauchy

Category:Integral calculus

Category:Convergence tests

Category:Articles containing proofs