Power Series
Given a sequence \(a_n\), the power series associated with \(a_n\) is the infinite sum
\[
\begin{aligned}
f(x) = \sum_{n=0}^\infty a_n x^n = a_0 + a_1 x + a_2 x^2 + \cdots
\end{aligned}
\]
Interval and radius of convergence
Given a power series, there exists some \(0 \le R \le \infty\) such that
- the series converges absolutely if \(\lvert x \rvert < R\)
- the series diverges if \(\lvert x \rvert > R\)
- the series might converge or diverge when \(x = R\) or \(x = -R\)
The interval of convergence is the interval \((-R, R)\), possibly with one or both endpoints included (in general, each endpoint must be checked individually). The radius of convergence \(R\) is half the length of the interval of convergence.
To find the interval of convergence, use the ratio test or the root test to find the interval where the series converges absolutely, and then check the endpoints. For example, with the ratio test,
\[
\begin{aligned}
\rho &= \lim_{n \rightarrow \infty} \frac{\lvert a_{n+1}x^{n+1} \rvert}{\lvert a_n x^n \rvert} = \lim_{n \rightarrow \infty} \frac{\lvert a_{n+1} \rvert}{\lvert a_n \rvert} \lvert x \rvert < 1 \\
\lvert x \rvert &< \lim_{n \rightarrow \infty} \frac{\lvert a_n \rvert}{\lvert a_{n+1} \rvert} = R \hskip2em \text{(radius of convergence)}
\end{aligned}
\]
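As a concrete check (a numerical sketch added here, not part of the original notes), the coefficient ratio \(\lvert a_n \rvert / \lvert a_{n+1} \rvert\) for the series \(\sum_{n \ge 1} x^n / n\) approaches its known radius of convergence \(R = 1\):

```python
# Numerical sketch of the ratio test: estimate the radius of convergence
# R = lim |a_n / a_{n+1}| for the series sum_{n>=1} x^n / n (known R = 1).
from fractions import Fraction

def coeff(n):
    # a_n = 1/n for n >= 1
    return Fraction(1, n)

ratios = [coeff(n) / coeff(n + 1) for n in (10, 100, 1000)]
print([float(r) for r in ratios])  # -> [1.1, 1.01, 1.001], approaching R = 1
```

At the endpoints, \(x = -1\) gives the convergent alternating harmonic series while \(x = 1\) gives the divergent harmonic series, illustrating why endpoints must be checked separately.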
Shifted power series
\[
\begin{aligned}
f(x) &= a_0 + a_1 (x - c) + a_2 (x - c)^2 + \cdots \\
\lvert x - c\rvert &< \lim_{n \rightarrow \infty} \frac{\lvert a_n \rvert}{\lvert a_{n+1} \rvert} = R
\end{aligned}
\]
Interval of convergence: \((c - R, c + R)\). Such a power series is said to be centered at \(c\), since the interval of convergence for the series will be centered at \(c\).
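For instance (an illustrative sketch, not from the original notes), the series \(\sum_{n \ge 0} (x-2)^n / 3^n\) is centered at \(c = 2\) with \(a_n = 1/3^n\), so \(R = 3\) and the interval of convergence is \((-1, 5)\):

```python
# Sketch: sum_{n>=0} (x - 2)^n / 3^n is geometric in r = (x - 2)/3, so it
# converges exactly when |x - 2| < 3, i.e. on (c - R, c + R) = (-1, 5),
# and there it sums to 1 / (1 - r) = 3 / (5 - x).
def partial_sum(x, N=2000):
    r = (x - 2) / 3
    return sum(r ** n for n in range(N))

# Inside the interval the partial sums settle to the closed form:
print(partial_sum(4.0))  # near 3 / (5 - 4) = 3
print(partial_sum(0.0))  # near 3 / (5 - 0) = 0.6
```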
Taylor series convergence
For what values of \(x\) does a function's Taylor series converge?
The series:
\[
\begin{aligned}
\sum_{n=0}^\infty a_n x^n &= \sum_{n=0}^\infty \frac{1}{n!} f^{(n)}(0) \, x^n
\end{aligned}
\]
converges absolutely for \(\lvert x \rvert < R \), where
\[
\begin{aligned}
R = \lim_{n \rightarrow \infty} \frac{\lvert a_n \rvert}{\lvert a_{n+1} \rvert}
= \lim_{n \rightarrow \infty} \frac{\lvert f^{(n)}(0) \rvert }{\lvert f^{(n+1)}(0) \rvert} \cdot \frac{(n+1)!}{n!}
= \lim_{n \rightarrow \infty} (n+1) \cdot \frac{\lvert f^{(n)}(0) \rvert }{\lvert f^{(n+1)}(0) \rvert}
\end{aligned}
\]
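As a worked case (a numerical sketch added here), for \(f(x) = e^x\) every derivative at \(0\) equals \(1\), so the formula gives \(R = \lim (n+1) = \infty\) and the Taylor series converges for every \(x\):

```python
import math

# For f(x) = e^x, f^(n)(0) = 1 for all n, so R = lim (n+1) = infinity:
# the Maclaurin series converges everywhere.  Check partial sums at x = 5.
def exp_partial(x, N):
    return sum(x ** n / math.factorial(n) for n in range(N + 1))

print(exp_partial(5.0, 30), math.exp(5.0))  # the two values agree closely
```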
Within the interval of convergence, a power series behaves nicely under differentiation and integration: both can be done term by term.
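A standard consequence (sketched here numerically, not from the original notes): differentiating the geometric series \(\sum_{n \ge 0} x^n = 1/(1-x)\) term by term on \((-1, 1)\) gives \(\sum_{n \ge 1} n x^{n-1} = 1/(1-x)^2\):

```python
# Term-by-term differentiation of the geometric series: inside (-1, 1),
# sum_{n>=1} n * x^(n-1) should equal 1 / (1 - x)^2.
def geometric_derivative(x, N=5000):
    return sum(n * x ** (n - 1) for n in range(1, N))

x = 0.5
print(geometric_derivative(x), 1 / (1 - x) ** 2)  # both near 4.0
```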
When a Taylor series converges, it does not always converge to the function. But when it does, the function is referred to as real-analytic. That is, a function \(f\) is real-analytic at \(a\) if the Taylor series for \(f\) converges to \(f\) near \(a\).
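The classic counterexample (added here for illustration; it is a standard fact, not part of the original notes) is \(f(x) = e^{-1/x^2}\) for \(x \ne 0\) with \(f(0) = 0\): every derivative of \(f\) at \(0\) is \(0\), so its Maclaurin series is identically zero and converges everywhere, yet it matches \(f\) only at \(x = 0\):

```python
import math

# Classic non-analytic example: f(x) = exp(-1/x^2) for x != 0, f(0) = 0.
# All derivatives of f at 0 vanish, so the Maclaurin series sums to 0 for
# every x -- it converges, but not to f, except at x = 0.
def f(x):
    return math.exp(-1.0 / x ** 2) if x != 0 else 0.0

taylor_value = 0.0           # the Maclaurin series gives 0 for every x
print(f(0.5), taylor_value)  # f(0.5) = e^(-4) > 0, yet the series gives 0
```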
Error defined
Given a convergent series with sum \(s = \sum_{n=0}^\infty a_n\) and partial sums \(s_k = \sum_{n=0}^k a_n\), the error after \(k\) terms is
\[
\begin{aligned}
E_k &= s - s_k \\
&= \sum_{n=0}^\infty a_n - \sum_{n=0}^k a_n \\
&= \sum_{n=k+1}^\infty a_n
\end{aligned}
\]
Alternating series error bound
For an alternating series whose terms decrease in absolute value to \(0\),
\[
\begin{aligned}
\lvert E_k \rvert \le \lvert a_{k+1} \rvert
\end{aligned}
\]
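To make this concrete (a numerical sketch added here), the alternating harmonic series \(\sum_{n \ge 1} (-1)^{n+1}/n = \ln 2\) has truncation error bounded by the first omitted term:

```python
import math

# Check the alternating series error bound on sum (-1)^(n+1)/n = ln 2:
# truncating after k terms leaves an error of at most the next term, 1/(k+1).
def partial(k):
    return sum((-1) ** (n + 1) / n for n in range(1, k + 1))

k = 100
error = abs(math.log(2) - partial(k))
bound = 1 / (k + 1)
print(error, bound)  # the error stays within the bound
```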
Integral test for error bounds
For \(a_n = f(n)\) with \(f\) positive and decreasing,
\[
\begin{aligned}
\int_{m+1}^\infty f(x)\,dx < \sum_{n=m+1}^\infty f(n) = E_m < \int_{m}^\infty f(x)\,dx
\end{aligned}
\]
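For example (a numerical sketch added here), for \(\sum 1/n^2 = \pi^2/6\) with \(f(x) = 1/x^2\), the bounds become \(1/(m+1) < E_m < 1/m\):

```python
import math

# Integral-test error bounds for sum 1/n^2 = pi^2/6 with f(x) = 1/x^2:
# int_{m+1}^inf f dx = 1/(m+1) and int_m^inf f dx = 1/m, so the tail E_m
# is squeezed between those two values.
m = 50
E_m = math.pi ** 2 / 6 - sum(1 / n ** 2 for n in range(1, m + 1))
print(1 / (m + 1), E_m, 1 / m)  # lower bound < E_m < upper bound
```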
Taylor approximations
\[
\begin{aligned}
f(x) &\approx \sum_{n=0}^N \frac{f^{(n)}(0)}{n!}\,x^n \\
f(x) &= \sum_{n=0}^N \frac{f^{(n)}(0)}{n!}\,x^n + E_N(x)
\end{aligned}
\]
Weak bound
\[
\begin{aligned}
\lvert E_N(x) \rvert \le C\,\lvert x \rvert^{N+1} = O(x^{N+1})
\end{aligned}
\]
for some constant \(C\) and \(x\) sufficiently close to 0.
Taylor Remainder Theorem
\[
\begin{aligned}
E_N(x) = \frac{f^{(N+1)}(t)}{(N+1)!}\,x^{N+1}
\end{aligned}
\]
for some \(t\) with \(\lvert t \rvert \le \lvert x \rvert \), which is hard to use directly since \(t\) cannot be determined.
Taylor error bound
\[
\begin{aligned}
\lvert E_N(x) \rvert \le \frac{\max \lvert f^{(N+1)}(t) \rvert}{(N+1)!}\,\lvert x \rvert^{N+1}
\end{aligned}
\]
where the maximum is taken over \(\lvert t \rvert \le \lvert x \rvert \).
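This bound is easy to apply when the derivatives are uniformly bounded. For example (a numerical sketch added here), every derivative of \(\sin\) is bounded by \(1\), so \(\lvert E_N(x) \rvert \le \lvert x \rvert^{N+1}/(N+1)!\):

```python
import math

# Taylor error bound for sin: all derivatives of sin are bounded by 1, so
# |E_N(x)| <= |x|^(N+1) / (N+1)!.  Compare the actual error of the
# degree-N Maclaurin polynomial at x = 1 with this bound.
def sin_taylor(x, N):
    # Maclaurin polynomial of sin up to degree N (odd terms only)
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range(N // 2 + 1) if 2 * k + 1 <= N)

x, N = 1.0, 7
actual = abs(math.sin(x) - sin_taylor(x, N))
bound = abs(x) ** (N + 1) / math.factorial(N + 1)
print(actual, bound)  # the actual error is within the bound
```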
# posted by rot13(Unafba Pune) @ 4:02 PM