Theorem 3.3.1 is a statement of Taylor's theorem, expressing a sufficiently smooth function as the sum of a polynomial and a remainder term.
Functions for which the remainder term goes to zero for all $x$ in some interval about the expansion point are essentially given by an "infinite polynomial" or, in terms of Chapter 8, by an infinite series. Thus, a function with an appropriately behaved remainder has a power series representation, and this series is called a Taylor series.
When the expansion point is $x = 0$, the power series representation of $f$ is sometimes called a Maclaurin series, but some authors simplify the terminology and use just the term "Taylor series" for all convergent power series.
Thus, if a power series converges to $f(x)$, then that series is the Taylor series of $f$. But given an arbitrary function $f$, even one for which all derivatives exist, the expansion (called the formal Taylor expansion)
$\sum_{n=0}^{\infty} \frac{f^{(n)}(c)}{n!}\,(x-c)^{n}$
which is formed by the "Taylor series recipe" may not converge to $f(x)$. (See Example 8.5.1.)
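Example 8.5.1 is not reproduced here, but the standard example of this failure is the smooth function $f(x) = e^{-1/x^2}$ (with $f(0) = 0$): every derivative of $f$ at $0$ equals $0$, so the formal Maclaurin series is identically zero, yet $f(x) > 0$ for every $x \neq 0$. A minimal numerical sketch (the name `bump` and the sample points are illustrative, not from the text):

```python
import math

def bump(x):
    # Smooth on all of R, with every derivative equal to 0 at x = 0,
    # so its formal Maclaurin series is the zero series.
    return math.exp(-1.0 / (x * x)) if x != 0 else 0.0

formal_series_value = 0.0  # every Taylor coefficient at c = 0 vanishes
for x in (0.1, 0.5, 1.0):
    # The function and its formal Taylor series disagree at each x != 0.
    print(x, bump(x), formal_series_value)
```

Here `bump(0.5)` is $e^{-4} \approx 0.0183$, while the formal series sums to $0$ there: the recipe produces a convergent series, just not one that converges to $f$.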
Once again, a convergent power series is the Taylor series for the limit function, but the formal Taylor expansion of $f$ may not be the power series representation of $f$, because $f$ may not have a power series representation.
Section 8.5 deals with functions that indeed have a Taylor series representation. Determining which functions actually have a power series (and hence a Taylor series) representation is no small matter. The most satisfying answers to this question are given for functions of a complex variable, that is, for functions $f(z)$, where $z = x + iy$. For such functions, if one derivative exists in a neighborhood, all derivatives exist and the Taylor expansion actually represents the function. But for functions of the real variable $x$, the situation is not so sanguine. Real functions can have just a finite number of derivatives and no more. Moreover, even functions with all derivatives may not have a Taylor series that converges back to the function, as is the case with the example function in Example 8.5.1.
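The influence of the complex variable can be made concrete with a standard example (not from the text): $f(x) = 1/(1+x^2)$ is infinitely differentiable on all of $\mathbb{R}$, yet its Maclaurin series $\sum_{n=0}^{\infty} (-1)^{n} x^{2n}$ converges only for $|x| < 1$, because the complex extension $1/(1+z^2)$ has poles at $z = \pm i$, at distance $1$ from the origin. A quick numerical check:

```python
def partial_sum(x, big_n):
    # Partial sums of the Maclaurin series of 1/(1 + x^2):
    # sum of (-1)^n * x^(2n) for n = 0 .. big_n
    return sum((-1) ** n * x ** (2 * n) for n in range(big_n + 1))

def f(x):
    return 1.0 / (1.0 + x * x)

# Inside |x| < 1 the partial sums converge to f(x):
print(partial_sum(0.5, 50), f(0.5))   # both ≈ 0.8
# Outside |x| > 1 they blow up, even though f itself is smooth there:
print(partial_sum(2.0, 10), f(2.0))   # 838861.0 vs 0.2
```

Nothing on the real line signals trouble at $x = \pm 1$; only the complex singularities at $\pm i$ explain the finite radius of convergence.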
If it can be shown that the Taylor-expansion remainder
$\hat{R}_{n}(x) = \frac{f^{(n+1)}(x)}{(n+1)!}\,(x-a)^{n+1}$
(not just $R_{n}(x) = \frac{f^{(n+1)}(c)}{(n+1)!}\,(x-a)^{n+1}$ from Theorem 3.3.1) goes to zero as $n \to \infty$, then the formal Taylor series of $f(x)$ does indeed converge to, i.e., represent, $f(x)$. But it is no easy matter to show this for an arbitrary function, essentially because of the need to have either a representation of $f^{(n)}(x)$ or an estimate of how these derivatives behave as $n \to \infty$. In the examples below, this is done for a few of the elementary functions, but in general, determining whether or not a function has a Taylor series representation relies heavily on the theory of complex variables.
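For $f(x) = e^x$ this program is carried out easily, since every derivative is again $e^t$ and hence bounded by $e^{|x|}$ on the interval between the expansion point and $x$; the resulting bound $e^{|x|}\,|x|^{n+1}/(n+1)!$ goes to zero as $n \to \infty$. A numerical sketch (the bound shown is the standard Lagrange estimate for $x > 0$, not a formula from the text):

```python
import math

def taylor_exp(x, n):
    # Degree-n Maclaurin polynomial of e^x
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

x = 2.0
for n in (5, 10, 15, 20):
    error = abs(math.exp(x) - taylor_exp(x, n))
    # Every derivative of exp is exp, so on [0, x] the remainder is at
    # most e^x * x^(n+1) / (n+1)!; this bound tends to 0 as n grows.
    bound = math.exp(x) * x ** (n + 1) / math.factorial(n + 1)
    print(n, error, bound)
```

The printed errors sit below the printed bounds at every $n$, and both shrink rapidly, which is exactly the behavior that guarantees the Taylor series of $e^x$ represents the function.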