# 15.5: Formal Power Series

• Eric Lehman, F. Thomson Leighton, & Albert R. Meyer
• Google and Massachusetts Institute of Technology via MIT OpenCourseWare

## Divergent Generating Functions

Let $$F(x)$$ be the generating function for $$n!$$, that is,

$\nonumber F(x) ::= 1 + 1x + 2x^2 + \cdots + n!x^n + \cdots.$

Because $$x^n = o(n!)$$ for all $$x \neq 0$$, this generating function converges only at $$x = 0$$.³

Next, let $$H(x)$$ be the generating function for $$n \cdot n!$$, that is,

$\nonumber H(x) ::= 0 + 1x + 4x^2 + \cdots + n \cdot n! x^n\ + \cdots.$

Again, $$H(x)$$ converges only for $$x = 0$$, so $$H(x)$$ and $$F(x)$$ describe the same, trivial, partial function on the reals.

On the other hand, $$F(x)$$ and $$H(x)$$ have different coefficients for all powers of $$x$$ greater than 1, and we can usefully distinguish them as formal, symbolic objects.

To illustrate this, note that by subtracting 1 from $$F(x)$$ and then dividing each of the remaining terms by $$x$$, we get a series where the coefficient of $$x^n$$ is $$(n+1)!$$. That is,

$\label{15.5.1} [x^n]\left( \dfrac{F(x) - 1}{x}\right) = (n+1)!.$
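
This coefficient shift is easy to check numerically. Here is a small Python sketch (not part of the original text) that represents a series simply by its list of coefficients:

```python
from math import factorial

# Coefficients of F(x): the n-th coefficient is n!
F = [factorial(n) for n in range(8)]

# Subtracting 1 removes the constant term; dividing by x then
# shifts every remaining coefficient down one position.
shifted = F[1:]  # coefficients of (F(x) - 1)/x

# [x^n] ((F(x) - 1)/x) = (n + 1)!
assert all(shifted[n] == factorial(n + 1) for n in range(len(shifted)))
```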

Now a little further formal reasoning about $$F(x)$$ and $$H(x)$$ will allow us to deduce the following identity for $$n!$$:⁴

$\label{15.5.2} n! = 1 + \sum_{i = 1}^{n}(i - 1) \cdot (i - 1)!$

To prove this identity, note that from (\ref{15.5.1}), we have

$\nonumber [x^n]H(x) ::= n \cdot n! = (n+1)! - n! = [x^n] \left( \dfrac{F(x) - 1}{x}\right) - [x^n]F(x).$

In other words,

$\label{15.5.3} H(x) = \dfrac{F(x) - 1}{x} - F(x).$

Solving (\ref{15.5.3}) for $$F(x)$$, we get

$\label{15.5.4} F(x) = \dfrac{xH(x) + 1}{1-x}.$

But $$[x^n](xH(x) + 1)$$ is $$(n-1) \cdot (n-1)!$$ for $$n \geq 1$$ and is 1 for $$n = 0$$, so by the convolution formula,

$\nonumber [x^n] \left( \dfrac{xH(x) + 1}{1-x}\right) = 1 + \sum_{i=1}^{n} (i-1)\cdot (i-1)!.$

The identity (\ref{15.5.2}) now follows immediately from (\ref{15.5.4}).
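
Since the derivation is purely formal, it is reassuring to confirm the identity (\ref{15.5.2}) numerically as well; a minimal Python check (the helper name `rhs` is ours):

```python
from math import factorial

def rhs(n):
    # Right-hand side of the identity: 1 + sum_{i=1}^{n} (i-1)*(i-1)!
    return 1 + sum((i - 1) * factorial(i - 1) for i in range(1, n + 1))

# n! = 1 + sum_{i=1}^{n} (i-1)*(i-1)!  holds for every n >= 0
assert all(rhs(n) == factorial(n) for n in range(12))
```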

## The Ring of Power Series

So why don’t we have to worry about series whose radius of convergence is zero, and how do we justify the kind of manipulation in the previous section to derive the formula (\ref{15.5.4})? The answer comes from thinking abstractly about infinite sequences of numbers and operations that can be performed on them.

For example, one basic operation combining two infinite sequences is adding them coordinatewise. That is, if we let

\begin{aligned} G &::= (g_0, g_1, g_2, \ldots), \\ H &::= (h_0, h_1, h_2, \ldots), \end{aligned}

then we can define the sequence sum, $$\oplus$$, by the rule:

$\nonumber G \oplus H ::= (g_0 + h_0, g_1 + h_1, \ldots, g_n + h_n, \ldots).$

Another basic operation is sequence multiplication, $$\otimes$$, defined by the convolution rule (not coordinatewise):

$\nonumber G \otimes H ::= \left( g_0h_0, g_0h_1 +g_1h_0 , \ldots, \sum_{i=0}^{n} g_i h_{n-i}, \ldots \right).$
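
Both operations are straightforward to model on finite prefixes of sequences. A Python sketch (the function names are ours, and sequences are truncated to lists):

```python
def seq_add(G, H):
    # Coordinatewise sum: (g0 + h0, g1 + h1, ...)
    return [g + h for g, h in zip(G, H)]

def seq_mul(G, H):
    # Convolution: the n-th entry is sum_{i=0}^{n} g_i * h_{n-i}
    length = min(len(G), len(H))
    return [sum(G[i] * H[n - i] for i in range(n + 1)) for n in range(length)]

# Multiplying (1, 1, 1, ...) by itself gives (1, 2, 3, ...):
assert seq_mul([1] * 6, [1] * 6) == [1, 2, 3, 4, 5, 6]
```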

These operations on infinite sequences have lots of nice properties. For example, it’s easy to check that sequence addition and multiplication are commutative:

\begin{aligned} G \oplus H &= H \oplus G, \\ G \otimes H &= H \otimes G.\end{aligned}

If we let

\begin{aligned} Z &::= (0, 0, 0, \ldots), \\ I&::= (1, 0, 0, 0, \ldots, 0, \ldots), \end{aligned}

then it’s equally easy to check that $$Z$$ acts like a zero for sequences and $$I$$ acts like the number one:

\begin{align} \nonumber Z \oplus G &= G, \\ \label{15.5.5} Z \otimes G &= Z, \\ \nonumber I \otimes G &= G.\end{align}
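
These identities can also be verified on truncated sequences; a quick Python sketch using the convolution rule (the function name `seq_mul` and the sample sequence are ours):

```python
def seq_mul(G, H):
    # Convolution: the n-th entry is sum_{i=0}^{n} g_i * h_{n-i}
    length = min(len(G), len(H))
    return [sum(G[i] * H[n - i] for i in range(n + 1)) for n in range(length)]

G = [5, -3, 7, 2, 0, 9]    # an arbitrary sequence prefix
Z = [0, 0, 0, 0, 0, 0]     # the zero sequence
I = [1, 0, 0, 0, 0, 0]     # the identity sequence

assert [z + g for z, g in zip(Z, G)] == G   # Z acts as an additive zero
assert seq_mul(Z, G) == Z                   # Z annihilates under convolution
assert seq_mul(I, G) == G                   # I acts as a multiplicative one
```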

Now if we define

$\nonumber -G ::= (-g_0, -g_1, -g_2, \ldots)$

then

$\nonumber G \oplus (-G) = Z.$

In fact, the operations $$\oplus$$ and $$\otimes$$ satisfy all the commutative ring axioms described in Section 8.7.1. The set of infinite sequences of numbers together with these operations is called the ring of formal power series over these numbers.⁵

A sequence $$H$$ is the reciprocal of a sequence $$G$$ when

$\nonumber G \otimes H = I.$

A reciprocal of $$G$$ is also called a multiplicative inverse or simply an “inverse” of $$G$$. The ring axioms imply that if there is a reciprocal, it is unique (see Problem 8.32), so the suggestive notation $$1/G$$ can be used unambiguously to denote this reciprocal, if it exists. For example, letting

\begin{aligned} J &::= (1, -1, 0, 0, \ldots, 0, \ldots) \\ K &::= (1, 1, 1, 1, \ldots, 1, \ldots), \end{aligned}

the definition of $$\otimes$$ implies that $$J \otimes K = I$$, and so $$K = 1/J$$ and $$J = 1/K$$.
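
The computation behind $$J \otimes K = I$$ telescopes: for $$n \geq 1$$, the $$n$$-th convolution entry is $$1 \cdot 1 + (-1) \cdot 1 = 0$$. A short Python check on truncated sequences (the helper name `seq_mul` is ours):

```python
def seq_mul(G, H):
    # Convolution: the n-th entry is sum_{i=0}^{n} g_i * h_{n-i}
    length = min(len(G), len(H))
    return [sum(G[i] * H[n - i] for i in range(n + 1)) for n in range(length)]

J = [1, -1, 0, 0, 0, 0]   # prefix of J, i.e. 1 - x
K = [1, 1, 1, 1, 1, 1]    # prefix of K, i.e. (1, 1, 1, ...)
I = [1, 0, 0, 0, 0, 0]    # prefix of the identity sequence

assert seq_mul(J, K) == I
```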

In the ring of formal power series, equation (\ref{15.5.5}) implies that the zero sequence $$Z$$ has no inverse, so $$1/Z$$ is undefined—just as the expression 1/0 is undefined over the real numbers or the ring $$\mathbb{Z}_n$$ of Section 8.7.1. It’s not hard to verify that a series has an inverse iff its initial element is nonzero (see Problem 15.26).
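
Indeed, when $$g_0 \neq 0$$ the inverse can be computed term by term: the equation $$G \otimes H = I$$ forces $$h_0 = 1/g_0$$ and, for $$n \geq 1$$, $$h_n = -\left( \sum_{i=1}^{n} g_i h_{n-i} \right) / g_0$$. A Python sketch of this recurrence (the function name `reciprocal` is ours):

```python
from fractions import Fraction

def reciprocal(G, terms):
    # Solve G (x) H = I for H, coefficient by coefficient.
    # Requires the initial element G[0] to be nonzero.
    g0 = Fraction(G[0])
    if g0 == 0:
        raise ZeroDivisionError("a series with zero initial element has no inverse")
    H = [1 / g0]
    for n in range(1, terms):
        # g0*h_n + sum_{i=1}^{n} g_i*h_{n-i} = 0, so solve for h_n.
        conv = sum(Fraction(G[i]) * H[n - i]
                   for i in range(1, min(n, len(G) - 1) + 1))
        H.append(-conv / g0)
    return H

# Inverting (1, -1, 0, 0, ...) recovers 1/(1 - x) = 1 + x + x^2 + ...
assert reciprocal([1, -1], 6) == [1, 1, 1, 1, 1, 1]
```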

Now we can explain the proper way to understand a generating function definition

$\nonumber G(x) ::= \sum_{n = 0}^{\infty} g_n x^n.$

It simply means that $$G(x)$$ really refers to its infinite sequence of coefficients $$(g_0, g_1, \ldots)$$ in the ring of formal power series. The simple expression, $$x$$, can be understood as referring to the sequence

$\nonumber X ::= (0, 1, 0, 0, \ldots, 0, \ldots).$

Likewise, $$1 - x$$ abbreviates the sequence $$J$$ above, and the familiar equation

$\label{15.5.6} \dfrac{1}{1-x} = 1 + x + x^2 + x^3 + \cdots$

can be understood as a way of restating the assertion that $$K$$ is $$1/J$$. In other words, the powers of the variable $$x$$ just serve as placeholders—and as reminders of the definition of convolution. The equation (\ref{15.5.6}) has nothing to do with the values of $$x$$ or the convergence of the series. Rather, it is stating a property that holds in the ring of formal power series. The reasoning about the divergent series in the previous section is completely justified as properties of formal power series.

³This section is based on an example from “Use of everywhere divergent generating function,” mathoverflow, response 8,147 by Aaron Meyerowitz, Nov. 12, 2010.

⁴A combinatorial proof of (\ref{15.5.2}) is given in Problem 14.64.

⁵The elements in the sequences may be the real numbers, complex numbers, or, more generally, may be the elements from any given commutative ring.

This page titled 15.5: Formal Power Series is shared under a CC BY-NC-SA license and was authored, remixed, and/or curated by Eric Lehman, F. Thomson Leighton, & Albert R. Meyer (MIT OpenCourseWare).