# What is Numerical Stability?

Numerical stability concerns how errors introduced during the execution of an algorithm affect the result. It is a property of an algorithm rather than the problem being solved. I will assume that the errors under consideration are rounding errors, but in principle the errors can be from any source.

Consider a scalar function $y = f(x)$ of a scalar $x$. We regard $x$ as the input data and $y$ as the output. The forward error of a computed approximation $\widehat{y}$ to $y$ is the relative error $|y - \widehat{y}|/|y|$. The backward error of $\widehat{y}$ is

$\min\left\{\, \displaystyle\frac{|\Delta x|}{|x|}: \widehat{y} = f(x+\Delta x) \,\right\}.$

If $\widehat{y}$ has a small backward error then it is the exact answer for a slightly perturbed input. Here, “small” is interpreted relative to the floating-point arithmetic, so a small backward error means one of the form $cu$ for a modest constant $c$, where $u$ is the unit roundoff.

An algorithm that always produces a small backward error is called backward stable. In a backward stable algorithm the errors introduced during the algorithm have the same effect as a small perturbation in the data. If the backward error is the same size as any uncertainty in the data then the algorithm produces as good a result as we can expect.
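As a concrete illustration (a minimal Python sketch, not part of the original text): for $f(x) = \sqrt{x}$, a computed $\widehat{y}$ satisfies $\widehat{y} = \sqrt{x+\Delta x}$ exactly for $\Delta x = \widehat{y}^2 - x$, so the backward error can be evaluated directly.

```python
import math

x = 2.0
y_hat = math.sqrt(x)            # computed approximation to sqrt(2)

# Backward error: the smallest |dx|/|x| with sqrt(x + dx) = y_hat.
# For this f we can solve for the perturbation exactly: dx = y_hat**2 - x.
dx = y_hat * y_hat - x
backward_error = abs(dx) / abs(x)

print(backward_error)           # of order the unit roundoff u ~ 1.1e-16
```

The backward error is a small multiple of $u$, consistent with `math.sqrt` being backward (indeed, correctly rounded and hence forward) stable.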

If $x$ undergoes a relative perturbation of size $u$ then $y = f(x)$ can change by as much as $\mathrm{cond}_f(x) u$, where

$\mathrm{cond}_f(x) = \lim_{\epsilon\to0} \displaystyle\sup_{|\Delta x| \le \epsilon |x|} \displaystyle\frac{|f(x+\Delta x) - f(x)|}{\epsilon|f(x)|}$

is the condition number of $f$ at $x$. An algorithm that always produces $\widehat{y}$ with a forward error of order $\mathrm{cond}_f(x) u$ is called forward stable.
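For differentiable $f$ the definition reduces to $\mathrm{cond}_f(x) = |x f'(x)/f(x)|$. A short Python sketch (my own illustrative example, not from the original) comparing a well conditioned and an ill conditioned evaluation point for $f(x) = \cos x$, for which $\mathrm{cond}_f(x) = |x \tan x|$:

```python
import math

def cond_cos(x):
    """Condition number of f(x) = cos(x): |x * f'(x) / f(x)| = |x * tan(x)|."""
    return abs(x * math.sin(x) / math.cos(x))

print(cond_cos(0.1))      # well conditioned: about 0.01
print(cond_cos(1.5707))   # near pi/2, where cos(x) ~ 0: about 1.6e4
```

Near $\pi/2$ a relative perturbation of size $u$ in $x$ can change $\cos x$ by roughly $10^4 u$, regardless of how the function is evaluated.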

The definition of $\mathrm{cond}_f(x)$ implies that a backward stable algorithm is automatically forward stable. The converse is not true. An example of an algorithm that is forward stable but not backward stable is Gauss–Jordan elimination for solving a linear system.

If $\widehat{y}$ satisfies

$\widehat{y} + \Delta y = f(x+\Delta x), \quad |\Delta y| \le \epsilon |\widehat{y}|, \quad |\Delta x| \le \epsilon |x|,$

with $\epsilon$ small in the sense described above, then the algorithm for computing $y$ is mixed backward–forward stable. Such an algorithm produces almost the right answer for a slightly perturbed input. The following diagram illustrates the previous equation.

With these definitions in hand, we can turn to the meaning of the term numerically stable. Depending on the context, numerical stability can mean that an algorithm is (a) backward stable, (b) forward stable, or (c) mixed backward–forward stable.

For some problems, backward stability is difficult or impossible to achieve, so numerical stability has meaning (b) or (c). For example, let $Z = xy^T$, where $x$ and $y$ are $n$-vectors. Backward stability would require the computed $\widehat{Z}$ to satisfy $\widehat{Z} = (x+\Delta x)(y+\Delta y)^T$ for some small $\Delta x$ and $\Delta y$, meaning that $\widehat{Z}$ is a rank-$1$ matrix. But the computed $\widehat{Z}$ contains $n^2$ independent rounding errors and is very unlikely to have rank $1$.
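This failure of backward stability is easy to observe. In the NumPy sketch below (an illustration of the point above, with sizes and data of my own choosing), the second singular value of the computed outer product is of order $u$ but nonzero, so $\widehat{Z}$ is not exactly rank $1$.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50
x = rng.standard_normal(n)
y = rng.standard_normal(n)

Z = np.outer(x, y)          # each entry x[i]*y[j] has its own rounding error

# An exactly rank-1 matrix would have s[1] == 0; rounding makes it
# tiny (of order u relative to s[0]) but generically nonzero.
s = np.linalg.svd(Z, compute_uv=False)
print(s[1] / s[0])
```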

We briefly mention some other relevant points.

• What is the data? For a linear system $Ax = b$ the data is $A$ or $b$, or both. For a nonlinear function we need to consider whether problem parameters are data; for example, for $f(x) = \cos(3x^2+2)$ are the $3$ and the $2$ constants or are they data, like $x$, that can be perturbed?
• We have measured perturbations in a relative sense. Absolute errors can also be used.
• For problems whose inputs are matrices or vectors we need to use norms or componentwise measures.
• Some algorithms are only conditionally numerically stable, that is, they are numerically stable for some subset of problems. For example, the normal equations method for solving a linear least squares problem is forward stable for problems with a large residual.
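The conditional stability of the normal equations can be seen on a small-residual, ill conditioned least squares problem, where an SVD-based solver does much better. A hedged NumPy sketch (the matrix construction, sizes, and tolerances are my own choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 10

# Build A = U * diag(s) * V^T with condition number about 1e8.
U, _ = np.linalg.qr(rng.standard_normal((m, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -8, n)
A = U * s @ V.T

x_true = np.ones(n)
b = A @ x_true                              # (near) zero-residual problem

x_svd = np.linalg.lstsq(A, b, rcond=None)[0]      # SVD-based solver
x_ne = np.linalg.solve(A.T @ A, A.T @ b)          # normal equations

err_svd = np.linalg.norm(x_svd - x_true) / np.linalg.norm(x_true)
err_ne = np.linalg.norm(x_ne - x_true) / np.linalg.norm(x_true)
print(err_svd, err_ne)
```

Since $\mathrm{cond}(A^TA) \approx \mathrm{cond}(A)^2 \approx 10^{16}$, the normal equations lose essentially all accuracy here, while the SVD-based solver has error of order $\mathrm{cond}(A)\,u$.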
