Math 371 - Alexiades
Conditioning and stability
We can think of solving a problem (a calculation) with input (data) x
as evaluating some "function" f (the problem) at x, i.e. finding the value f(x).
Crucial question in solving any problem:
How do errors propagate into the solution of a problem?
(assuming exact arithmetic)
Question: If X ≈ x to k significant digits, will
f(X) ≈ f(x) to about k digits ?
or, in terms of relative errors:
Question: If |ρX| = | (x−X) / x | ≤ ε (small),
will |ρf(X)| = | (f(x)−f(X)) / f(x) | also be small ?
Note: We are concerned with conditioning of the
problem itself, not of any algorithm used in solving it.
Issue is error propagation.
Better posed:
Question: What is the magnification factor κ
in the relative error:
|ρf(X)| ≈ κ |ρX| ?
Definition:
The condition number κ of a problem is the
magnification factor from the relative error ρX
of the input to the relative error ρf(X) of the output:
κ = | ρf(X) / ρX |
Thus, each 1% change (error) in x results in κ% change (error)
in f(X).
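A quick numerical sketch of this (in Python, with the illustrative choice f(x) = x², for which κ = 2): a 1% error in the input produces about a 2% error in the output.

```python
# Illustrative sketch: for f(x) = x^2, a 1% relative error in the input x
# shows up as roughly a 2% relative error in the output, i.e. kappa = 2.
def f(x):
    return x * x

x = 100.0                      # exact input
X = 101.0                      # input carrying a 1% relative error
rho_x = (x - X) / x            # relative error of the input  (-0.01)
rho_f = (f(x) - f(X)) / f(x)   # relative error of the output (-0.0201)

kappa = abs(rho_f / rho_x)     # magnification factor, about 2.01
print(kappa)
```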
Definition:
A problem f is called well-conditioned
(insensitive to data errors)
if small (relative) changes in data
result in small relative changes in solution,
i.e. if κ ∼ 1.
A problem f is called ill-conditioned
(sensitive to data errors)
if small (relative) changes in data
can result in large relative changes in solution,
i.e. if κ >> 1.
Theorem:
If f(x) is a differentiable function, then (to first order)
κ = | x f '(x)/f(x) | .
Indeed, f(x) − f(X) ≈ f '(x) (x−X), so
ρf(X) = (f(x)−f(X)) / f(x) ≈ [ x f '(x)/f(x) ] (x−X)/x = [ x f '(x)/f(x) ] ρX .
Thus κ is proportional to the size of x and the size of f '(x),
and inversely proportional to the size of f(x).
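The formula can be tried out numerically; a minimal Python sketch, assuming the illustrative choice f(x) = tan x (so f '(x) = 1/cos²x), shows the same problem can be well-conditioned at one input and ill-conditioned at another:

```python
import math

# kappa(x) = | x f'(x) / f(x) | for the illustrative choice f(x) = tan(x),
# whose derivative is f'(x) = 1 / cos(x)^2.
def kappa_tan(x):
    return abs(x * (1.0 / math.cos(x) ** 2) / math.tan(x))

# Well away from pi/2 the problem is well-conditioned:
print(kappa_tan(0.5))      # about 1.19
# Near pi/2, tan blows up and the problem is badly ill-conditioned:
print(kappa_tan(1.5707))   # on the order of 10^4
```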
Clearly, ill-conditioned problems are bad...
A good algorithm should perform well on a well-conditioned problem.
But even a good algorithm may go bad on ill-conditioned problems...
The analogue of problem conditioning for an algorithm is usually
called stability.
An algorithm is called stable if it is insensitive
to errors made during the computation (errors are not amplified).
Otherwise it is called unstable (and essentially useless...).
Stability is a core subject in Numerical Analysis.
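A classic illustration of instability (an example assumed here, not taken from these notes): the integrals I_n = ∫₀¹ xⁿ e^(x−1) dx satisfy the recurrence I_n = 1 − n·I_(n−1) with I_0 = 1 − 1/e, and 0 < I_n < 1/(n+1). Running the recurrence forward multiplies the initial roundoff error by n!, so it is unstable; running the same recurrence backward divides errors by n!, so it is stable.

```python
import math

def forward(n):
    # Unstable: the roundoff in I_0 gets multiplied by k at every step,
    # so by step n it has grown by a factor of n!.
    I = 1.0 - math.exp(-1.0)       # I_0, correct only to machine precision
    for k in range(1, n + 1):
        I = 1.0 - k * I
    return I

def backward(n, start=60):
    # Stable: start from a crude guess I_start = 0 (error below 1/(start+1));
    # the error is divided by k at every step, so it dies out by n!-like factors.
    I = 0.0
    for k in range(start, n, -1):
        I = (1.0 - I) / k
    return I

print(forward(25))    # wildly wrong: huge in magnitude
print(backward(25))   # accurate: a small positive number near 1/27
```

Same recurrence, same arithmetic; only the direction of the computation changes, and with it the fate of the errors.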