Errors

1. Absolute Error and Relative Error

  • Absolute error = approximation - true value
  • Relative error = absolute error / true value
  • Precision refers to the number of digits with which a number is expressed.
  • Accuracy refers to the number of correct significant digits, i.e., how closely the number agrees with the true value.
  • If $\pi$ is approximated by $3.252603764690804$, the result is highly precise (many digits) but not accurate (it is already wrong in the first decimal place); a short numerical check follows this list.
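A minimal sketch in Python of the two error definitions above, reusing the $\pi$ example (the variable names are illustrative):

```python
import math

# Absolute and relative error of an approximation to pi.
true_value = math.pi
approximation = 3.252603764690804  # many digits (precise), but wrong (inaccurate)

absolute_error = approximation - true_value
relative_error = absolute_error / true_value

print(f"absolute error: {absolute_error:.4f}")   # ~ 0.1110
print(f"relative error: {relative_error:.4f}")   # ~ 0.0353
```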

2. Data Error and Computational Error

  • For a function $f: \R \to \R$, let the input $x$ be the true value and $f(x)$ the true result. However, only the approximate value $\hat{x}$ is known, not the true value $x$, and only $\hat{f}$, an approximation of $f$, can actually be computed. Then the total error $e$ is
$$\begin{aligned} e &= \hat{f}(\hat{x}) - f(x) = (\hat{f}(\hat{x}) - f(\hat{x})) + (f(\hat{x}) - f(x)) \\ &= \text{computational error} + \text{propagated data error} \end{aligned}$$
  • The computational error is the difference between the approximate and true functions evaluated at the same value; the propagated data error is the difference between the true function evaluated at the approximate value and at the true value.
  • The computational error can be further divided into truncation error and rounding error.
  • Suppose $\sin(\pi/8)$ is approximated by $0.3750$ via $\sin(\pi/8) \approx \sin(3/8) \approx 3/8 = 0.3750$. In the first step the same function is evaluated at a different value, so it represents the propagated data error; in the second step the function is changed from $f(x) = \sin(x)$ to $f(x) = x$ at the same value, so it represents the computational error. The sketch after this list separates the two pieces numerically.
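A minimal sketch in Python splitting the total error of $\sin(\pi/8) \approx 3/8$ into the two pieces above (the names are illustrative):

```python
import math

# Total error = computational error + propagated data error for sin(pi/8) ~ 3/8.
x = math.pi / 8      # true input
x_hat = 3 / 8        # approximate input

def f(t):            # true function
    return math.sin(t)

def f_hat(t):        # approximate function: sin(t) ~ t for small t
    return t

propagated_data_error = f(x_hat) - f(x)        # same function, different values
computational_error = f_hat(x_hat) - f(x_hat)  # same value, different functions
total_error = f_hat(x_hat) - f(x)

print(propagated_data_error)   # ~ -0.0164
print(computational_error)     # ~  0.0087
print(total_error)             # sum of the two pieces above
```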

3. Forward Error and Backward Error

  • Assume that $y = f(x)$ is the true output of the solution $f$ for an input $x$, and that $\hat{y}$ is the computed approximation. Then $\Delta y = \hat{y} - y$ is the forward error, and $\Delta x = \hat{x} - x$ is the backward error, where $\hat{x}$ is the input for which the exact solution equals the computed result, i.e., $f(\hat{x}) = \hat{y}$.

  • If $\vert \Delta x \vert$ is very small, the computed solution $\hat{y}$ is the exact solution of the nearby problem $\hat{x}$; the original problem $x$ is then said to be well estimated by the nearby problem $\hat{x}$, and the approximate solution $\hat{y}$ is considered good enough.
  • As an approximation to $y = f(x) = \cos(x)$ for $x = 1$, let $\hat{y} = \hat{f}(x) = 1 - x^2/2!$, from the series $\cos(x) = 1 - x^2/2! + x^4/4! - x^6/6! + \cdots$. Then the forward and backward errors are as follows (checked numerically in the sketch after the equations):
$$\begin{aligned} \Delta y &= \hat{y} - y = 1 - \frac{x^2}{2!} - \cos(x) = 1 - \frac{1}{2} - \cos(1) \implies \text{forward error} \\ \Delta x &= \hat{x} - x = f^{-1}(\hat{y}) - x = \arccos \left( 1 - \frac{1}{2} \right) - 1 \implies \text{backward error} \end{aligned}$$
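A minimal sketch in Python evaluating both errors for this example (variable names are illustrative):

```python
import math

# Forward and backward errors of y_hat = 1 - x**2/2 as an approximation to cos(x) at x = 1.
x = 1.0
y = math.cos(x)            # true result
y_hat = 1 - x**2 / 2       # computed result from the truncated series

forward_error = y_hat - y          # Δy ~ -0.0403
x_hat = math.acos(y_hat)           # nearby input whose exact cosine is y_hat
backward_error = x_hat - x         # Δx ~  0.0472

print(forward_error, backward_error)
```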

4. Condition Number

  • When a relative change in the input $x$ produces a comparably sized relative change in $f(x)$, the problem is said to be insensitive, or well-conditioned.
  • When a small relative change in $x$ produces a much larger relative change in $f(x)$, the problem is said to be sensitive, or ill-conditioned.
  • The condition number of a problem denotes the ratio of the relative change in the solution to the relative change in the input.
$$\begin{aligned} \text{condition number} &= \frac{\vert (f(\hat{x}) - f(x)) / f(x) \vert}{\vert (\hat{x} - x) / x \vert} = \frac{\vert (\hat{y} - y) / y \vert}{\vert (\hat{x} - x) / x \vert} = \frac{\vert \Delta y / y \vert}{\vert \Delta x / x \vert} \\ &= \frac{\text{relative forward error}}{\text{relative backward error}} = \text{amplification factor} \end{aligned}$$
  • The condition number, which is usually not known exactly, varies with the input.
  • If $\hat{x}$ is close enough to $x$, then $\Delta x \to 0$ with $\hat{x} = x + \Delta x$, and the condition number can be approximated using the derivative (see the sketch after the equations):
$$\begin{aligned} \text{absolute forward error} &= f(x + \Delta x) - f(x) \approx f'(x) \Delta x \\ \text{relative forward error} &= \frac{f(x + \Delta x) - f(x)}{f(x)} \approx \frac{f'(x) \Delta x}{f(x)} \\ \text{condition number} &\approx \left\vert \frac{f'(x) \Delta x / f(x)}{\Delta x / x} \right\vert = \left\vert \frac{x f'(x)}{f(x)} \right\vert \end{aligned}$$
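A minimal sketch in Python of the derivative-based estimate $\vert x f'(x) / f(x) \vert$, illustrated here with $f(x) = \tan(x)$ as an example of a problem that becomes ill-conditioned near $\pi/2$ (the function choice and names are illustrative):

```python
import math

# cond(x) ~ |x * f'(x) / f(x)|
def condition_number(f, fprime, x):
    return abs(x * fprime(x) / f(x))

def tan_prime(t):
    return 1 / math.cos(t) ** 2   # derivative of tan(t)

for x in (0.1, 1.0, 1.5, 1.57):
    print(x, condition_number(math.tan, tan_prime, x))
# The estimate grows without bound as x approaches pi/2 ~ 1.5708,
# i.e. evaluating tan(x) there is ill-conditioned.
```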


