BELIEVE ME NOT! - A SKEPTIC'S GUIDE


Vector Tolerance

Allow me to slip into something a little more formal . . . .

Usually this topic would be called "Error Propagation in Functions of Several Variables" or something like that; I have used the term "vector tolerance" because (a) the word "error" has pejorative connotations for most people, whereas "tolerance" is usually considered a good thing; and (b) when our final result is calculated in terms of several other quantities, each of which is uncertain by some amount, and when those uncertainties are independent of each other, we get a situation much like trying to define the overall length of a vector with several independent perpendicular components. Each contribution to the overall uncertainty can be positive or negative, and on average you would not expect them all to add up; that would be like assuming that if one were positive they all must be. So we square each contribution, add the squares, and take the square root of the sum, just as we would do to find the length of a vector from its components.

The way to do this is easily prescribed if we use a little calculus notation: suppose the "answer" A is a function of several variables, say x and y. We write A(x,y). So what happens to A when x changes by some amount $\delta x$? Simple: we just write $\delta A_x \approx (\partial A / \partial x ) \; \delta x$, where the x subscript on $\delta A_x$ reminds us that this is just the contribution to the change in A from that little change in x, not from any changes in y; the $\approx$ sign acknowledges that this doesn't become exact until $\delta x \to dx$, which is infinitesimally small; and the $\partial$ symbols are like ordinary derivatives except that they remind us to treat y as a constant when we take this derivative.
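The single-variable contribution $\delta A_x$ can be checked numerically. Here is a minimal sketch, assuming an illustrative function $A(x,y) = x^2 y$ (not from the text), where the partial derivative is estimated by a central finite difference with y held fixed:

```python
# Hypothetical example: the contribution to delta-A from a small change in x alone.
# Take A(x, y) = x**2 * y and compute delta_A_x ~ (dA/dx) * delta_x.

def A(x, y):
    return x**2 * y

x, y = 3.0, 2.0
delta_x = 0.01     # small uncertainty in x

# Central finite difference approximates the partial derivative dA/dx at (x, y),
# holding y fixed -- exactly the "treat y as a constant" rule in the text.
h = 1e-6
dA_dx = (A(x + h, y) - A(x - h, y)) / (2 * h)   # analytically 2*x*y = 12

delta_A_x = dA_dx * delta_x
print(delta_A_x)   # approximately 0.12
```

Because A is quadratic in x, the central difference reproduces the analytic derivative $2xy = 12$ essentially exactly.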

The same trick works for changes in y, of course, so then we have two "orthogonal" shifts of the result to combine into one uncertainty in A. I have already given the prescription for this above. The formula reads

\begin{displaymath}(\delta A)^2 \approx
\left( { \partial A \over \partial x } \; \delta x \right)^2
+ \left( { \partial A \over \partial y } \; \delta y \right)^2
\end{displaymath} (5.1)
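A quick numerical sketch of this two-variable quadrature sum, assuming an illustrative product $A(x,y) = x\,y$ (my choice, not the text's):

```python
# Sketch of Eq. (5.1) for A(x, y) = x * y with independent uncertainties dx, dy.
import math

x, y = 4.0, 5.0
dx, dy = 0.1, 0.2

# Partial derivatives of A = x*y: dA/dx = y and dA/dy = x.
contrib_x = y * dx      # shift in A due to x alone: 0.5
contrib_y = x * dy      # shift in A due to y alone: 0.8

# Combine the two "orthogonal" shifts like perpendicular vector components.
delta_A = math.sqrt(contrib_x**2 + contrib_y**2)
print(delta_A)          # sqrt(0.25 + 0.64) ~ 0.943
```

Note that the combined uncertainty (about 0.94) is less than the simple sum of the two contributions (1.3), which is the whole point of adding in quadrature.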

This can be extended to a function of N variables $\{x_1, x_2, \cdots x_i \cdots x_N \}$:

 \begin{displaymath}(\delta A)^2 \approx \sum_{i=1}^N
\left( { \partial A \over \partial x_i } \; \delta x_i \right)^2
\end{displaymath} (5.2)

where the $\sum$ symbol means "sum over all terms of this form, with the index i running from 1 to N."
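The N-variable sum in Eq. (5.2) can be sketched as a small helper that loops over the variables; the function name `propagate` and the finite-difference step are my own illustrative choices, not anything defined in the text:

```python
# General form of Eq. (5.2): quadrature sum over N variables, with each partial
# derivative estimated by a central finite difference.
import math

def propagate(f, values, uncertainties, h=1e-6):
    """Return the combined uncertainty in f(*values) per Eq. (5.2)."""
    total = 0.0
    for i, (v, dv) in enumerate(zip(values, uncertainties)):
        up = list(values); up[i] = v + h
        dn = list(values); dn[i] = v - h
        df_dxi = (f(*up) - f(*dn)) / (2 * h)   # partial derivative w.r.t. x_i
        total += (df_dxi * dv) ** 2            # one squared term of the sum
    return math.sqrt(total)

# Example: A = x + y - z, so every partial derivative is +1 or -1 and
# (delta A)^2 = dx^2 + dy^2 + dz^2.
dA = propagate(lambda x, y, z: x + y - z, [1.0, 2.0, 3.0], [0.3, 0.4, 1.2])
print(dA)   # sqrt(0.09 + 0.16 + 1.44) = sqrt(1.69) = 1.3
```

For the linear example the finite difference is exact, so the result matches the hand calculation.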

The treatment above is a little too "advanced" mathematically for some people (or for anyone on a bad day), so here are a few special cases that the enthusiast may wish to derive from the general form in Eq. (5.2):

For sums and differences, $A = x \pm y$: the absolute uncertainties add in quadrature, $(\delta A)^2 = (\delta x)^2 + (\delta y)^2$.

For products and quotients, $A = x y$ or $A = x/y$: the relative uncertainties add in quadrature, $(\delta A / A)^2 = (\delta x / x)^2 + (\delta y / y)^2$.

For powers, $A = x^n$: the relative uncertainty is scaled by the power, $\delta A / A = \vert n \vert \; \delta x / x$.

These should get you through almost anything, if applied wisely.
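As a sanity check, the product special case (relative uncertainties adding in quadrature) can be verified against the general formula of Eq. (5.2); the numbers here are illustrative:

```python
# Check the product rule against Eq. (5.2): for A = x*y,
# (dA/A)^2 = (dx/x)^2 + (dy/y)^2.
import math

x, y = 3.0, 7.0
dx, dy = 0.06, 0.21     # 2% and 3% relative uncertainties

A = x * y
# General formula, using dA/dx = y and dA/dy = x:
dA_general = math.sqrt((y * dx)**2 + (x * dy)**2)
# Special-case shortcut in terms of relative uncertainties:
dA_shortcut = A * math.sqrt((dx / x)**2 + (dy / y)**2)

print(dA_general, dA_shortcut)   # both ~ 0.757
```

The two routes agree, as they must, since the shortcut is just Eq. (5.2) with both sides divided by $A^2$.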


Jess H. Brewer - Last modified: Fri Nov 13 16:28:09 PST 2015