While this is a common concern (calling it a misconception is too strong - it
is technically correct in a sense), it isn't an issue in practice, or at
least not a significant one in the grand context of the uncertainty of
predictions. Roundoff errors and their propagation can always be reduced by
improving the numerical algorithms, and good scientific work will take steps
to ensure that these errors are not significant to the results.
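As a minimal sketch (mine, not part of the original argument) of how a better
algorithm reduces roundoff, compare naive summation of many small floating-point
values with Kahan compensated summation:

```python
def naive_sum(values):
    total = 0.0
    for v in values:
        total += v  # low-order bits of each small addend can be lost
    return total

def kahan_sum(values):
    """Compensated summation: carry the low-order bits lost in each add."""
    total = 0.0
    c = 0.0  # running compensation for lost low-order bits
    for v in values:
        y = v - c            # apply the correction from the previous step
        t = total + y        # big + small: low-order bits of y may be lost
        c = (t - total) - y  # recover what was lost (algebraically zero)
        total = t
    return total

values = [0.1] * 10**6  # exact sum of the stored doubles is ~100000.00000000006
print(abs(naive_sum(values) - 100000.0))  # visible accumulated roundoff
print(abs(kahan_sum(values) - 100000.0))  # error near machine precision
```

The same data, summed two ways, differ by several orders of magnitude in
accumulated error - the algorithm, not the hardware, sets the limit.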
The real problem with predictions of some dynamical systems is their extreme
sensitivity to initial conditions and parameters. If we don't know a
parameter (such as the IR absorption behavior of a given atmospheric
species) or an initial condition (such as the current distribution of
that species in the atmosphere) precisely enough, those uncertainties - not
the computing precision - will limit our ability to make predictions. Good
scientific work will also recognize these limitations.
In general, you are not going to find any real-world situation where the
uncertainty in the parameters and initial conditions is not many orders of
magnitude more than the roundoff errors in modern computers.
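To illustrate (my example, not the original poster's), take the chaotic
logistic map x_{n+1} = r*x*(1-x) with r = 4. An initial-condition uncertainty
of 1e-10 - far tighter than any real atmospheric measurement - destroys the
prediction within a few dozen steps, while double-precision roundoff per step
is only around 1e-16:

```python
def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the chaotic logistic map x -> r*x*(1-x) from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-10)  # tiny initial-condition uncertainty
gaps = [abs(x - y) for x, y in zip(a, b)]
print(gaps[0], max(gaps))  # starts ~1e-10; grows to order one
```

The separation roughly doubles each step, so even a perfect, roundoff-free
computer would lose predictability on the same timescale.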
---------------------------------------------------------------------------
| Dr. Allan H. Harvey | aharvey@boulder.nist.gov |
| Physical and Chemical Properties Division | Phone: (303)497-3555 |
| National Institute of Standards & Technology | Fax: (303)497-5224 |
| 325 Broadway, Boulder, CO 80303 | |
|-------------------------------------------------------------------------|
| "Don't blame the government for what I say, or vice versa." |
---------------------------------------------------------------------------