Bill,
Yes, I agree, but my point was that one goes to the Intel chips themselves
to see what their capabilities might be.
However, in computation the architecture makes a difference as to whether
one can expect convergence in anyone's lifetime (i.e., whether the problem
is even practically computable).
It's impossible to use my TI-83 calculator to do even small probability
calculations. (The probability of exciting helium atoms from the ground
state to one level higher at 300 K is beyond the range of the calculator.)
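To make that concrete: the quantity involved is, I take it, the Boltzmann
factor exp(-dE/kT), with the first excited state of helium roughly 19.8 eV
above the ground state. A minimal Python sketch (my own illustration, with
an approximate energy) shows why a calculator cannot hold the answer:

    import math

    K_B_EV = 8.617e-5    # Boltzmann constant in eV/K
    DELTA_E = 19.8       # approx. He first excitation energy in eV (assumed)
    T = 300.0            # kelvin

    x = -DELTA_E / (K_B_EV * T)     # about -766
    print(math.exp(x))              # prints 0.0: underflows 64-bit floats
    print(x / math.log(10))         # about -333, i.e. factor ~ 1e-333

The factor comes out near 10^-333, which underflows not only the TI-83's
10^(+-99) range but even ordinary 64-bit floats, so one has to work in log
space (or with extended precision).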
I hope most topics in science are germane to the list?
Cheers,
Dave C
On Tue, Sep 15, 2009 at 10:00 AM, wjp <wjp@swcp.com> wrote:
> Dave:
>
> I think we've talked about this before.
> What he means, I think, is that the codes produce less variance (e.g.,
> less sensitivity to initial conditions) on "long" time scales than on
> shorter ones. That is, the code is more stable.
>
> This is at least slightly contrary to my experience, which is with shocked
> radiation hydrodynamics. There we would generally expect better results on
> shorter time scales (e.g., because of Courant constraints).
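> To be concrete about the constraint I mean: for an explicit scheme the
> stable step is roughly dt <= C * dx / v. A toy sketch with made-up
> numbers (nothing from any production hydro code):
>
>     def cfl_timestep(dx, signal_speed, cfl_number=0.5):
>         """Largest stable explicit time step: dt <= C * dx / v."""
>         return cfl_number * dx / signal_speed
>
>     # e.g. 1 cm cells and a 10 km/s signal speed give dt ~ 5e-7 s
>     print(cfl_timestep(dx=1.0, signal_speed=1.0e6))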
>
> I wonder if longer time scales are being time averaged in some sense.
>
> A larger mesh size often produces "better" results because of spatial
> averaging, albeit at the cost of missing much detail. But generally we
> don't think of longer time scales as temporal averaging.
>
> What do you think?
>
> I would think that the ability to efficiently change the number of
> bits used to represent reals would be chip dependent. I know the old
> Crays did this, and I think you are right about SGI. But things change so
> fast; who knows what their capabilities are now.
>
> bill
>
> On Tue, 15 Sep 2009 08:36:58 -0400, Dave Wallace <
> wmdavid.wallace@gmail.com> wrote:
> > Rich Blinne wrote:
> >>
> >> The study used CCSM3 (Community Climate System Model 3). This is an
> >> open-source model, so differential studies can do true apples-to-apples
> >> comparisons. As with all weather and climate circulation models, CCSM3
> >> solves the Navier-Stokes fluid flow equations, which, as you probably
> >> know, are notoriously sensitive to the initial conditions. When used in
> >> climate modeling, the approach of thermodynamics, namely averaging, is
> >> used in order to get useful results far into the future. The model is
> >> run multiple times with slightly different initial conditions, and then
> >> an average is computed. The spread of the runs gives a sense of how
> >> accurate the results are.
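> >> To make the procedure concrete, here is a toy sketch in Python
> >> (illustrative numbers only; this is not CCSM3 code):
> >>
> >>     import random
> >>
> >>     def run_model(initial_temp):
> >>         """Stand-in for one run: forced warming plus chaotic scatter."""
> >>         return initial_temp + 1.5 + random.gauss(0.0, 0.05)
> >>
> >>     # Perturb the initial condition slightly for each ensemble member.
> >>     runs = [run_model(288.0 + random.uniform(-0.01, 0.01))
> >>             for _ in range(20)]
> >>     mean = sum(runs) / len(runs)
> >>     spread = max(runs) - min(runs)
> >>     print("mean = %.2f K, spread = %.2f K" % (mean, spread))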
> > Another good way is to run the models with the floating-point precision
> > doubled, for example 128-bit floats rather than 64-bit floats. I realise
> > that not many processors support 128-bit floats, but some do: PowerPC/AIX
> > does, and, though I can't remember for certain, possibly Sun, Alpha, or
> > SGI as well.
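> > As a sketch of the kind of check I have in mind (using Python's decimal
> > module purely for illustration; a real climate code would need the
> > compiler and hardware to supply the wider type):
> >
> >     from decimal import Decimal, getcontext
> >
> >     def chaotic_iteration(digits, steps=80):
> >         """Iterate x -> 3.9*x*(1-x) at a chosen decimal precision."""
> >         getcontext().prec = digits
> >         x, r = Decimal("0.5"), Decimal("3.9")
> >         for _ in range(steps):
> >             x = r * x * (1 - x)
> >         return x
> >
> >     # ~double precision (16 digits) vs ~quadruple precision (34 digits)
> >     print(chaotic_iteration(16))
> >     print(chaotic_iteration(34))
> >     # If the two disagree, round-off rather than physics is driving
> >     # the answer.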
> >
> >> Climate models are counter-intuitive in that they are less accurate on
> >> shorter time scales than on longer ones. We want them accurate at
> >> *all* time scales, short, medium, and long.
> >>
> > How do we know they are more accurate on longer scales? How long is
> > longer?
> >
> > Dave W
> >
To unsubscribe, send a message to majordomo@calvin.edu with
"unsubscribe asa" (no quotes) as the body of the message.