[gradsusr] Using a calculation against all previous hours in current hour.
Christopher Gilroy
chris.gilroy at gmail.com
Tue Oct 20 19:53:13 EDT 2015
Hey Jeff (and Jim),
Thanks for the replies!
I download and process all plots one at a time because I download them in
real-time, as they become available. Not all files are available at the
same time, unless ftpprd's NOMADS server is slow to replicate the data,
heh. Everything works great with something like this (again simplified, but
still the gist of the code):
'open /gfsmodeldata/gfs.ctl'
'set dfile 1'
'set tonsofoptions'
'set tonsofoptions1'
'set tonsofoptions2'
if (t=2)
  'set t 2'
  'define t2m06 = tmp2m-273.15'
  'define t2mtotal = t2m06'
  'd t2mtotal'
endif
if (t=3)
  'set t 2'
  'define t2m06 = tmp2m-273.15'
  'set t 3'
  'define t2m12 = tmp2m-273.15'
  'define t2mtotal = t2m06+t2m12'
  'd t2mtotal'
endif
if (t=4)
  'set t 2'
  'define t2m06 = tmp2m-273.15'
  'set t 3'
  'define t2m12 = tmp2m-273.15'
  'set t 4'
  'define t2m18 = tmp2m-273.15'
  'define t2mtotal = t2m06+t2m12+t2m18'
  'd t2mtotal'
endif
'set string 1 tl 0 0'
'set strsiz 0.13'
'draw string 0.4 8.35 Sample String'
But as I said, I was just curious whether there was an easier way than a
huge IF block. I'm going to play with Jim's example, since that may very
well work.
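For what it's worth, both ideas can be sketched in GrADS script (untested
here; this assumes the script variable t holds the current forecast step,
as in the if blocks above):

* Jim's sum() idea applied to this case: total tmp2m (in Celsius)
* from step 2 through the current step t, in one line.
'define t2mtotal = sum(tmp2m-273.15, t=2, t='t')'
'd t2mtotal'

* Or an explicit loop that accumulates one step at a time,
* replacing the per-hour if blocks entirely:
'set t 2'
'define t2mtotal = tmp2m-273.15'
tt = 3
while (tt <= t)
  'set t 'tt
  'define t2mtotal = t2mtotal + tmp2m-273.15'
  tt = tt + 1
endwhile
'd t2mtotal'

Either way the script stays the same length no matter how many forecast
hours have been downloaded, since the upper bound comes from t rather than
from hand-written blocks.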
On Tue, Oct 20, 2015 at 7:28 PM, Jeff Duda <jeffduda319 at gmail.com> wrote:
> If you're trying to process the files one-by-one (and I'm assuming each
> file represents only one time step), then I don't see a simple way of
> integrating over time as you desire. Is there a specific reason you are not
> doing a batch download (of all the files first)? Also, if you really are
> going to do this sequentially, I would think you'd have to include the
> shell command prompt (!) in your script so that you could create a control
> file and open it as you go through the times. I didn't see that in your
> code example.
>
> Basically, what I'm trying to say is, coding wise, it would be much easier
> if you just downloaded all the files first, then processed them all at
> once. If you absolutely must process the files sequentially, I think you're
> going to be stuck hard coding each and every step.
>
> Jeff Duda
>
> On Tue, Oct 20, 2015 at 2:24 PM, James T. Potemra <jimp at hawaii.edu> wrote:
>
>> Christopher,
>>
>> How about this, e.g., for the first 10 time steps:
>>
>> 'set t 1 10'
>> 'define t2mtotal = sum ( tmp2m-273.15, t=1, t+0 )'
>> 'd t2mtotal'
>>
>> Jim
>>
>>
>> On 10/20/15 3:42 AM, Christopher Gilroy wrote:
>>
>> Ok, so I know I can 'easily' do this via a massive if block but I'm
>> trying to avoid that, and I have been. I have a calculation that will
>> return a unique value depending on which t it's running on.
>>
>> So when I get to t=10 I need to run a calculation on the data contained
>> in t=1, t=2, t=3, etc. t=20 would need to run the calculation on t=1,
>> t=2, t=3, etc. up to t=20. The actual value of each t would be the same
>> for the entire run, if that helps.
>>
>> So take this horrible example:
>>
>> 'define t2m = tmp2m-273.15'
>> 'd t2m+allpreviousforecasthourt2ms'
>>
>> Now, the other key is that I don't download all files and then run in one
>> big loop. I download each file one-by-one and process one-by-one, in
>> real-time, as they are released.
>>
>> A real snippet of how I'm currently achieving this would look like:
>>
>> if (t=2)
>> 'set t 2'
>> 'define t2m06 = tmp2m-273.15'
>>
>> 'define t2mtotal= t2m06'
>> 'd t2mtotal'
>> endif
>>
>> if (t=3)
>> 'set t 2'
>> 'define t2m06 = tmp2m-273.15'
>>
>> 'set t 3'
>> 'define t2m12 = tmp2m-273.15'
>>
>> 'define t2mtotal= t2m06+t2m12'
>> 'd t2mtotal'
>> endif
>>
>> if (t=4)
>> 'set t 2'
>> 'define t2m06 = tmp2m-273.15'
>>
>> 'set t 3'
>> 'define t2m12 = tmp2m-273.15'
>>
>> 'set t 4'
>> 'define t2m18 = tmp2m-273.15'
>>
>> 'define t2mtotal= t2m06+t2m12+t2m18'
>> 'd t2mtotal'
>> endif
>>
>>
>>
>> So, whether t2m06 is run during hour 06, 12, 18, 240, etc., t2m06 is
>> going to be the exact same value. As you can see, the if blocks get
>> bigger with each t, and that's what concerns me in terms of GrADS memory
>> limits, and whether there's simply a better/faster approach.
>>
>>
>> _______________________________________________
>> gradsusr mailing list
>> gradsusr at gradsusr.org
>> http://gradsusr.org/mailman/listinfo/gradsusr
>>
>>
>
>
> --
> Jeff Duda
> Graduate research assistant
> University of Oklahoma School of Meteorology
> Center for Analysis and Prediction of Storms
>
>
>
--
-Chris A. Gilroy