Could not retrieve add offset. RH7.3

Warren Tennant tennant at WEATHERSA.CO.ZA
Thu May 25 01:42:31 EDT 2006


Hi Paul

There have been several posts about "segmentation faults". We have
experienced them a lot on our RedHat systems (8.0, 9.0, Fedora Core
2, 3, 4, and even CentOS 4.x). The problem is widespread but rather
elusive. However, it seems that sdfopen with a template option is the
quickest way of "fishing" for a segmentation fault. We had a post last
year from Kilian Hagemann, who suggested a memory leak (see below my
email) ...

I must admit that we too have stayed with the pre-compiled binaries, as
compiling from source requires a little more intellect than I possess,
despite the noble attempts by Patrice Dumas to point us all in the right
direction.

My solution for now: run the scripts in smaller chunks to avoid the seg
fault, and wait for those who know more about C and Linux to solve the
issue.  Another thought I had: do the stack limits controlled by the
"ulimit" command have any bearing on this? Perhaps the small default stack
size in RedHat constrains something.
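For anyone who wants to test the ulimit theory, the stack limit can be inspected and raised in the shell before launching grads. A minimal sketch (the 8192 KB figure is just the common RedHat default of that era, not something verified here):

```shell
# Show the current soft stack limit in KB (commonly 8192 on RedHat)
ulimit -s

# Raise it for this shell and for any grads process started from it;
# a specific value such as `ulimit -s 65536` also works
ulimit -s unlimited
```

If the seg faults stop after raising the limit, that would point to stack exhaustion rather than a heap leak.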

Regards to all
Warren

Kilian Hagemann wrote:
******************

Hi there,

I wrote a little script to check for global max/min values of NCEP data from
1993-2004, and noticed that as it got further and further along my system
started thrashing. Initial memory requirement: about 16MB, with an additional
~10MB after every further year of averaging, and towards the end of the
script a whopping 780MB!!!! Smells like a memory leak/improper pointer
management to me...

The script is attached, and as long as you have the NCEP netcdf data somewhere
available (change the ncepPath variable in the script) it should run fine.
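To quantify that growth independently of GrADS itself, one could sample the process's resident set size while the script runs. A sketch, assuming the process shows up under the name "grads" and that a 10-second sampling interval is adequate:

```shell
# Print the resident set size (in KB) of the oldest running grads process
# every 10 seconds; the loop exits when pgrep no longer finds the process.
# A steady climb across the script's open/close cycles suggests a leak.
while pid=$(pgrep -o grads); do
    ps -o rss= -p "$pid"
    sleep 10
done
```

Logging these samples alongside the script's per-year output would show whether the ~10MB-per-year growth tracks the sdfopen/close cycle.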

I built 1.9b3 from source using all the extra components except the DODS
interface, but that shouldn't make a difference. Let me know if you need any
further information.

Regards,

--
Kilian Hagemann

Climate Systems Analysis Group
University of Cape Town
Republic of South Africa
Tel(w): ++27 21 650 2748

------------------------------------------------------------------------

#!/usr/local/grads/bin/grads -lbc
* Copyright 2004 by Kilian Hagemann
*
* Script to extract relevant NCEP data for SOM analysis, to find an optimally
* representative 12-month period for MM5 wind simulation.
* WARNING: The min/max detection for automatic scaling works ONLY with a
* patched version of GrADS (setting mfcmn.warnflg to 0 in grads.c in version
* 1.9b3); otherwise the min() and max() functions output a whole lot of
* garbage which chokes the script.

ncepPath = '/obs-a/ncep/ncep.new/'
startYear = 1993
endYear = 2004

var.1 = uwnd
var.2 = vwnd
var.3 = hgt
var.4 = omega
var.5 = rhum
var.6 = shum
var.7 = air

*air.sig995
*omega.sig995
*pr_wtr.eatm
*pres.sfc
*rhum.sig995
*slp.sig995
*uwnd.sig995
*vwnd.sig995

nvars = 7

* find the global maximum and minimum values over all years for each variable
nvar = 1
while (nvar <= nvars)
    global_max = -9999
    global_min = 9999
    year = startYear
    while (year <= endYear)
        'sdfopen 'ncepPath%var.nvar'.'year'.nc'
* the dimension arguments in the following have been manually determined
        'd max(max(max('var.nvar',x=7,x=15),y=22,y=28), t=1, t=1460)'
        max = subwrd(result, 4)
        if (max > global_max)
            global_max = max
        endif
        'd min(min(min('var.nvar',x=7,x=15),y=22,y=28), t=1, t=1460)'
        min = subwrd(result, 4)
        if (min < global_min)
            global_min = min
        endif
        say '   In year 'year' min/max 'var.nvar' are 'min'/'max' over the selected domain'
        'close 1'
        year = year + 1
    endwhile
    say
    say 'Global min/max 'var.nvar' are 'global_min'/'global_max' over the selected domain'
    say
    nvar = nvar + 1
endwhile


'quit'

******************************************************************

Willis, Paul ENV:EX wrote:

>My posting below was premature.  I am still experiencing the same problems
>when attempting to use the TEMPLATE option with the DDF.  Jennifer and
>Diane have been kindly working with me off-listserve but without
>resolution.
>
>here is what I know:
>
>My environment:     linux RH7.3
>
>GrADS install:      I reinstalled with pre-compiled binary exe.  no change.
>
>Data file corruption:  I downloaded a fresh set of data files.  no change.
>
>Hoop's 'sdfopen' merge:  see previous thread.  While this resolves the
>offset and missing value warnings, the environment is unstable and core
>dumps (segmentation faults) make it unworkable.
>
>My questions du jour:
>
>1) Has anyone had compatibility problems running RH7.3?
>
>2) Jennifer has 'grepped' the GrADS source code and has not found the source
>of this warning:
>
>"-- using -1e+15 insteade data type for "missing_value"
>
>Does this look familiar to anyone? Note, I have not messed with the GrADS
>source code on my machine.
>
>3)  Suggestions for further debugging of either the offset problem when
>using the DDF, or with segmentation faults when using the 'sdfopen' merge?
>
>
>Thanks.
>
>Paul Willis
>
>
>
>On Sun, 21 May 2006 01:25:35 +0200, Willis, Paul ENV:EX
><paul.willis at GOV.BC.CA> wrote:
>
>
>
>>Jennifer, et al.,
>>
>>As it turns out, the original descriptor file that I sent was fine, as you
>>and Diane have determined.  Upon my debugging of another problem, I
>>discovered I had incorrectly defined my $GADDIR environment variable.  I
>>have since tested the descriptor file without problem.
>>
>>I am a GrADS newbie and have a lot to learn.  My apologies that you all
>>spent some time on what turned out to be pilot error.
>>
>>By the way, I have successfully followed Hoop's 'sdfopen' method.  However,
>>yes, I have experienced a number of core dumps.  I am too inexperienced to
>>make an assessment; it could be nothing to do with this method and more to
>>do with some other misconfiguration on my part.  But I will try the two
>>different methods over the course of the next two weeks and may have some
>>comments of value.
>>
>>Thanks.
>>
>>-pw
>>
>>
>>On Sat, 20 May 2006 16:14:58 -0400, Jennifer M. Adams <jma at COLA.IGES.ORG>
>>wrote:
>>
>>
>>
>>>Dear Paul, et al.,
>>>OK, so you could read this file with sdfopen and do it the way Hoop
>>>suggests. But if templating with sdfopen often leads to core dumps on
>>>your operating system, then it's perfectly reasonable to do it with a
>>>'dtype netcdf' descriptor. User's choice. My greater concern is why
>>>the 'dtype netcdf' interface isn't retrieving the 'add_offset' or the
>>>'missing_value' attributes properly for you.
>>>
>>>I downloaded a NetCDF file from CDC -- daily means of SLP for 1999
>>>with the identical ncdump output -- and used your descriptor file
>>>with this data set and it read the slp data on my mac without any
>>>problems at all:
>>>
>>>ga-> open /data/netcdf/rean/slp.ctl
>>>Scanning description file:  /data/netcdf/rean/slp.ctl
>>>Data file /data/netcdf/rean/slp.%y4.nc is open as file 1
>>>LON set to 0 360
>>>LAT set to -90 90
>>>LEV set to 1 1
>>>Time values set: 1999:1:1:0 1999:1:1:0
>>>ga-> q file
>>>File 1 : NCEP Reanalysis Daily Averages
>>>  Descriptor: /data/netcdf/rean/slp.ctl
>>>  Binary: /data/netcdf/rean/slp.%y4.nc
>>>  Type = Gridded
>>>  Xsize = 144  Ysize = 73  Zsize = 1  Tsize = 731
>>>  Number of Variables = 1
>>>    slp 0 -103 mean Daily Sea Level Pressure [Pa]
>>>ga-> d slp
>>>Contouring: 97000 to 105000 interval 1000
>>>ga->
>>>
>>>
>>>The attributes of this file are exactly the same, so ... why doesn't
>>>it work for you???
>>>
>>>On May 18, 2006, at 8:42 PM, Willis, Paul ENV:EX wrote:
>>>
>>>
>>>>Running GrADS 1.9b4 on linux.
>>>>
>>>>
>>>Which executable are you using:  gradsnc, gradshdf, or gradsdods?
>>>Which linux: RH7.1, RH9, or RHE3?
>>>
>>> Jennifer
>>>
>>>
>>>
>>>>Thanks in advance for any guidance.
>>>>
>>>>-paul
>>>>*********************
>>>>
>>>>
>>>>ga-> q dim
>>>>Default file number is: 1
>>>>X is varying   Lon = 0 to 360   X = 1 to 145
>>>>Y is varying   Lat = -90 to 90   Y = 0.999999 to 73
>>>>Z is fixed     Lev = 1  Z = 1
>>>>T is fixed     Time = 00Z01JAN1999  T = 1
>>>>ga-> d slp
>>>>"  -- using -1e+15 insteade data type for "missing_value
>>>>" not handled
>>>>-- using -1e+15 instead
>>>>
>>>>Warning: Could not retrieve add offset data type -- setting add offset to
>>>>0.0
>>>>
>>>>Contouring: -20000 to -16500 interval 500
>>>>*********************
>>>>here is my descriptor file
>>>>*********************
>>>>dset /home/pfwillis/2006/Grads/circulation/Climate/data/slp_daily.%y4.nc
>>>>dtype netcdf
>>>>options yrev template
>>>>undef -1e15 missing_value
>>>>unpack scale_factor add_offset
>>>>title NCEP Reanalysis Daily Averages
>>>>xdef 144 linear 0 2.5
>>>>ydef 73 linear -90 2.5
>>>>zdef   1 linear 1 1
>>>>tdef 731 linear 00Z01jan1999 24hr
>>>>vars 1
>>>>slp 0 t,y,x mean Daily Sea Level Pressure [Pa]
>>>>endvars
>>>>
>>>>
>>>>*********************
>>>>should an ncdump provide useful information on this, see below.
>>>>*********************
>>>>
>>>>
>>>>netcdf slp_daily.1999 {
>>>>dimensions:
>>>>    lon = 144 ;
>>>>    lat = 73 ;
>>>>    time = UNLIMITED ; // (365 currently)
>>>>variables:
>>>>    float lat(lat) ;
>>>>        lat:units = "degrees_north" ;
>>>>        lat:actual_range = 90.f, -90.f ;
>>>>        lat:long_name = "Latitude" ;
>>>>    float lon(lon) ;
>>>>        lon:units = "degrees_east" ;
>>>>        lon:long_name = "Longitude" ;
>>>>        lon:actual_range = 0.f, 357.5f ;
>>>>    double time(time) ;
>>>>        time:units = "hours since 1-1-1 00:00:0.0" ;
>>>>        time:long_name = "Time" ;
>>>>        time:actual_range = 17514144., 17522880. ;
>>>>        time:delta_t = "0000-00-01 00:00:00" ;
>>>>        time:avg_period = "0000-00-01 00:00:00" ;
>>>>    short slp(time, lat, lon) ;
>>>>        slp:long_name = "mean Daily Sea Level Pressure" ;
>>>>        slp:valid_range = 87000.f, 115000.f ;
>>>>        slp:actual_range = 92970.f, 110552.f ;
>>>>        slp:units = "Pascals" ;
>>>>        slp:add_offset = 119765.f ;
>>>>        slp:scale_factor = 1.f ;
>>>>        slp:missing_value = 32766s ;
>>>>        slp:precision = 0s ;
>>>>        slp:least_significant_digit = -1s ;
>>>>        slp:GRIB_id = 2s ;
>>>>        slp:GRIB_name = "PRMSL" ;
>>>>        slp:var_desc = "Sea Level Pressure\n", "P" ;
>>>>        slp:dataset = "NCEP Reanalysis Daily Averages\n", "AJ" ;
>>>>        slp:level_desc = "Sea Level\n", "I" ;
>>>>        slp:statistic = "Mean\n", "M" ;
>>>>        slp:parent_stat = "Individual Obs\n", "I" ;
>>>>
>>>>// global attributes:
>>>>        :Conventions = "COARDS" ;
>>>>        :title = "mean daily NMC reanalysis (1999)" ;
>>>>        :base_date = 1999s, 1s, 1s ;
>>>>        :history = "created 99/01/04 by Hoop (netCDF2.3)" ;
>>>>        :description = "Data is from NMC initialized reanalysis\n",
>>>>            "(4x/day).  It consists of most variables interpolated to\n",
>>>>            "pressure surfaces from model (sigma) surfaces." ;
>>>>        :platform = "Model" ;
>>>>
>>>>data:
>>>>
>>>> lat = .......
>>>>
>>>>
>>>>
>>>Jennifer M. Adams
>>>IGES/COLA
>>>4041 Powder Mill Road, Suite 302
>>>Beltsville, MD 20705
>>>jma at cola.iges.org
>>>
>>>
>
>
>
