[gradsusr] Data download help
James T. Potemra
jimp at hawaii.edu
Tue Jul 9 13:49:33 EDT 2013
Emily:
The files in the URL below are served via ftp, so this is not really a
GrADS issue. Instead, you can use "wget" to retrieve all the files,
then "ncrcat" to concatenate them all together. For example,
wget -r -A.nc
ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis.dailyavgs/surface/
will retrieve all the files on that page that end with ".nc". Next,
ncrcat -h air.sig995.*.nc one_big_file.nc
will concatenate all the individual files into one big file. "ncrcat" is
part of the netCDF Operators (NCO) toolkit; more info at
http://nco.sourceforge.net/
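The two steps can be put together in a small script. This is just a
sketch of the commands above, assuming wget and the NCO tools are
installed; the one addition is wget's "-nd" flag, which flattens the
remote directory tree into the current directory (without it, wget
mirrors the full ftp.cdc.noaa.gov/Datasets/... path and the ncrcat
glob would have to point into that subdirectory):

```shell
#!/bin/sh
# Sketch: mirror every ".nc" file from the FTP directory, then
# concatenate them along the record (time) dimension with ncrcat.
# Assumes wget and the NCO tools are on the PATH.
set -eu

fetch_and_concat() {
    # -r: recurse, -A ".nc": accept only files ending in ".nc",
    # -nd: do not recreate the remote directory tree, so the
    # downloaded files land in the current directory.
    wget -r -nd -A ".nc" \
        "ftp://ftp.cdc.noaa.gov/Datasets/ncep.reanalysis.dailyavgs/surface/"

    # The shell expands the glob in lexical order, so yearly files
    # such as air.sig995.1948.nc, air.sig995.1949.nc, ... are passed
    # to ncrcat in chronological order automatically. -h keeps the
    # command from being appended to the output's history attribute.
    ncrcat -h air.sig995.*.nc one_big_file.nc
}

# Uncomment to run (downloads several GB):
# fetch_and_concat
```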
Jim
On 7/9/13 7:09 AM, Emily Wilson wrote:
> Hello All,
>
> I want to write a GrADS script in emacs that downloads a bunch
> of NetCDF files from a website and compresses them into one large
> file. The website is
> http://www.esrl.noaa.gov/psd/cgi-bin/db_search/DBListFiles.pl?did=33&tid=38147&vid=668
> and I want to download all of the NetCDF files on this page. What
> commands can I use to do this task? If this is not possible, could
> a script be written so that each file on the webpage is read
> individually, specific data is pulled from each file and written or
> saved to a master file, and then the same is done for the next file,
> and so on?
> Thanks for the help in advance,
> *
> Emily P. Wilson, Intern*
> Research and Conservation Department
> Denver Botanic Gardens
> 1007 York St.
> Denver, CO 80206
> 720-865-3593
>
>
> _______________________________________________
> gradsusr mailing list
> gradsusr at gradsusr.org
> http://gradsusr.org/mailman/listinfo/gradsusr