Return to the Index.



Downloadable data files

Raw HSRL data is processed on request. You can specify the altitude range, time period, resolution, and types of data. After processing, files created with the "File_mode=single" setting appear as a link on the web page and remain available to download for 24 hours. Processing time depends on the volume of data requested; for small requests (a few hours or less), files are likely to appear within a minute or so.

Other "File_mode" selections allow creation of multiple output files with fixed time durations, or at the times of selected satellite overpasses. With these selections, data are returned via our public ftp page, ftp://lidar.ssec.wisc.edu/data/, in a sub-directory named with the user name you supplied. A unique computer-generated file name is created for each request. Files appear on the ftp page as they are created and opened for writing; note that for large files it is possible to download a file before processing is complete, in which case you will obtain an incomplete file. When all of the requested files are complete, they are combined into a single compressed "*.tar.bz2" file.

All processing is done on our local machine, so please don't abuse the system. When possible, make large requests (a month or more of data) outside our normal 8-5 CDT working hours.

NetCDF

Data files are created in the NetCDF (Network Common Data Form) format. This is a machine-independent data format that supports the creation, access, and sharing of array-oriented scientific data. After a file has been created, a "DUMP" button appears on the web page that can be used to display the file's contents. The file header provides names, descriptions, and units for all variables included in the file. More information and support routines for this format can be found at http://www.unidata.ucar.edu/software/netcdf.

Utility Code

Times: In the exported NetCDF file, every absolute time value is stored in two ways; either is acceptable for use in code:
- as a float offset in seconds from the file's base time, which is itself given in seconds from the Unix epoch. This follows the COARDS convention and is convenient for most code, but it is not very human-readable.
- as a vector of 8 values: Year, Month, Day, Hour, Minute, Second, Millisecond, and Microsecond. While human-readable and quite adaptable, these values are not convenient for most code. Below are pairs of utility functions in several languages that convert these vectors to native values and back. All functions convert without loss and return a native representation of the time, in GMT if the system is aware of timezones.

- C: as a POSIX timespec in GMT
- Python: as GMT seconds (as used in the time module)
- Matlab: as a datenum (not timezone-aware)
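As an illustration, a Python pair along these lines can be written with only the standard library (a minimal sketch; the function names are illustrative, not those of the distributed utilities):

```python
# Convert the 8-element time vector [Year, Month, Day, Hour, Minute,
# Second, Millisecond, Microsecond] to GMT epoch seconds and back.
import calendar
import time

def vector_to_epoch(v):
    """[Y, M, D, h, m, s, ms, us] -> float seconds since the Unix epoch (GMT)."""
    year, month, day, hour, minute, second, ms, us = v
    whole = calendar.timegm((year, month, day, hour, minute, second, 0, 0, 0))
    return whole + ms / 1e3 + us / 1e6

def epoch_to_vector(t):
    """Float GMT epoch seconds -> [Y, M, D, h, m, s, ms, us]."""
    whole = int(t)
    frac = round((t - whole) * 1e6)  # total microseconds in the fraction
    st = time.gmtime(whole)
    return [st.tm_year, st.tm_mon, st.tm_mday,
            st.tm_hour, st.tm_min, st.tm_sec,
            frac // 1000, frac % 1000]

v = [2009, 7, 14, 12, 30, 5, 250, 125]
assert epoch_to_vector(vector_to_epoch(v)) == v  # round trip without loss
```

Double-precision floats carry roughly microsecond resolution at epoch-scale magnitudes, so the round trip is lossless for the values stored in these files.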


Batch Mode

Scripted downloads: If you want to download a NetCDF file with your own local scripts or web forms, set the CGI variable "Direct" to "on" when making the request, and set a long HTTP connection timeout. This bypasses the intermediate pages, so the URL you request resolves to the NetCDF file itself: the server uses the CGI "Location" header to forward from the intermediate page, whose only role is to wait until the final file is ready. The command-line utility "wget" handles this directly, while "curl" requires the -L option to follow the redirect. Please use this sparingly, and only if you know what you are doing, because it can place a heavy load on our server. When testing code, try small datasets first to verify that it retrieves data properly.
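For example, a request URL with "Direct" enabled can be assembled in Python as follows (a sketch only: the CGI endpoint and all parameter names other than "Direct" are hypothetical here; take the real field names from the actual web form):

```python
# Build a batch-mode request URL with Direct=on appended,
# so the CGI returns the NetCDF file itself.
from urllib.parse import urlencode

def build_direct_url(base_cgi, params):
    """Return base_cgi?...&Direct=on for scripted NetCDF downloads."""
    query = dict(params, Direct="on")
    return base_cgi + "?" + urlencode(query)

url = build_direct_url(
    "https://lidar.ssec.wisc.edu/cgi-bin/hsrl.cgi",   # hypothetical endpoint
    {"start": "14-Jul-2009 00:00", "end": "14-Jul-2009 06:00"},  # hypothetical fields
)
```

The resulting URL can then be fetched with wget, curl -L, or urllib.request.urlopen with a generous timeout, since the server holds the connection open until processing finishes.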


jpgarcia@lidar.ssec.wisc.edu