Alan's Unified Model file utilities

These are in, roughly speaking, decreasing order of usefulness/sophistication.

convancil

Converts ancillary and dump files from 32-bit -> 64-bit, or from 64-bit -> 32-bit. Also unpacks any fields which are packed with the WGDOS or Cray 32-bit methods (new functionality, 28 Jan 2004; bugfix 23 March 2004; further bugfix 21 June 2006).

Build:

Usage:

Notes:


UM file subsetting utility

Description

Subsets UM ancillary / dump / fields files by writing out a new file, still in the UM file format, containing only the desired PP records. In almost all cases the purpose is simply to produce a smaller file. Supports files in either little- or big-endian byte ordering, and at either 32-bit or 64-bit precision (via different executables).

The following record selection criteria are currently supported (see the usage message below): instantaneous fields only (the default), interactive prompting for each record, all records (header clean-up only), or all fields from a given STASH section.

Build

Usage

A command-line utility. The usage message from the 64-bit version follows; the 32-bit version is used identically via the other executable. (Note that the precision and byte-ordering options must match the input file, and the output file will have the same; this is not the utility for converting either of these. Expect to see many errors if you get this wrong.)

     subset_um_64 -- subsets records in a UM file (64-bit version)
     
     Usage: subset_um_64 [options] input_file output_file
     
     Options are:
     
        record selection options:
     
          -I  keep instantaneous fields [default]
          -P  prompt for each record
          -A  include all records (just removes unused headers, like 'umpack')
          -S SECT  include fields from stash section SECT (e.g. 0 for prognostic)
     
          (do not specify multiple criteria; only the last will have effect)
     
        byte-ordering options:
     
         -b  big-endian
         -l  little-endian
         -n  native [default]
         -s  swapped (non-native)
     
       other options:
     
         -c  clobber existing output file
         -v  be more verbose
         -h  display this help message
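
For example, to write out only the STASH section 0 (prognostic) fields from a dump, verbosely, clobbering any existing output file (the filenames here are just placeholders):

     % subset_um_64 -S 0 -c -v input.dump output.dump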

Notes


IDL routines for ancillary files

Sorry, these require the distinctly non-free package IDL.

Reading/writing

rdancil.pro: read an ancillary file into IDL arrays.

wrancil.pro: write an ancillary file from IDL arrays.

Usage:

     wrancil,filename, $
                fixhdr,intc,realc,levdepc,rowdepc,coldepc,fieldc,extrac, $
                temphist,cfi1,cfi2,cfi3,lookup,data, $
                [/swap],[bits=bits]

     rdancil -- same syntax

(where bits may be 64 (the default) or 32, and "/swap" is for byte-swapped files).

New: as an alternative, there are versions which return "data" as an array of pointers (one per record, each pointing to that record's array of values) rather than as one long array of values. See rdancil2.pro, wrancil2.pro.

Interpreting PP header info

pparray_to_struct.pro -- generate a structure with named fields from the "lookup" array of PP headers (requires rawconvert.pro).

struct_to_pparray.pro -- the reverse operation (requires rawconvert.pro and tagnum.pro).

showlookup.pro -- display a few key fields from the lookup table.

Example: "rewind" all data times in a 32-bit restart dump by 100 years

     rdancil,'xabtea.daj4310', $
                fixhdr,intc,realc,levdepc,rowdepc,coldepc,fieldc,extrac, $
                temphist,cfi1,cfi2,cfi3,lookup,data, $
                bits=32

     ;; change date in fixed-length header
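     ;; (IDL indices 20 and 27 are fixed-length-header words 21 and 28 in
     ;;  1-based numbering, i.e. the first and last validity-time years)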
     fixhdr[20]=fixhdr[20]-100
     fixhdr[27]=fixhdr[27]-100

     ;; change PP headers
     s=pparray_to_struct(lookup)
     ok=where(s.lbyr ne -32768)
     s.lbyr[ok]=s.lbyr[ok]-100
     s.lbyrd[ok]=s.lbyrd[ok]-100
     lookup=struct_to_pparray(s)

     wrancil,'xabtea.da94310', $
                fixhdr,intc,realc,levdepc,rowdepc,coldepc,fieldc,extrac, $
                temphist,cfi1,cfi2,cfi3,lookup,data, $
                bits=32

Making heat convergence ancillary files

A program to do this, built on rdancil.pro and wrancil.pro above, is documented on a separate page.

Zonal meaning and spreading

zonal_ancil.pro: write an ancillary file based on input ancillary, but with either 2D fields spread to 3D, or with 3D fields zonally averaged to 2D (depending on options used).

Usage: see comment lines at top of script.

(NB requires all above scripts.)


Filename date code-handling utilities

umtodate / datetoum / sortum / umdate

umtodate reports the date corresponding to a UM filename.
umdate (new) reports the dates in a UM ancillary file header.
sortum sorts a list of UM filenames, with correct chronological order for monthly or seasonal mean files.
datetoum turns a date into a UM date code.

Download: umtodate, umdate, datetoum, sortum (and add execute permissions).

Usage:

Notes:

Example:

   % ls
   xabbqa.dah8cg0  xabbqa.dah9210     xabbqa.pmh9apr.nc  xabbqa.pmh9mar.nc
   xabbqa.dah9110  xabbqa.dah92g0     xabbqa.pmh9feb.nc  xabbqa.pmh9may.nc
   xabbqa.dah91g0  xabbqa.pmh8dec.nc  xabbqa.pmh9jan.nc
   
   % sortum *
   xabbqa.dah8cg0
   xabbqa.dah9110
   xabbqa.dah91g0
   xabbqa.dah9210
   xabbqa.dah92g0
   xabbqa.pmh8dec.nc
   xabbqa.pmh9jan.nc
   xabbqa.pmh9feb.nc
   xabbqa.pmh9mar.nc
   xabbqa.pmh9apr.nc
   xabbqa.pmh9may.nc
   
   % ls | sortum | umtodate 
   xabbqa.dah8cg0: 00:00, 16 Dec 1978
   xabbqa.dah9110: 00:00,  1 Jan 1979
   xabbqa.dah91g0: 00:00, 16 Jan 1979
   xabbqa.dah9210: 00:00,  1 Feb 1979
   xabbqa.dah92g0: 00:00, 16 Feb 1979
   xabbqa.pmh8dec.nc: dec 1978
   xabbqa.pmh9jan.nc: jan 1979
   xabbqa.pmh9feb.nc: feb 1979
   xabbqa.pmh9mar.nc: mar 1979
   xabbqa.pmh9apr.nc: apr 1979
   xabbqa.pmh9may.nc: may 1979

Example 2:

  % ls
  xaaqdo@pdw14c1  xaaqdo@pdw15c1

  % umdate *
  xaaqdo@pdw14c1:  First 1991-09-01 00:00:00,  Last 3213-12-01 01:00:00,  Interval 2004-01-25 01:01:00
  xaaqdo@pdw15c1:  First 1991-09-01 00:00:00,  Last 3214-12-01 01:00:00,  Interval 2004-01-25 07:15:47

um2nc

Convert UM dumps to NetCDF

There are various versions of this wrapper around convsh "out there". This one also supports creating zonal means, and extracting a single named field. (It also uses long field names based on the STASH code, as per xconv, rather than the PP code, which is the convsh default.)

Download: um2nc

Usage:

(In case you still want it, here's the previous version.)


lastleave

Returns the full pathname of the most recent UM leave file(s) (optionally restricted to filenames starting with given stems), and optionally extracts some information from them (timesteps, run time).

Download: lastleave

Usage:

Examples:


mkcrun

Edits the SUBMIT script of a processed UMUI job, changing TYPE=NRUN into TYPE=CRUN.

Download: mkcrun

Usage: mkcrun jobid
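
For reference, the edit it makes is equivalent to something like the following (a sketch only: the location of the processed job's SUBMIT script is an assumption and depends on your UMUI set-up):

   # NB: the SUBMIT path below is an assumption; adjust to where the UMUI
   # puts your processed job
   % sed -i 's/TYPE=NRUN/TYPE=CRUN/' ~/umui_runs/jobid-*/SUBMIT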


Modified xconv / convsh

This section is almost entirely redundant due to the subsequent release of later versions of xconv, but is kept here at the end of the page just in case it is of any remaining use (unlikely).

NOTE: the following files are for xconv 1.05. A better version, 1.90, is available from Jeff Cole, but is currently (Jan '04) pre-release so is not on the web. Please contact jeff AT met.rdg.ac.uk

These static executables for Linux, xconv and convsh, are built with this patch, which causes the "unlimited" flag to be set on the time dimension when writing NetCDF files. This permits time-domain processing (e.g. concatenation) using the NCO utilities (see description and source).
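
For example, NetCDF files written this way can be concatenated along their (now unlimited) time dimension with NCO's ncrcat; the filenames here are just placeholders:

   % ncrcat jan.nc feb.nc mar.nc out.nc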

Usage: as per unmodified versions.

(NB: despite being static executables, these require Tcl/Tk version 4.3 to be installed.)


Last edited: 1 February 2007
Alan Iwi <A.M.Iwi@rl.ac.uk>