CSET Operators

This page details the operators contained within CSET. It is automatically generated from the code and its docstrings. Operators should be used via Run an Operator Recipe.

Generic Operators

CSET.operators.aggregate

Operators to aggregate across either 1 or 2 dimensions.

CSET.operators.aggregate.time_aggregate(cube: Cube, method: str, interval_iso: str, **kwargs) Cube

Aggregate cube by its time coordinate.

Aggregates similar (stash) fields in a cube over the specified coordinate using the supplied method. The aggregated cube will keep the coordinate and add a further coordinate with the aggregated end time points.

For example, generating hourly or 6-hourly precipitation accumulations given an interval for the new time coordinate.

We use the isodate package to convert ISO 8601 durations into time intervals for creating a new time coordinate for aggregation.

A lambda function is used to pass coord and interval into the callable category function in add_categorised, allowing users to define their own sub-daily intervals for the new time coordinate.

Parameters:
  • cube (iris.cube.Cube) – Cube to aggregate, iterating over one dimension.

  • coordinate (str) – Coordinate to aggregate over, e.g. ‘time’, ‘longitude’, ‘latitude’, ‘model_level_number’.

  • method (str) – Type of aggregation, e.g. method: ‘SUM’; getattr is used to create iris.analysis.SUM, etc.

  • interval_iso (isodate timedelta ISO 8601 object, e.g. PT6H (6 hours), PT30M (30 mins)) – Interval to aggregate over.

Returns:

cube – Single variable aggregated over the requested interval using the chosen method

Return type:

iris.cube.Cube

Raises:

ValueError – If the constraint doesn’t produce a single cube containing a field.
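
As an illustrative sketch only (the cube name precip_cube and the use of coordinate as a keyword argument are assumptions based on the parameter list above, not a confirmed calling convention), the operator could be invoked directly from Python as:

>>> from CSET.operators import aggregate
>>> accum_6h = aggregate.time_aggregate(
...     precip_cube, method="SUM", interval_iso="PT6H", coordinate="time")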

CSET.operators.collapse

Operators to perform various kinds of collapse on either 1 or 2 dimensions.

CSET.operators.collapse.collapse(cube: Cube, coordinate: str | list[str], method: str, additional_percent: float = None, **kwargs) Cube

Collapse coordinate(s) of a cube.

Collapses similar (stash) fields in a cube, collapsing around the specified coordinate(s) using the given method. This could be a (weighted) mean or a percentile.

Parameters:
  • cube (iris.cube.Cube) – Cube to collapse, iterating over one dimension.

  • coordinate (str | list[str]) – Coordinate(s) to collapse over e.g. ‘time’, ‘longitude’, ‘latitude’, ‘model_level_number’, ‘realization’. A list of multiple coordinates can be given.

  • method (str) – Type of collapse, e.g. method: ‘MEAN’, ‘MAX’, ‘MIN’, ‘MEDIAN’, ‘PERCENTILE’; getattr is used to create iris.analysis.MEAN, etc. For ‘PERCENTILE’ the recipe YAML file also requires additional_percent, e.g. method: ‘PERCENTILE’, additional_percent: 90.

Returns:

cube – Cube with the specified coordinate(s) collapsed using the chosen method

Return type:

iris.cube.Cube

Raises:

ValueError – If additional_percent wasn’t supplied while using the PERCENTILE method.
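
A minimal sketch of direct use, assuming a hypothetical temperature_cube has already been read:

>>> from CSET.operators import collapse
>>> time_mean = collapse.collapse(temperature_cube, "time", "MEAN")
>>> time_p90 = collapse.collapse(
...     temperature_cube, "time", "PERCENTILE", additional_percent=90)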

CSET.operators.constraints

Operators to generate constraints to filter with.

CSET.operators.constraints.combine_constraints(constraint: Constraint = None, **kwargs) Constraint

Operator that combines multiple constraints into one.

Parameters:
  • constraint (iris.Constraint) – First constraint to combine.

  • additional_constraint_1 (iris.Constraint) – Second constraint to combine. This must be a named argument.

  • additional_constraint_2 (iris.Constraint) – There can be any number of additional constraints; they just need unique names.

  • ...

Returns:

combined_constraint

Return type:

iris.Constraint

Raises:

TypeError – If the provided arguments are not constraints.
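
A hedged sketch of combining constraints (the variable name and times below are illustrative, not taken from CSET):

>>> from CSET.operators import constraints
>>> var_con = constraints.generate_var_constraint("air_temperature")
>>> time_con = constraints.generate_time_constraint(
...     "2023-03-24T00:00:00", "2023-03-24T06:00:00")
>>> combined = constraints.combine_constraints(
...     var_con, additional_constraint_1=time_con)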

CSET.operators.constraints.generate_area_constraint(lat_start: float, lat_end: float, lon_start: float, lon_end: float, **kwargs) Constraint

Generate an area constraint between latitude/longitude limits.

Operator that takes a set of latitude and longitude limits and returns a constraint that selects grid values only inside that area. It works with the data’s native grid, so the limits are defined within the rotated-pole CRS.

Parameters:
  • lat_start (float) – Latitude value for lower bound

  • lat_end (float) – Latitude value for top bound

  • lon_start (float) – Longitude value for left bound

  • lon_end (float) – Longitude value for right bound

Returns:

area_constraint

Return type:

iris.Constraint
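
An illustrative call (the limits below are arbitrary and must be given in the data’s native, possibly rotated-pole, coordinates):

>>> from CSET.operators import constraints
>>> area_con = constraints.generate_area_constraint(
...     lat_start=-2.5, lat_end=2.5, lon_start=355.0, lon_end=360.0)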

CSET.operators.constraints.generate_cell_methods_constraint(cell_methods: list, **kwargs) Constraint

Generate constraint from cell methods.

Operator that takes a list of cell methods and generates a constraint from that.

Parameters:

cell_methods (list) – cube.cell_methods for filtering

Returns:

cell_method_constraint

Return type:

iris.Constraint

CSET.operators.constraints.generate_model_level_constraint(model_level_number: int | str, **kwargs) Constraint

Generate constraint for a particular model level number.

Operator that takes a CF compliant model_level_number string and uses iris to generate a constraint. Passing this constraint into the read operator minimizes the CubeList the read operator loads and speeds up loading.

Parameters:

model_level_number (int | str) – CF compliant model level number.

Returns:

model_level_number_constraint

Return type:

iris.Constraint

CSET.operators.constraints.generate_pressure_level_constraint(pressure_levels: int | list[int], **kwargs) Constraint

Generate constraint for the specified pressure_levels.

If no pressure levels are specified then any cube with a pressure coordinate is rejected.

Parameters:

pressure_levels (int | list) – Pressure level(s) in hPa, either a single integer for one level or a list of integers for multiple levels.

Returns:

pressure_constraint

Return type:

iris.Constraint
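
For example, assuming levels of 850, 500 and 250 hPa are wanted (values chosen purely for illustration):

>>> from CSET.operators import constraints
>>> plev_con = constraints.generate_pressure_level_constraint([850, 500, 250])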

CSET.operators.constraints.generate_stash_constraint(stash: str, **kwargs) AttributeConstraint

Generate constraint from STASH code.

Operator that takes a stash string and uses iris to generate a constraint. Passing this constraint into the read operator minimizes the CubeList the read operator loads and speeds up loading.

Parameters:

stash (str) – STASH code to build the iris constraint from, such as “m01s03i236”

Returns:

stash_constraint

Return type:

iris.AttributeConstraint
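
Using the STASH code from the parameter description above as an illustration:

>>> from CSET.operators import constraints
>>> stash_con = constraints.generate_stash_constraint("m01s03i236")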

CSET.operators.constraints.generate_time_constraint(time_start: str, time_end: str = None, **kwargs) AttributeConstraint

Generate constraint between times.

Operator that takes one or two ISO 8601 date strings, and returns a constraint that selects values between those dates (inclusive).

Parameters:
  • time_start (str | datetime.datetime) – ISO date for lower bound

  • time_end (str | datetime.datetime) – ISO date for upper bound. If omitted, it defaults to the same as time_start.

Returns:

time_constraint

Return type:

iris.Constraint

CSET.operators.constraints.generate_var_constraint(varname: str, **kwargs) Constraint

Generate constraint from variable name.

Operator that takes a CF compliant variable name string and uses iris to generate a constraint. Passing this constraint into the read operator minimizes the CubeList the read operator loads and speeds up loading.

Parameters:

varname (str) – CF compliant name of variable. Needed later for LFRic.

Returns:

varname_constraint

Return type:

iris.Constraint

CSET.operators.filters

Operators to perform various kind of filtering.

CSET.operators.filters.filter_cubes(cube: Cube | CubeList, constraint: Constraint, **kwargs) Cube

Filter a CubeList down to a single Cube based on a constraint.

Parameters:
  • cube (iris.cube.Cube | iris.cube.CubeList) – Cube(s) to filter

  • constraint (iris.Constraint) – Constraint to extract

Return type:

iris.cube.Cube

Raises:

ValueError – If the constraint doesn’t produce a single cube.
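
A sketch of typical use, pairing a generated constraint with a previously loaded CubeList (cubes here is a hypothetical variable):

>>> from CSET.operators import constraints, filters
>>> temp_con = constraints.generate_var_constraint("air_temperature")
>>> temperature = filters.filter_cubes(cubes, temp_con)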

CSET.operators.filters.filter_multiple_cubes(cubes: Cube | CubeList, **kwargs) CubeList

Filter a CubeList on multiple constraints, returning another CubeList.

Parameters:
  • cube (iris.cube.Cube | iris.cube.CubeList) – Cube(s) to filter

  • constraint (iris.Constraint) – Constraint to extract. This must be a named argument. There can be any number of additional constraints; they just need unique names.

Return type:

iris.cube.CubeList

Raises:

ValueError – If the constraints don’t produce exactly one cube per constraint.
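
An illustrative sketch; the keyword names (temperature, pressure) are arbitrary unique labels chosen for this example, and cubes is a hypothetical CubeList:

>>> from CSET.operators import constraints, filters
>>> temp_con = constraints.generate_var_constraint("air_temperature")
>>> pres_con = constraints.generate_var_constraint("air_pressure")
>>> filtered = filters.filter_multiple_cubes(
...     cubes, temperature=temp_con, pressure=pres_con)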

CSET.operators.misc

Miscellaneous operators.

CSET.operators.misc.addition(addend_1, addend_2)

Addition of two fields.

Parameters:
  • addend_1 (Cube) – Any field to have another field added to it.

  • addend_2 (Cube) – Any field to be added to another field.

Return type:

Cube

Raises:

ValueError, iris.exceptions.NotYetImplementedError – When the cubes are not compatible.

Notes

This is a simple operator designed for combination of diagnostics or creating new diagnostics by using recipes.

Examples

>>> field_addition = misc.addition(kinetic_energy_u, kinetic_energy_v)

CSET.operators.misc.division(numerator, denominator)

Division of two fields.

Parameters:
  • numerator (Cube) – Any field to have the ratio taken with respect to another field.

  • denominator (Cube) – Any field used to divide another field or provide the reference value in a ratio.

Return type:

Cube

Raises:

ValueError – When the cubes are not compatible.

Notes

This is a simple operator designed for combination of diagnostics or creating new diagnostics by using recipes.

Examples

>>> bowen_ratio = misc.division(sensible_heat_flux, latent_heat_flux)

CSET.operators.misc.multiplication(multiplicand, multiplier)

Multiplication of two fields.

Parameters:
  • multiplicand (Cube) – Any field to be multiplied by another field.

  • multiplier (Cube) – Any field by which another field is multiplied.

Return type:

Cube

Raises:

ValueError – When the cubes are not compatible.

Notes

This is a simple operator designed for combination of diagnostics or creating new diagnostics by using recipes.

Examples

>>> filtered_CAPE_ratio = misc.multiplication(CAPE_ratio, inflow_layer_properties)

CSET.operators.misc.noop(x, **kwargs)

Return its input without doing anything to it.

Useful for constructing diagnostic chains.

Parameters:

x (Any) – Input to return.

Returns:

x – The input that was given.

Return type:

Any

CSET.operators.misc.remove_attribute(cubes: Cube | CubeList, attribute: str | Iterable, **kwargs) CubeList

Remove a cube attribute.

If the attribute is not on the cube, the cube is passed through unchanged.

Parameters:
  • cubes (Cube | CubeList) – One or more cubes to remove the attribute from.

  • attribute (str | Iterable) – Name of attribute (or Iterable of names) to remove.

Returns:

cubes – CubeList of cube(s) with the attribute removed.

Return type:

CubeList

CSET.operators.misc.subtraction(minuend, subtrahend)

Subtraction of two fields.

Parameters:
  • minuend (Cube) – Any field to have another field subtracted from it.

  • subtrahend (Cube) – Any field to be subtracted from another field.

Return type:

Cube

Raises:

ValueError, iris.exceptions.NotYetImplementedError – When the cubes are not compatible.

Notes

This is a simple operator designed for combination of diagnostics or creating new diagnostics by using recipes. It can be used for model differences to allow for comparisons between the same field in different models or model configurations.

Examples

>>> model_diff = misc.subtraction(temperature_model_A, temperature_model_B)

CSET.operators.plot

Operators to produce various kinds of plots.

CSET.operators.plot.plot_line_series(cube: Cube, filename: str = None, series_coordinate: str = 'time', **kwargs) Cube

Plot a line plot for the specified coordinate.

The cube must be 1D.

Parameters:
  • cube (Cube) – Iris cube of the data to plot. It should have a single dimension.

  • filename (str, optional) – Name of the plot to write, used as a prefix for plot sequences. Defaults to the recipe name.

  • series_coordinate (str, optional) – Coordinate about which to make a series. Defaults to "time". This coordinate must exist in the cube.

Returns:

The original cube (so further operations can be applied).

Return type:

Cube

Raises:
  • ValueError – If the cube doesn’t have the right dimensions.

  • TypeError – If the cube isn’t a single cube.
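
A minimal sketch, assuming a 1D cube (for example a domain-mean time series produced by the collapse operator) is available as mean_cube:

>>> from CSET.operators import plot
>>> plot.plot_line_series(mean_cube, filename="temperature_series")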

CSET.operators.plot.postage_stamp_contour_plot(cube: Cube, filename: str = None, coordinate: str = 'realization', **kwargs) Cube

Plot postage stamp contour plots from an ensemble.

Deprecated. Use spatial_contour_plot with a stamp_coordinate argument instead.

Parameters:
  • cube (Cube) – Iris cube of data to be plotted. It must have a realization coordinate.

  • filename (pathlike, optional) – The path of the plot to write. Defaults to the recipe name.

  • coordinate (str) – The coordinate that becomes different plots. Defaults to “realization”.

Returns:

The original cube (so further operations can be applied)

Return type:

Cube

Raises:
  • ValueError – If the cube doesn’t have the right dimensions.

  • TypeError – If cube isn’t a Cube.

CSET.operators.plot.spatial_contour_plot(cube: Cube, filename: str = None, sequence_coordinate: str = 'time', stamp_coordinate: str = 'realization', **kwargs) Cube

Plot a spatial variable onto a map from a 2D, 3D, or 4D cube.

A 2D spatial field can be plotted, but if the sequence_coordinate is present then a sequence of plots will be produced. Similarly if the stamp_coordinate is present then postage stamp plots will be produced.

Parameters:
  • cube (Cube) – Iris cube of the data to plot. It should have two spatial dimensions, such as lat and lon, and may also have another one or two dimensions to be plotted sequentially and/or as postage stamp plots.

  • filename (str, optional) – Name of the plot to write, used as a prefix for plot sequences. Defaults to the recipe name.

  • sequence_coordinate (str, optional) – Coordinate about which to make a plot sequence. Defaults to "time". This coordinate must exist in the cube.

  • stamp_coordinate (str, optional) – Coordinate about which to plot postage stamp plots. Defaults to "realization".

Returns:

The original cube (so further operations can be applied).

Return type:

Cube

Raises:
  • ValueError – If the cube doesn’t have the right dimensions.

  • TypeError – If the cube isn’t a single cube.
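
A hedged usage sketch, assuming a cube with latitude, longitude and time dimensions is held in a hypothetical rainfall_cube:

>>> from CSET.operators import plot
>>> plot.spatial_contour_plot(rainfall_cube, filename="rainfall_map")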

CSET.operators.read

Operators for reading various types of files from disk.

exception CSET.operators.read.NoDataWarning

Warning that no data has been loaded.

CSET.operators.read.read_cube(loadpath: Path, constraint: Constraint = None, filename_pattern: str = '*', **kwargs) Cube

Read a single cube from files.

Read operator that takes a path string (can include wildcards), and uses iris to load the cube matching the constraint.

If the loaded data is split across multiple files, a filename_pattern can be specified to select the read files using Unix shell-style wildcards. In this case the loadpath should point to the directory containing the data.

Ensemble data can also be loaded. If it has a realization coordinate already, it will be directly used. If not, it will have its member number guessed from the filename, based on one of several common patterns. For example the pattern emXX, where XX is the realization.

Deterministic data will be loaded with a realization of 0, allowing it to be processed in the same way as ensemble data.

Parameters:
  • loadpath (pathlike) – Path to where .pp/.nc files are located

  • constraint (iris.Constraint | iris.ConstraintCombination) – Constraints to filter data by

  • filename_pattern (str, optional) – Unix shell-style pattern to match filenames to. Defaults to “*”

Returns:

cube – The loaded cube

Return type:

iris.cube.Cube

Raises:
  • FileNotFoundError – If the provided path does not exist

  • ValueError – If the constraint doesn’t produce a single cube.
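
A sketch of reading a single cube; the path and variable name are illustrative only:

>>> from CSET.operators import constraints, read
>>> temp_con = constraints.generate_var_constraint("air_temperature")
>>> cube = read.read_cube(
...     "/path/to/data", constraint=temp_con, filename_pattern="*.nc")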

CSET.operators.read.read_cubes(loadpath: Path, constraint: Constraint = None, filename_pattern: str = '*', **kwargs) CubeList

Read cubes from files.

Read operator that takes a path string (can include wildcards), and uses iris to load all the cubes matching the constraint, returning them as a CubeList object.

If the loaded data is split across multiple files, a filename_pattern can be specified to select the read files using Unix shell-style wildcards. In this case the loadpath should point to the directory containing the data.

Ensemble data can also be loaded. If it has a realization coordinate already, it will be directly used. If not, it will have its member number guessed from the filename, based on one of several common patterns. For example the pattern emXX, where XX is the realization.

Deterministic data will be loaded with a realization of 0, allowing it to be processed in the same way as ensemble data.

Parameters:
  • loadpath (pathlike) – Path to where .pp/.nc files are located

  • constraint (iris.Constraint | iris.ConstraintCombination, optional) – Constraints to filter data by

  • filename_pattern (str, optional) – Unix shell-style pattern to match filenames to. Defaults to “*”

Returns:

cubes – Cubes loaded

Return type:

iris.cube.CubeList

Raises:

FileNotFoundError – If the provided path does not exist

CSET.operators.regrid

Operators to regrid cubes.

CSET.operators.regrid.regrid_onto_cube(incube: Cube, target: Cube, method: str, **kwargs) Cube

Regrid a cube, projecting onto a target cube.

Cube must have at least 2 dimensions.

Parameters:
  • incube (Cube) – An iris cube of the data to regrid. As a minimum, it needs to be 2D with latitude and longitude coordinates.

  • target (Cube) – An iris cube of the data to regrid onto. It needs to be 2D with latitude and longitude coordinates.

  • method (str) – Method used to regrid, e.g. ‘Linear’, which will use iris.analysis.Linear()

Returns:

An iris cube of the data that has been regridded.

Return type:

iris.cube.Cube

Raises:
  • ValueError – If a unique x/y coordinate cannot be found

  • NotImplementedError – If the cube’s grid, or the method for regridding, is not yet supported.

Notes

Currently rectilinear (uniform) grids are supported.
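
An illustrative sketch, where model_cube and target_cube are hypothetical cubes on different rectilinear grids:

>>> from CSET.operators import regrid
>>> regridded = regrid.regrid_onto_cube(model_cube, target_cube, method="Linear")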

CSET.operators.regrid.regrid_onto_xyspacing(incube: Cube, xspacing: int, yspacing: int, method: str, **kwargs) Cube

Regrid cube onto a set x,y spacing.

Regrid the cube onto the specified x and y spacing using the given method.

Parameters:
  • incube (Cube) – An iris cube of the data to regrid. As a minimum, it needs to be 2D with latitude and longitude coordinates.

  • xspacing (integer) – Spacing of points in longitude direction (could be degrees, meters etc.)

  • yspacing (integer) – Spacing of points in latitude direction (could be degrees, meters etc.)

  • method (str) – Method used to regrid, e.g. ‘Linear’, which will use iris.analysis.Linear()

Returns:

cube_rgd – An iris cube of the data that has been regridded.

Return type:

Cube

Raises:
  • ValueError – If a unique x/y coordinate cannot be found

  • NotImplementedError – If the cube’s grid, or the method for regridding, is not yet supported.

Notes

Currently rectilinear (uniform) grids are supported.
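
A minimal sketch, assuming a spacing of 1 unit in each direction is appropriate for the (hypothetical) model_cube:

>>> from CSET.operators import regrid
>>> coarse_cube = regrid.regrid_onto_xyspacing(
...     model_cube, xspacing=1, yspacing=1, method="Linear")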

CSET.operators.write

Operators for writing various types of files to disk.

CSET.operators.write.write_cube_to_nc(cube: Cube | CubeList, filename: str = None, overwrite: bool = False, **kwargs) str

Write a cube to a NetCDF file.

This operator expects an iris cube object that will then be saved to disk.

Parameters:
  • cube (iris.cube.Cube | iris.cube.CubeList) – Data to save.

  • filename (str, optional) – Path to save the cubes to. Defaults to the recipe title + .nc

  • overwrite (bool, optional) – Whether to overwrite an existing file. If False the filename will have a unique suffix added. Defaults to False.

Returns:

The input cube or CubeList (so further operations can be applied)

Return type:

Cube | CubeList
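
A hedged example, assuming a processed cube is held in a hypothetical result_cube (the filename is illustrative):

>>> from CSET.operators import write
>>> write.write_cube_to_nc(result_cube, filename="processed_fields.nc", overwrite=True)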

Convection Operators

CSET.operators.convection

A module containing different diagnostics for convection.

The diagnostics are calculated from output from the Unified Model, although precalculated values in the required input form may also be used.

CSET.operators.convection.cape_ratio(SBCAPE, MUCAPE, MUCIN, MUCIN_thresh=-75.0)

Ratio of two fields, one filtered to allow physical values to be output.

Parameters:
  • SBCAPE (Cube) – Surface-based convective available potential energy as calculated by the model. Stash: m01s20i114

  • MUCAPE (Cube) – Most-unstable convective available potential energy as calculated by the model. Stash: m01s20i112

  • MUCIN (Cube) – Most-unstable convective inhibition associated with the most-unstable ascent as calculated by the model. Stash: m01s20i113

  • MUCIN_thresh (float, optional, default is -75 J/kg) – Threshold to filter the MUCAPE by, so that only realistically realisable values are retained.

Return type:

Cube

Notes

This diagnostic is based on Clark et al. (2012) [1]. It is built around the idea that for elevated convection the convective instability is not based at the surface. This utilises two flavours of CAPE: the surface-based CAPE (SBCAPE) and the most-unstable CAPE (MUCAPE). The MUCAPE is filtered by the MUCIN associated with that parcel’s ascent to ensure that any CAPE can at least theoretically be released. The default value is set at -75 J/kg but it can be changed depending on location and user requirements.

\[1 - (\frac{SBCAPE}{MUCAPE})\]

The ratio is defined in this way so that if SBCAPE=MUCAPE the ratio will equal 1. If the ratio were reversed, it would be undefined when MUCAPE exists and SBCAPE is zero.

The diagnostic varies smoothly between zero and unity. A value of 0 implies an environment is suitable for surface-based convection. A value of 1 implies an environment is suitable for elevated convection. Values in between imply transitional convection, with values closer to one implying elevated convection is more likely and values closer to zero implying surface-based convection is more likely.

Further details about this diagnostic for elevated convection identification can be found in Flack et al. (2023) [2].

Expected applicability ranges: Convective-scale models will be noisier than parametrized models as they are more responsive to the convection, so it may be more sensible to view this diagnostic as a larger spatial average rather than at the native resolution.

Interpretation notes: The UM STASH outputs for CAPE and CIN are calculated at the end of the timestep. Therefore this diagnostic is applicable after precipitation has occurred, not before, which is the usual interpretation of CAPE-related diagnostics.

References

Examples

>>> CAPE_ratios = convection.cape_ratio(SBCAPE, MUCAPE, MUCIN)
>>> iplt.pcolormesh(CAPE_ratios[0, :, :], cmap=mpl.cm.RdBu)
>>> plt.gca().coastlines('10m')
>>> plt.colorbar()
>>> plt.clim(0, 1)
>>> plt.show()
>>> CAPE_ratios = convection.cape_ratio(SBCAPE, MUCAPE, MUCIN, MUCIN_thresh=-1.5)
>>> iplt.pcolormesh(CAPE_ratios[0, :, :], cmap=mpl.cm.RdBu)
>>> plt.gca().coastlines('10m')
>>> plt.clim(0, 1)
>>> plt.colorbar()
>>> plt.show()