
Applying unvectorized functions with apply_ufunc#

This example will illustrate how to conveniently apply an unvectorized function func to xarray objects using apply_ufunc. Here func expects 1D numpy arrays and returns a 1D numpy array. Our goal is to conveniently apply this function along a dimension of xarray objects that may or may not wrap dask arrays.

We will illustrate this using np.interp:

Signature: np.interp(x, xp, fp, left=None, right=None, period=None)
Docstring:
    One-dimensional linear interpolation.

Returns the one-dimensional piecewise linear interpolant to a function
with given discrete data points (`xp`, `fp`), evaluated at `x`.

and write an xr_interp function with signature

xr_interp(xarray_object, dimension_name, new_coordinate_to_interpolate_to)

Load data#

First, let’s load an example dataset.

[1]:
import xarray as xr
import numpy as np

xr.set_options(display_style="html")  # fancy HTML repr

air = (
    xr.tutorial.load_dataset("air_temperature")
    .air.sortby("lat")  # np.interp needs coordinate in ascending order
    .isel(time=slice(4), lon=slice(3))
)  # choose a small subset for convenience
air
[1]:
<xarray.DataArray 'air' (time: 4, lat: 25, lon: 3)> Size: 1kB
array([[[296.29   , 296.79   , 297.1    ],
        [295.9    , 296.19998, 296.79   ],
        [296.6    , 296.19998, 296.4    ],
        [297.     , 296.69998, 296.1    ],
        [295.4    , 295.69998, 295.79   ],
        [293.79   , 294.1    , 294.6    ],
        [293.1    , 293.29   , 293.29   ],
        [290.19998, 290.79   , 291.4    ],
        [287.9    , 288.     , 288.29   ],
        [286.5    , 286.5    , 285.69998],
        [284.6    , 284.9    , 284.19998],
        [282.79   , 283.19998, 282.6    ],
        [280.     , 280.69998, 280.19998],
        [278.4    , 279.     , 279.     ],
        [277.29   , 277.4    , 277.79   ],
        [276.69998, 277.4    , 277.69998],
        [275.9    , 276.9    , 276.9    ],
        [274.79   , 275.19998, 275.6    ],
        [273.69998, 273.6    , 273.79   ],
        [272.1    , 270.9    , 270.     ],
...
        [293.     , 293.5    , 294.29   ],
        [291.9    , 291.9    , 292.19998],
        [289.19998, 289.4    , 289.9    ],
        [286.6    , 287.1    , 287.9    ],
        [284.79   , 284.79   , 285.4    ],
        [282.79   , 282.     , 282.69998],
        [281.19998, 280.19998, 280.6    ],
        [279.5    , 278.69998, 278.6    ],
        [278.     , 277.69998, 277.6    ],
        [276.4    , 275.9    , 276.4    ],
        [275.6    , 275.69998, 276.1    ],
        [274.5    , 275.6    , 276.29   ],
        [273.4    , 274.5    , 275.5    ],
        [274.1    , 274.     , 273.5    ],
        [273.29   , 272.6    , 271.5    ],
        [272.79   , 272.4    , 271.9    ],
        [267.69998, 266.29   , 264.4    ],
        [256.6    , 254.7    , 252.09999],
        [246.29999, 245.29999, 244.2    ],
        [241.89   , 241.79999, 241.79999]]], dtype=float32)
Coordinates:
  * lat      (lat) float32 100B 15.0 17.5 20.0 22.5 25.0 ... 67.5 70.0 72.5 75.0
  * lon      (lon) float32 12B 200.0 202.5 205.0
  * time     (time) datetime64[ns] 32B 2013-01-01 ... 2013-01-01T18:00:00
Attributes:
    long_name:     4xDaily Air temperature at sigma level 995
    units:         degK
    precision:     2
    GRIB_id:       11
    GRIB_name:     TMP
    var_desc:      Air temperature
    dataset:       NMC Reanalysis
    level_desc:    Surface
    statistic:     Individual Obs
    parent_stat:   Other
    actual_range:  [185.16 322.1 ]

The function we will apply is np.interp, which expects 1D numpy arrays. This functionality is already implemented in xarray, so we use that capability to make sure we are not making mistakes.

[2]:
newlat = np.linspace(15, 75, 100)
air.interp(lat=newlat)
[2]:
<xarray.DataArray 'air' (time: 4, lat: 100, lon: 3)> Size: 10kB
array([[[296.29000854, 296.79000854, 297.1000061 ],
        [296.19545954, 296.64697173, 297.02485518],
        [296.10091053, 296.50393491, 296.94970426],
        ...,
        [242.46059851, 243.46969695, 244.08181672],
        [241.83029767, 242.98484846, 243.79090837],
        [241.19999683, 242.49999997, 243.50000003]],

       [[296.29000854, 297.19998169, 297.3999939 ],
        [296.26818385, 297.07876957, 297.25211866],
        [296.24635916, 296.95755744, 297.10424342],
        ...,
        [242.82726354, 243.37878187, 243.63332714],
        [242.46362716, 243.03938941, 243.36665899],
        [242.09999079, 242.69999695, 243.09999084]],

       [[296.3999939 , 296.29000854, 296.3999939 ],
        [296.35150609, 296.34091556, 296.37333078],
        [296.30301828, 296.39182258, 296.34666767],
        ...,
        [243.4151408 , 243.26181628, 243.12423612],
        [242.85756431, 242.73090659, 242.71211194],
        [242.29998782, 242.19999689, 242.29998776]],

       [[297.5       , 297.69998169, 297.5       ],
        [297.37878788, 297.65150128, 297.40303179],
        [297.25757575, 297.60302087, 297.30606357],
        ...,
        [244.02817552, 243.49695752, 242.96362858],
        [242.9590874 , 242.64847269, 242.38180817],
        [241.88999927, 241.79998785, 241.79998776]]])
Coordinates:
  * lon      (lon) float32 12B 200.0 202.5 205.0
  * time     (time) datetime64[ns] 32B 2013-01-01 ... 2013-01-01T18:00:00
  * lat      (lat) float64 800B 15.0 15.61 16.21 16.82 ... 73.79 74.39 75.0
Attributes:
    long_name:     4xDaily Air temperature at sigma level 995
    units:         degK
    precision:     2
    GRIB_id:       11
    GRIB_name:     TMP
    var_desc:      Air temperature
    dataset:       NMC Reanalysis
    level_desc:    Surface
    statistic:     Individual Obs
    parent_stat:   Other
    actual_range:  [185.16 322.1 ]

Let’s define a function that works with one vector of data along lat at a time.

[3]:
def interp1d_np(data, x, xi):
    """Interpolate the 1D array `data`, defined at points `x`, onto the new points `xi`."""
    return np.interp(xi, x, data)


interped = interp1d_np(air.isel(time=0, lon=0), air.lat, newlat)
expected = air.interp(lat=newlat)

# no errors are raised if values are equal to within floating point precision
np.testing.assert_allclose(expected.isel(time=0, lon=0).values, interped)

No errors are raised, so our interpolation is working.#

This function consumes and returns numpy arrays, which means we need to do a lot of work to convert the result back to an xarray object with meaningful metadata. This is where apply_ufunc is very useful.
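
To see the boilerplate that apply_ufunc saves us, here is a rough sketch of the manual alternative, rebuilding the output DataArray by hand (simplified: it assumes a single 1D input along lat and copies over only the obvious coordinates and attributes):

def manual_interp(da, newlat):
    # plain numpy computation ...
    out = np.interp(newlat, da.lat.values, da.values)
    # ... followed by manually rebuilding dims, coords, and attrs
    return xr.DataArray(
        out,
        dims=["lat"],
        coords={"lat": newlat, "lon": da.lon, "time": da.time},
        attrs=da.attrs,
    )


manual = manual_interp(air.isel(time=0, lon=0), newlat)

Even this simplified version hard-codes the dimension name and the surviving coordinates; apply_ufunc takes care of that bookkeeping for us.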

apply_ufunc#

Apply a vectorized function for unlabeled arrays on xarray objects.

The function will be mapped over the data variable(s) of the input arguments using
xarray’s standard rules for labeled computation, including alignment, broadcasting,
looping over GroupBy/Dataset variables, and merging of coordinates.

apply_ufunc has many capabilities, but for simplicity this example will focus on the common task of vectorizing 1D functions over nD xarray objects. We will iteratively build up the right set of arguments to apply_ufunc and read through many error messages along the way.

[4]:
xr.apply_ufunc(
    interp1d_np,  # first the function
    air.isel(time=0, lon=0),  # now arguments in the order expected by 'interp1d_np'
    air.lat,
    newlat,
)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[4], line 1
----> 1 xr.apply_ufunc(
      2     interp1d_np,  # first the function
      3     air.isel(time=0, lon=0),  # now arguments in the order expected by 'interp1_np'
      4     air.lat,
      5     newlat,
      6 )

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:1270, in apply_ufunc(func, input_core_dims, output_core_dims, exclude_dims, vectorize, join, dataset_join, dataset_fill_value, keep_attrs, kwargs, dask, output_dtypes, output_sizes, meta, dask_gufunc_kwargs, on_missing_core_dim, *args)
   1268 # feed DataArray apply_variable_ufunc through apply_dataarray_vfunc
   1269 elif any(isinstance(a, DataArray) for a in args):
-> 1270     return apply_dataarray_vfunc(
   1271         variables_vfunc,
   1272         *args,
   1273         signature=signature,
   1274         join=join,
   1275         exclude_dims=exclude_dims,
   1276         keep_attrs=keep_attrs,
   1277     )
   1278 # feed Variables directly through apply_variable_ufunc
   1279 elif any(isinstance(a, Variable) for a in args):

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:316, in apply_dataarray_vfunc(func, signature, join, exclude_dims, keep_attrs, *args)
    311 result_coords, result_indexes = build_output_coords_and_indexes(
    312     args, signature, exclude_dims, combine_attrs=keep_attrs
    313 )
    315 data_vars = [getattr(a, "variable", a) for a in args]
--> 316 result_var = func(*data_vars)
    318 out: tuple[DataArray, ...] | DataArray
    319 if signature.num_outputs > 1:

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:860, in apply_variable_ufunc(func, signature, exclude_dims, dask, output_dtypes, vectorize, keep_attrs, dask_gufunc_kwargs, *args)
    858 for dim, new_size in var.sizes.items():
    859     if dim in dim_sizes and new_size != dim_sizes[dim]:
--> 860         raise ValueError(
    861             f"size of dimension '{dim}' on inputs was unexpectedly "
    862             f"changed by applied function from {dim_sizes[dim]} to {new_size}. Only "
    863             "dimensions specified in ``exclude_dims`` with "
    864             "xarray.apply_ufunc are allowed to change size. "
    865             "The data returned was:\n\n"
    866             f"{short_array_repr(data)}"
    867         )
    869 var.attrs = attrs
    870 output.append(var)

ValueError: size of dimension 'lat' on inputs was unexpectedly changed by applied function from 25 to 100. Only dimensions specified in ``exclude_dims`` with xarray.apply_ufunc are allowed to change size. The data returned was:

array([296.290009, 296.19546 , 296.100911, 296.006362, 295.911813, 296.048481,
       296.218181, 296.387881, 296.557581, 296.672732, 296.7697  , 296.866669,
       296.963637, 296.757575, 296.369695, 295.981814, 295.593934, 295.204844,
       294.814545, 294.424245, 294.033946, 293.727281, 293.560008, 293.392734,
       293.225461, 292.924247, 292.221211, 291.518175, 290.815138, 290.130285,
       289.572712, 289.015139, 288.457567, 287.899994, 287.560601, 287.221209,
       286.881817, 286.542424, 286.096971, 285.636366, 285.175762, 284.715157,
       284.270916, 283.832128, 283.393341, 282.954554, 282.36728 , 281.690914,
       281.014549, 280.338183, 279.80606 , 279.41818 , 279.030299, 278.642419,
       278.299086, 278.029999, 277.760911, 277.491824, 277.254249, 277.111213,
       276.968176, 276.825139, 276.67574 , 276.481803, 276.287867, 276.09393 ,
       275.899994, 275.630907, 275.361819, 275.092732, 274.823644, 274.558791,
       274.294542, 274.030293, 273.766044, 273.409077, 273.021204, 272.633331,
       272.245458, 272.463642, 273.045458, 273.627275, 274.209092, 273.530303,
       271.590909, 269.651515, 267.712121, 265.      , 261.      , 257.      ,
       253.      , 249.624242, 248.121208, 246.618175, 245.115142, 243.7212  ,
       243.090899, 242.460599, 241.830298, 241.199997])

apply_ufunc needs to know a lot of information about what our function does so that it can reconstruct the outputs. In this case, the size of dimension lat has changed and we need to explicitly specify that this will happen. xarray helpfully tells us that we need to specify the kwarg exclude_dims.

exclude_dims#

exclude_dims : set, optional
        Core dimensions on the inputs to exclude from alignment and
        broadcasting entirely. Any input coordinates along these dimensions
        will be dropped. Each excluded dimension must also appear in
        ``input_core_dims`` for at least one argument. Only dimensions listed
        here are allowed to change size between input and output objects.
[5]:
xr.apply_ufunc(
    interp1d_np,  # first the function
    air.isel(time=0, lon=0),  # now arguments in the order expected by 'interp1d_np'
    air.lat,
    newlat,
    exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be a set!
)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[5], line 1
----> 1 xr.apply_ufunc(
      2     interp1d_np,  # first the function
      3     air.isel(time=0, lon=0),  # now arguments in the order expected by 'interp1_np'
      4     air.lat,
      5     newlat,
      6     exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be set!
      7 )

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:1185, in apply_ufunc(func, input_core_dims, output_core_dims, exclude_dims, vectorize, join, dataset_join, dataset_fill_value, keep_attrs, kwargs, dask, output_dtypes, output_sizes, meta, dask_gufunc_kwargs, on_missing_core_dim, *args)
   1181         raise TypeError(
   1182             f"Expected exclude_dims to be a 'set'. Received '{type(exclude_dims).__name__}' instead."
   1183         )
   1184     if not exclude_dims <= signature.all_core_dims:
-> 1185         raise ValueError(
   1186             f"each dimension in `exclude_dims` must also be a "
   1187             f"core dimension in the function signature. "
   1188             f"Please make {(exclude_dims - signature.all_core_dims)} a core dimension"
   1189         )
   1191 # handle dask_gufunc_kwargs
   1192 if dask == "parallelized":

ValueError: each dimension in `exclude_dims` must also be a core dimension in the function signature. Please make {'lat'} a core dimension

Core dimensions#

Core dimensions are central to using apply_ufunc. In our case, our function expects to receive a 1D vector along lat; this is the dimension that is “core” to the function. Multiple core dimensions are possible. apply_ufunc needs to know which dimensions of each variable are core dimensions.

input_core_dims : Sequence[Sequence], optional
    List of the same length as ``args`` giving the list of core dimensions
    on each input argument that should not be broadcast. By default, we
    assume there are no core dimensions on any input arguments.

    For example, ``input_core_dims=[[], ['time']]`` indicates that all
    dimensions on the first argument and all dimensions other than 'time'
    on the second argument should be broadcast.

    Core dimensions are automatically moved to the last axes of input
    variables before applying ``func``, which facilitates using NumPy style
    generalized ufuncs [2]_.

output_core_dims : List[tuple], optional
    List of the same length as the number of output arguments from
    ``func``, giving the list of core dimensions on each output that were
    not broadcast on the inputs. By default, we assume that ``func``
    outputs exactly one array, with axes corresponding to each broadcast
    dimension.

    Core dimensions are assumed to appear as the last dimensions of each
    output in the provided order.

Next we specify "lat" in input_core_dims for both air and air.lat:

[6]:
xr.apply_ufunc(
    interp1d_np,  # first the function
    air.isel(time=0, lon=0),  # now arguments in the order expected by 'interp1d_np'
    air.lat,
    newlat,
    input_core_dims=[["lat"], ["lat"], []],
    exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be a set!
)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[6], line 1
----> 1 xr.apply_ufunc(
      2     interp1d_np,  # first the function
      3     air.isel(time=0, lon=0),  # now arguments in the order expected by 'interp1_np'
      4     air.lat,
      5     newlat,
      6     input_core_dims=[["lat"], ["lat"], []],
      7     exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be set!
      8 )

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:1270, in apply_ufunc(func, input_core_dims, output_core_dims, exclude_dims, vectorize, join, dataset_join, dataset_fill_value, keep_attrs, kwargs, dask, output_dtypes, output_sizes, meta, dask_gufunc_kwargs, on_missing_core_dim, *args)
   1268 # feed DataArray apply_variable_ufunc through apply_dataarray_vfunc
   1269 elif any(isinstance(a, DataArray) for a in args):
-> 1270     return apply_dataarray_vfunc(
   1271         variables_vfunc,
   1272         *args,
   1273         signature=signature,
   1274         join=join,
   1275         exclude_dims=exclude_dims,
   1276         keep_attrs=keep_attrs,
   1277     )
   1278 # feed Variables directly through apply_variable_ufunc
   1279 elif any(isinstance(a, Variable) for a in args):

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:316, in apply_dataarray_vfunc(func, signature, join, exclude_dims, keep_attrs, *args)
    311 result_coords, result_indexes = build_output_coords_and_indexes(
    312     args, signature, exclude_dims, combine_attrs=keep_attrs
    313 )
    315 data_vars = [getattr(a, "variable", a) for a in args]
--> 316 result_var = func(*data_vars)
    318 out: tuple[DataArray, ...] | DataArray
    319 if signature.num_outputs > 1:

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:850, in apply_variable_ufunc(func, signature, exclude_dims, dask, output_dtypes, vectorize, keep_attrs, dask_gufunc_kwargs, *args)
    848 data = as_compatible_data(data)
    849 if data.ndim != len(dims):
--> 850     raise ValueError(
    851         "applied function returned data with an unexpected "
    852         f"number of dimensions. Received {data.ndim} dimension(s) but "
    853         f"expected {len(dims)} dimensions with names {dims!r}, from:\n\n"
    854         f"{short_array_repr(data)}"
    855     )
    857 var = Variable(dims, data, fastpath=True)
    858 for dim, new_size in var.sizes.items():

ValueError: applied function returned data with an unexpected number of dimensions. Received 1 dimension(s) but expected 0 dimensions with names (), from:

array([296.290009, 296.19546 , 296.100911, 296.006362, 295.911813, 296.048481,
       296.218181, 296.387881, 296.557581, 296.672732, 296.7697  , 296.866669,
       296.963637, 296.757575, 296.369695, 295.981814, 295.593934, 295.204844,
       294.814545, 294.424245, 294.033946, 293.727281, 293.560008, 293.392734,
       293.225461, 292.924247, 292.221211, 291.518175, 290.815138, 290.130285,
       289.572712, 289.015139, 288.457567, 287.899994, 287.560601, 287.221209,
       286.881817, 286.542424, 286.096971, 285.636366, 285.175762, 284.715157,
       284.270916, 283.832128, 283.393341, 282.954554, 282.36728 , 281.690914,
       281.014549, 280.338183, 279.80606 , 279.41818 , 279.030299, 278.642419,
       278.299086, 278.029999, 277.760911, 277.491824, 277.254249, 277.111213,
       276.968176, 276.825139, 276.67574 , 276.481803, 276.287867, 276.09393 ,
       275.899994, 275.630907, 275.361819, 275.092732, 274.823644, 274.558791,
       274.294542, 274.030293, 273.766044, 273.409077, 273.021204, 272.633331,
       272.245458, 272.463642, 273.045458, 273.627275, 274.209092, 273.530303,
       271.590909, 269.651515, 267.712121, 265.      , 261.      , 257.      ,
       253.      , 249.624242, 248.121208, 246.618175, 245.115142, 243.7212  ,
       243.090899, 242.460599, 241.830298, 241.199997])

xarray is telling us that it expected to receive back a numpy array with 0 dimensions, but instead it received an array with 1 dimension corresponding to newlat. We can fix this by specifying output_core_dims:

[7]:
xr.apply_ufunc(
    interp1d_np,  # first the function
    air.isel(time=0, lon=0),  # now arguments in the order expected by 'interp1d_np'
    air.lat,
    newlat,
    input_core_dims=[["lat"], ["lat"], []],  # list with one entry per arg
    output_core_dims=[["lat"]],
    exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be a set!
)
[7]:
<xarray.DataArray (lat: 100)> Size: 800B
array([296.29000854, 296.19545954, 296.10091053, 296.00636153,
       295.91181252, 296.04848133, 296.21818126, 296.38788119,
       296.55758112, 296.67273227, 296.76970048, 296.8666687 ,
       296.96363692, 296.75757483, 296.36969457, 295.9818143 ,
       295.59393403, 295.20484416, 294.81454468, 294.4242452 ,
       294.03394572, 293.72728105, 293.56000773, 293.39273441,
       293.22546109, 292.92424705, 292.22121083, 291.5181746 ,
       290.81513838, 290.13028509, 289.57271229, 289.01513949,
       288.45756669, 287.8999939 , 287.56060144, 287.22120898,
       286.88181652, 286.54242406, 286.09697099, 285.63636641,
       285.17576183, 284.71515725, 284.27091564, 283.83212835,
       283.39334106, 282.95455378, 282.36727998, 281.69091427,
       281.01454856, 280.33818285, 279.80605987, 279.4181796 ,
       279.03029933, 278.64241906, 278.29908614, 278.02999878,
       277.76091142, 277.49182406, 277.25424934, 277.11121253,
       276.96817571, 276.8251389 , 276.67573964, 276.4818032 ,
       276.28786677, 276.09393033, 275.8999939 , 275.63090654,
       275.36181918, 275.09273182, 274.82364446, 274.55879073,
       274.29454179, 274.03029286, 273.76604392, 273.40907704,
       273.02120417, 272.6333313 , 272.24545843, 272.46364154,
       273.04545824, 273.62727495, 274.20909165, 273.53030303,
       271.59090909, 269.65151515, 267.71212121, 265.        ,
       261.        , 257.        , 253.        , 249.62424168,
       248.12120842, 246.61817516, 245.1151419 , 243.72120019,
       243.09089938, 242.46059857, 241.83029776, 241.19999695])
Coordinates:
    lon      float32 4B 200.0
    time     datetime64[ns] 8B 2013-01-01
Dimensions without coordinates: lat

Finally we get some output! Let’s check that this is right:

[8]:
interped = xr.apply_ufunc(
    interp1d_np,  # first the function
    air.isel(time=0, lon=0),  # now arguments in the order expected by 'interp1d_np'
    air.lat,
    newlat,
    input_core_dims=[["lat"], ["lat"], []],  # list with one entry per arg
    output_core_dims=[["lat"]],
    exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be a set!
)
interped["lat"] = newlat  # need to add this manually
xr.testing.assert_allclose(expected.isel(time=0, lon=0), interped)

No errors are raised, so it is right!

Vectorization with np.vectorize#

Our function currently only works on one vector of data, which is not very useful given our 3D dataset. Let’s try passing the whole dataset. We add a print statement so we can see what our function receives.

[9]:
def interp1d_np(data, x, xi):
    print(f"data: {data.shape} | x: {x.shape} | xi: {xi.shape}")
    return np.interp(xi, x, data)


interped = xr.apply_ufunc(
    interp1d_np,  # first the function
    air.isel(
        lon=slice(3), time=slice(4)
    ),  # now arguments in the order expected by 'interp1d_np'
    air.lat,
    newlat,
    input_core_dims=[["lat"], ["lat"], []],  # list with one entry per arg
    output_core_dims=[["lat"]],
    exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be a set!
)
interped["lat"] = newlat  # need to add this manually
xr.testing.assert_allclose(expected.isel(time=0, lon=0), interped)
data: (4, 3, 25) | x: (25,) | xi: (100,)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[9], line 6
      2     print(f"data: {data.shape} | x: {x.shape} | xi: {xi.shape}")
      3     return np.interp(xi, x, data)
----> 6 interped = xr.apply_ufunc(
      7     interp1d_np,  # first the function
      8     air.isel(
      9         lon=slice(3), time=slice(4)
     10     ),  # now arguments in the order expected by 'interp1_np'
     11     air.lat,
     12     newlat,
     13     input_core_dims=[["lat"], ["lat"], []],  # list with one entry per arg
     14     output_core_dims=[["lat"]],
     15     exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be set!
     16 )
     17 interped["lat"] = newlat  # need to add this manually
     18 xr.testing.assert_allclose(expected.isel(time=0, lon=0), interped)

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:1270, in apply_ufunc(func, input_core_dims, output_core_dims, exclude_dims, vectorize, join, dataset_join, dataset_fill_value, keep_attrs, kwargs, dask, output_dtypes, output_sizes, meta, dask_gufunc_kwargs, on_missing_core_dim, *args)
   1268 # feed DataArray apply_variable_ufunc through apply_dataarray_vfunc
   1269 elif any(isinstance(a, DataArray) for a in args):
-> 1270     return apply_dataarray_vfunc(
   1271         variables_vfunc,
   1272         *args,
   1273         signature=signature,
   1274         join=join,
   1275         exclude_dims=exclude_dims,
   1276         keep_attrs=keep_attrs,
   1277     )
   1278 # feed Variables directly through apply_variable_ufunc
   1279 elif any(isinstance(a, Variable) for a in args):

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:316, in apply_dataarray_vfunc(func, signature, join, exclude_dims, keep_attrs, *args)
    311 result_coords, result_indexes = build_output_coords_and_indexes(
    312     args, signature, exclude_dims, combine_attrs=keep_attrs
    313 )
    315 data_vars = [getattr(a, "variable", a) for a in args]
--> 316 result_var = func(*data_vars)
    318 out: tuple[DataArray, ...] | DataArray
    319 if signature.num_outputs > 1:

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:825, in apply_variable_ufunc(func, signature, exclude_dims, dask, output_dtypes, vectorize, keep_attrs, dask_gufunc_kwargs, *args)
    820     if vectorize:
    821         func = _vectorize(
    822             func, signature, output_dtypes=output_dtypes, exclude_dims=exclude_dims
    823         )
--> 825 result_data = func(*input_data)
    827 if signature.num_outputs == 1:
    828     result_data = (result_data,)

Cell In[9], line 3, in interp1d_np(data, x, xi)
      1 def interp1d_np(data, x, xi):
      2     print(f"data: {data.shape} | x: {x.shape} | xi: {xi.shape}")
----> 3     return np.interp(xi, x, data)

File ~/checkouts/readthedocs.org/user_builds/xray/conda/stable/lib/python3.10/site-packages/numpy/lib/function_base.py:1599, in interp(x, xp, fp, left, right, period)
   1596     xp = np.concatenate((xp[-1:]-period, xp, xp[0:1]+period))
   1597     fp = np.concatenate((fp[-1:], fp, fp[0:1]))
-> 1599 return interp_func(x, xp, fp, left, right)

ValueError: object too deep for desired array

That’s a hard-to-interpret error, but our print call helpfully showed the shapes of the input data:

data: (4, 3, 25) | x: (25,) | xi: (100,)

We see that air has been passed as a 3D numpy array, which is not what np.interp expects. Instead, we want to loop over all combinations of lon and time and apply our function to the corresponding 1D vector of data along lat. apply_ufunc makes this easy: just specify vectorize=True:

vectorize : bool, optional
    If True, then assume ``func`` only takes arrays defined over core
    dimensions as input and vectorize it automatically with
    :py:func:`numpy.vectorize`. This option exists for convenience, but is
    almost always slower than supplying a pre-vectorized function.
    Using this option requires NumPy version 1.12 or newer.

Also see the documentation for np.vectorize: https://docs.scipy.org/doc/numpy/reference/generated/numpy.vectorize.html. Most importantly:

The vectorize function is provided primarily for convenience, not for performance.
The implementation is essentially a for loop.
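
Conceptually, vectorize=True does something like the following explicit loop over the non-core dimensions (an illustrative sketch for our specific case, not how numpy or xarray actually implement it):

result = np.empty((air.sizes["time"], air.sizes["lon"], len(newlat)))
for t in range(air.sizes["time"]):
    for lo in range(air.sizes["lon"]):
        # one 1D interpolation per (time, lon) combination
        result[t, lo, :] = np.interp(
            newlat, air.lat.values, air.isel(time=t, lon=lo).values
        )
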
[10]:
def interp1d_np(data, x, xi):
    print(f"data: {data.shape} | x: {x.shape} | xi: {xi.shape}")
    return np.interp(xi, x, data)


interped = xr.apply_ufunc(
    interp1d_np,  # first the function
    air,  # now arguments in the order expected by 'interp1d_np'
    air.lat,  # as above
    newlat,  # as above
    input_core_dims=[["lat"], ["lat"], []],  # list with one entry per arg
    output_core_dims=[["lat"]],  # returned data has one dimension
    exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be a set!
    vectorize=True,  # loop over non-core dims
)
interped["lat"] = newlat  # need to add this manually
xr.testing.assert_allclose(expected, interped)
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[10], line 6
      2     print(f"data: {data.shape} | x: {x.shape} | xi: {xi.shape}")
      3     return np.interp(xi, x, data)
----> 6 interped = xr.apply_ufunc(
      7     interp1d_np,  # first the function
      8     air,  # now arguments in the order expected by 'interp1_np'
      9     air.lat,  # as above
     10     newlat,  # as above
     11     input_core_dims=[["lat"], ["lat"], []],  # list with one entry per arg
     12     output_core_dims=[["lat"]],  # returned data has one dimension
     13     exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be set!
     14     vectorize=True,  # loop over non-core dims
     15 )
     16 interped["lat"] = newlat  # need to add this manually
     17 xr.testing.assert_allclose(expected, interped)

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:1270, in apply_ufunc(func, input_core_dims, output_core_dims, exclude_dims, vectorize, join, dataset_join, dataset_fill_value, keep_attrs, kwargs, dask, output_dtypes, output_sizes, meta, dask_gufunc_kwargs, on_missing_core_dim, *args)
   1268 # feed DataArray apply_variable_ufunc through apply_dataarray_vfunc
   1269 elif any(isinstance(a, DataArray) for a in args):
-> 1270     return apply_dataarray_vfunc(
   1271         variables_vfunc,
   1272         *args,
   1273         signature=signature,
   1274         join=join,
   1275         exclude_dims=exclude_dims,
   1276         keep_attrs=keep_attrs,
   1277     )
   1278 # feed Variables directly through apply_variable_ufunc
   1279 elif any(isinstance(a, Variable) for a in args):

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:316, in apply_dataarray_vfunc(func, signature, join, exclude_dims, keep_attrs, *args)
    311 result_coords, result_indexes = build_output_coords_and_indexes(
    312     args, signature, exclude_dims, combine_attrs=keep_attrs
    313 )
    315 data_vars = [getattr(a, "variable", a) for a in args]
--> 316 result_var = func(*data_vars)
    318 out: tuple[DataArray, ...] | DataArray
    319 if signature.num_outputs > 1:

File ~/checkouts/readthedocs.org/user_builds/xray/checkouts/stable/xarray/core/computation.py:825, in apply_variable_ufunc(func, signature, exclude_dims, dask, output_dtypes, vectorize, keep_attrs, dask_gufunc_kwargs, *args)
    820     if vectorize:
    821         func = _vectorize(
    822             func, signature, output_dtypes=output_dtypes, exclude_dims=exclude_dims
    823         )
--> 825 result_data = func(*input_data)
    827 if signature.num_outputs == 1:
    828     result_data = (result_data,)

File ~/checkouts/readthedocs.org/user_builds/xray/conda/stable/lib/python3.10/site-packages/numpy/lib/function_base.py:2372, in vectorize.__call__(self, *args, **kwargs)
   2369     self._init_stage_2(*args, **kwargs)
   2370     return self
-> 2372 return self._call_as_normal(*args, **kwargs)

File ~/checkouts/readthedocs.org/user_builds/xray/conda/stable/lib/python3.10/site-packages/numpy/lib/function_base.py:2365, in vectorize._call_as_normal(self, *args, **kwargs)
   2362     vargs = [args[_i] for _i in inds]
   2363     vargs.extend([kwargs[_n] for _n in names])
-> 2365 return self._vectorize_call(func=func, args=vargs)

File ~/checkouts/readthedocs.org/user_builds/xray/conda/stable/lib/python3.10/site-packages/numpy/lib/function_base.py:2446, in vectorize._vectorize_call(self, func, args)
   2444 """Vectorized call to `func` over positional `args`."""
   2445 if self.signature is not None:
-> 2446     res = self._vectorize_call_with_signature(func, args)
   2447 elif not args:
   2448     res = func()

File ~/checkouts/readthedocs.org/user_builds/xray/conda/stable/lib/python3.10/site-packages/numpy/lib/function_base.py:2474, in vectorize._vectorize_call_with_signature(self, func, args)
   2469     raise TypeError('wrong number of positional arguments: '
   2470                     'expected %r, got %r'
   2471                     % (len(input_core_dims), len(args)))
   2472 args = tuple(asanyarray(arg) for arg in args)
-> 2474 broadcast_shape, dim_sizes = _parse_input_dimensions(
   2475     args, input_core_dims)
   2476 input_shapes = _calculate_shapes(broadcast_shape, dim_sizes,
   2477                                  input_core_dims)
   2478 args = [np.broadcast_to(arg, shape, subok=True)
   2479         for arg, shape in zip(args, input_shapes)]

File ~/checkouts/readthedocs.org/user_builds/xray/conda/stable/lib/python3.10/site-packages/numpy/lib/function_base.py:2091, in _parse_input_dimensions(args, input_core_dims)
   2089     dummy_array = np.lib.stride_tricks.as_strided(0, arg.shape[:ndim])
   2090     broadcast_args.append(dummy_array)
-> 2091 broadcast_shape = np.lib.stride_tricks._broadcast_shape(*broadcast_args)
   2092 return broadcast_shape, dim_sizes

File ~/checkouts/readthedocs.org/user_builds/xray/conda/stable/lib/python3.10/site-packages/numpy/lib/stride_tricks.py:422, in _broadcast_shape(*args)
    417 """Returns the shape of the arrays that would result from broadcasting the
    418 supplied arrays against each other.
    419 """
    420 # use the old-iterator because np.nditer does not handle size 0 arrays
    421 # consistently
--> 422 b = np.broadcast(*args[:32])
    423 # unfortunately, it cannot handle 32 or more arguments directly
    424 for pos in range(32, len(args), 31):
    425     # ironically, np.broadcast does not properly handle np.broadcast
    426     # objects (it treats them as scalars)
    427     # use broadcasting to avoid allocating the full array

ValueError: shape mismatch: objects cannot be broadcast to a single shape.  Mismatch is between arg 0 with shape (4, 3) and arg 2 with shape (100,).

This is unfortunately another cryptic error from numpy.

Notice that newlat is not an xarray object. Let’s give it a dimension name, new_lat, and modify the call appropriately. Note that this name cannot be lat because xarray expects dimensions to be the same size (or broadcastable) among all inputs; output_core_dims needs to be modified accordingly. We’ll manually rename new_lat back to lat for easy checking.

[11]:
def interp1d_np(data, x, xi):
    print(f"data: {data.shape} | x: {x.shape} | xi: {xi.shape}")
    return np.interp(xi, x, data)


interped = xr.apply_ufunc(
    interp1d_np,  # first the function
    air,  # now arguments in the order expected by 'interp1d_np'
    air.lat,  # as above
    newlat,  # as above
    input_core_dims=[["lat"], ["lat"], ["new_lat"]],  # list with one entry per arg
    output_core_dims=[["new_lat"]],  # returned data has one dimension
    exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be a set!
    vectorize=True,  # loop over non-core dims
)
interped = interped.rename({"new_lat": "lat"})
interped["lat"] = newlat  # need to add this manually
xr.testing.assert_allclose(
    expected.transpose(*interped.dims), interped  # order of dims is different
)
interped
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
[11]:
<xarray.DataArray (time: 4, lon: 3, lat: 100)> Size: 10kB
array([[[296.29000854, 296.19545954, 296.10091053, ..., 242.46059857,
         241.83029776, 241.19999695],
        [296.79000854, 296.64697173, 296.50393492, ..., 243.46969697,
         242.98484848, 242.5       ],
        [297.1000061 , 297.02485518, 296.94970426, ..., 244.0818167 ,
         243.79090835, 243.5       ]],

       [[296.29000854, 296.26818385, 296.24635916, ..., 242.82726357,
         242.46362721, 242.09999084],
        [297.19998169, 297.07876957, 296.95755745, ..., 243.37878187,
         243.03938941, 242.69999695],
        [297.3999939 , 297.25211866, 297.10424342, ..., 243.63332714,
         243.36665899, 243.09999084]],

       [[296.3999939 , 296.35150609, 296.30301828, ..., 243.41514079,
         242.85756429, 242.29998779],
        [296.29000854, 296.34091556, 296.39182258, ..., 243.26181631,
         242.73090663, 242.19999695],
        [296.3999939 , 296.37333078, 296.34666767, ..., 243.12423614,
         242.71211196, 242.29998779]],

       [[297.5       , 297.37878788, 297.25757576, ..., 244.02817559,
         242.95908749, 241.88999939],
        [297.69998169, 297.65150128, 297.60302087, ..., 243.49695749,
         242.64847264, 241.79998779],
        [297.5       , 297.40303178, 297.30606357, ..., 242.9636286 ,
         242.38180819, 241.79998779]]])
Coordinates:
  * lon      (lon) float32 12B 200.0 202.5 205.0
  * time     (time) datetime64[ns] 32B 2013-01-01 ... 2013-01-01T18:00:00
  * lat      (lat) float64 800B 15.0 15.61 16.21 16.82 ... 73.79 74.39 75.0

Notice that the printed input shapes are all 1D and correspond to one vector along the lat dimension.

The result is now an xarray object with coordinate values copied over from data. This is why apply_ufunc is so convenient; it takes care of a lot of boilerplate necessary to apply functions that consume and produce numpy arrays to xarray objects.

One final point: lat is now the last dimension in interped. This is a “property” of core dimensions: they are moved to the end before being sent to interp1d_np, as noted in the docstring for input_core_dims:

Core dimensions are automatically moved to the last axes of input
variables before applying ``func``, which facilitates using NumPy style
generalized ufuncs [2]_.
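
If the original dimension order matters downstream, a simple transpose restores it (a small usage note; interped is the object from the cell above):

interped_t = interped.transpose("time", "lat", "lon")  # back to the original order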

Parallelization with dask#

So far our function can only handle numpy arrays. A real benefit of apply_ufunc is the ability to easily parallelize over dask chunks when needed.

We want to apply this function in a vectorized fashion over each chunk of the dask array. This is possible using dask’s blockwise, map_blocks, or apply_gufunc. Xarray’s apply_ufunc wraps dask’s apply_gufunc, and asking it to map the function over chunks is as simple as specifying dask="parallelized". In exchange for this flexibility, we may need to provide dask with some extra information: 1. output_dtypes: dtypes of all returned objects, and 2. output_sizes: lengths of any new dimensions.

Here we only need to specify output_dtypes, since apply_ufunc can infer the size of the new dimension new_lat from the argument corresponding to the third element of input_core_dims. The chunk sizes below are chosen to illustrate that np.vectorize is still applied, so our function receives 1D vectors even though the dask blocks are 3D.
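
For completeness, here is a hypothetical sketch of when output_sizes would be required: if the target locations were passed through kwargs rather than positionally, nothing would tell xarray or dask how long new_lat is, so we would have to state it explicitly. In recent xarray versions this is passed via dask_gufunc_kwargs:

interped_explicit = xr.apply_ufunc(
    interp1d_np,
    air.chunk({"time": 2, "lon": 2}),
    air.lat,
    kwargs=dict(xi=newlat),  # xi is no longer a positional argument
    input_core_dims=[["lat"], ["lat"]],  # one entry per *positional* arg
    output_core_dims=[["new_lat"]],
    exclude_dims=set(("lat",)),
    vectorize=True,
    dask="parallelized",
    output_dtypes=[air.dtype],
    # the size of new_lat can no longer be inferred, so supply it explicitly
    dask_gufunc_kwargs=dict(output_sizes={"new_lat": len(newlat)}),
)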

[12]:
def interp1d_np(data, x, xi):
    print(f"data: {data.shape} | x: {x.shape} | xi: {xi.shape}")
    return np.interp(xi, x, data)


interped = xr.apply_ufunc(
    interp1d_np,  # first the function
    air.chunk(
        {"time": 2, "lon": 2}
    ),  # now arguments in the order expected by 'interp1d_np'
    air.lat,  # as above
    newlat,  # as above
    input_core_dims=[["lat"], ["lat"], ["new_lat"]],  # list with one entry per arg
    output_core_dims=[["new_lat"]],  # returned data has one dimension
    exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be a set!
    vectorize=True,  # loop over non-core dims
    dask="parallelized",
    output_dtypes=[air.dtype],  # one per output
).rename({"new_lat": "lat"})
interped["lat"] = newlat  # need to add this manually
xr.testing.assert_allclose(expected.transpose(*interped.dims), interped)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)
data: (25,) | x: (25,) | xi: (100,)

Yay! Our function is receiving 1D vectors, so we’ve successfully parallelized applying a 1D function over each block. If you have a distributed dashboard up, you should see computation happening as equality is checked.
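
Because the result wraps a dask array, the interpolation itself is lazy: the values were only computed when assert_allclose compared them. A quick usage note, using the interped object from the cell above:

print(interped.chunks)  # chunking over time and lon is preserved
computed = interped.compute()  # explicitly trigger the interpolation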

High performance vectorization: gufuncs, numba & guvectorize#

np.vectorize is a very convenient function but is unfortunately slow: it is only marginally faster than writing an explicit for loop in Python. A common way to get around this is to write a base interpolation function that can handle nD arrays in a compiled language like Fortran and then pass that to apply_ufunc.

Another option is to use the numba package which provides a very convenient guvectorize decorator: https://numba.pydata.org/numba-doc/latest/user/vectorize.html#the-guvectorize-decorator

Any decorated function gets compiled and will loop over any non-core dimension in parallel when necessary. We need to specify some extra information:

  1. Our function can no longer return its result. Instead, it must receive an extra array (the last argument) whose contents it will fill in. So we change from def interp1d_np(data, x, xi) to def interp1d_np_gufunc(data, x, xi, out). Our computed results must be assigned to out, and every element of out must be assigned explicitly.

  2. guvectorize needs to know the dtypes of the inputs and output. This is specified in string form as the first argument. Each element of the tuple corresponds to an argument of the function. In this case, we specify float64 for all inputs and outputs: "(float64[:], float64[:], float64[:], float64[:])", corresponding to data, x, xi, out.

  3. Now we need to tell numba the sizes of the dimensions the function takes as inputs and returns as outputs, i.e. the core dimensions. This is done in symbolic form: data and x are vectors of the same length, say n; xi and the output out have a different length, say m. So the second argument is (again as a string) "(n), (n), (m) -> (m)", corresponding again to data, x, xi, out.

[13]:
from numba import float64, guvectorize


@guvectorize("(float64[:], float64[:], float64[:], float64[:])", "(n), (n), (m) -> (m)")
def interp1d_np_gufunc(data, x, xi, out):
    # numba doesn't really like this;
    # it doesn't seem to support f-strings, so do it the old way
    print(
        "data: " + str(data.shape) + " | x:" + str(x.shape) + " | xi: " + str(xi.shape)
    )
    out[:] = np.interp(xi, x, data)
    # gufuncs don't return data
    # instead you assign to the last argument (out)
    # return np.interp(xi, x, data)

The warnings are about object-mode compilation relating to the print statement, which means we don’t get much of a speed-up: https://numba.pydata.org/numba-doc/latest/user/performance-tips.html#no-python-mode-vs-object-mode. We’ll keep the print statement temporarily to make sure that guvectorize acts like we want it to.

[14]:
interped = xr.apply_ufunc(
    interp1d_np_gufunc,  # first the function
    air.chunk(
        {"time": 2, "lon": 2}
    ),  # now arguments in the order expected by 'interp1d_np'
    air.lat,  # as above
    newlat,  # as above
    input_core_dims=[["lat"], ["lat"], ["new_lat"]],  # list with one entry per arg
    output_core_dims=[["new_lat"]],  # returned data has one dimension
    exclude_dims=set(("lat",)),  # dimensions allowed to change size. Must be a set!
    # vectorize=True,  # not needed since numba takes care of vectorizing
    dask="parallelized",
    output_dtypes=[air.dtype],  # one per output
).rename({"new_lat": "lat"})
interped["lat"] = newlat  # need to add this manually
xr.testing.assert_allclose(expected.transpose(*interped.dims), interped)

Yay! Our function is receiving 1D vectors and working automatically with dask arrays. Finally, let’s remove the print statement, compile in nopython mode, and wrap everything up in a nice reusable function:

[15]:
from numba import float64, guvectorize


@guvectorize(
    "(float64[:], float64[:], float64[:], float64[:])",
    "(n), (n), (m) -> (m)",
    nopython=True,
)
def interp1d_np_gufunc(data, x, xi, out):
    out[:] = np.interp(xi, x, data)


def xr_interp(data, dim, newdim):
    interped = xr.apply_ufunc(
        interp1d_np_gufunc,  # first the function
        data,  # now arguments in the order expected by 'interp1d_np'
        data[dim],  # as above
        newdim,  # as above
        input_core_dims=[[dim], [dim], ["__newdim__"]],  # list with one entry per arg
        output_core_dims=[["__newdim__"]],  # returned data has one dimension
        exclude_dims=set((dim,)),  # dimensions allowed to change size. Must be a set!
        # vectorize=True,  # not needed since numba takes care of vectorizing
        dask="parallelized",
        output_dtypes=[
            data.dtype
        ],  # one per output; could also be float or np.dtype("float64")
    ).rename({"__newdim__": dim})
    interped[dim] = newdim  # need to add this manually

    return interped


xr.testing.assert_allclose(
    expected.transpose(*interped.dims),
    xr_interp(air.chunk({"time": 2, "lon": 2}), "lat", newlat),
)

This technique is generalizable to any 1D function.
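
The same wrapper works along any other numeric, monotonically increasing dimension. For example, to interpolate onto a (hypothetical) finer lon grid and check the result against xarray’s built-in interpolation:

newlon = np.linspace(200, 205, 10)  # hypothetical target longitudes
lon_interped = xr_interp(air, "lon", newlon)
xr.testing.assert_allclose(
    air.interp(lon=newlon).transpose(*lon_interped.dims), lon_interped
)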