
Commit 1ffe5e9

Merge branch 'main' into backend-indexing

* main: (79 commits)
  - fix mean for datetime-like using the respective time resolution unit (#9977)
  - Add `time_unit` argument to `CFTimeIndex.to_datetimeindex` (#9965)
  - remove gate and add a test (#9958)
  - Remove repetitive that (replace it with the) (#9994)
  - add shxarray to the xarray ecosystem list (#9995)
  - Add `shards` to `valid_encodings` to enable sharded Zarr writing (#9948)
  - Use flox for grouped first, last (#9986)
  - Bump the actions group with 2 updates (#9989)
  - Fix some typing (#9988)
  - Remove unnecessary a article (#9980)
  - Fix test_doc_example on big-endian systems (#9949)
  - fix weighted polyfit for arrays with more than 2 dimensions (#9974)
  - Use zarr-fixture to prevent thread leakage errors (#9967)
  - remove dask-expr from CI runs, fix related tests (#9971)
  - Update time coding tests to assert exact equality (#9961)
  - cast type to PDDatetimeUnitOptions (#9963)
  - Suggest the correct name when no key matches in the dataset (#9943)
  - fix upstream dev issues (#9953)
  - Relax nanosecond datetime restriction in CF time decoding (#9618)
  - Remove outdated quantile test. (#9945)
  - ...

2 parents: fb24e9c + e28f171

File tree: 109 files changed (+5158 −2131 lines)


.github/workflows/ci-additional.yaml (+4 −4)

@@ -123,7 +123,7 @@ jobs:
           python -m mypy --install-types --non-interactive --cobertura-xml-report mypy_report

       - name: Upload mypy coverage to Codecov
-        uses: codecov/codecov-action@v5.0.2
+        uses: codecov/codecov-action@v5.3.1
         with:
           file: mypy_report/cobertura.xml
           flags: mypy
@@ -174,7 +174,7 @@ jobs:
           python -m mypy --install-types --non-interactive --cobertura-xml-report mypy_report

       - name: Upload mypy coverage to Codecov
-        uses: codecov/codecov-action@v5.0.2
+        uses: codecov/codecov-action@v5.3.1
         with:
           file: mypy_report/cobertura.xml
           flags: mypy-min
@@ -230,7 +230,7 @@ jobs:
           python -m pyright xarray/

       - name: Upload pyright coverage to Codecov
-        uses: codecov/codecov-action@v5.0.2
+        uses: codecov/codecov-action@v5.3.1
         with:
           file: pyright_report/cobertura.xml
           flags: pyright
@@ -286,7 +286,7 @@ jobs:
           python -m pyright xarray/

       - name: Upload pyright coverage to Codecov
-        uses: codecov/codecov-action@v5.0.2
+        uses: codecov/codecov-action@v5.3.1
         with:
           file: pyright_report/cobertura.xml
           flags: pyright39

.github/workflows/ci.yaml (+3 −1)

@@ -159,7 +159,9 @@ jobs:
           path: pytest.xml

       - name: Upload code coverage to Codecov
-        uses: codecov/codecov-action@v5.0.2
+        uses: codecov/codecov-action@v5.3.1
+        env:
+          CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
         with:
           file: ./coverage.xml
           flags: unittests

.github/workflows/pypi-release.yaml (+2 −2)

@@ -88,7 +88,7 @@ jobs:
           path: dist
       - name: Publish package to TestPyPI
         if: github.event_name == 'push'
-        uses: pypa/gh-action-pypi-publish@v1.12.2
+        uses: pypa/gh-action-pypi-publish@v1.12.4
         with:
           repository_url: https://test.pypi.org/legacy/
           verbose: true
@@ -110,6 +110,6 @@ jobs:
           name: releases
           path: dist
       - name: Publish package to PyPI
-        uses: pypa/gh-action-pypi-publish@v1.12.2
+        uses: pypa/gh-action-pypi-publish@v1.12.4
         with:
           verbose: true

.github/workflows/upstream-dev-ci.yaml (+1 −1)

@@ -140,7 +140,7 @@ jobs:
         run: |
           python -m mypy --install-types --non-interactive --cobertura-xml-report mypy_report
       - name: Upload mypy coverage to Codecov
-        uses: codecov/codecov-action@v5.0.2
+        uses: codecov/codecov-action@v5.3.1
         with:
           file: mypy_report/cobertura.xml
           flags: mypy

.pre-commit-config.yaml (+13 −3)

@@ -25,7 +25,7 @@ repos:
       - id: text-unicode-replacement-char
   - repo: https://github.com/astral-sh/ruff-pre-commit
     # Ruff version.
-    rev: v0.7.2
+    rev: v0.8.6
     hooks:
       - id: ruff-format
       - id: ruff
@@ -37,12 +37,12 @@ repos:
         exclude: "generate_aggregations.py"
        additional_dependencies: ["black==24.8.0"]
   - repo: https://github.com/rbubley/mirrors-prettier
-    rev: v3.3.3
+    rev: v3.4.2
     hooks:
       - id: prettier
         args: [--cache-location=.prettier_cache/cache]
   - repo: https://github.com/pre-commit/mirrors-mypy
-    rev: v1.13.0
+    rev: v1.14.1
     hooks:
       - id: mypy
        # Copied from setup.cfg
@@ -63,3 +63,13 @@ repos:
     rev: ebf0b5e44d67f8beaa1cd13a0d0393ea04c6058d
     hooks:
       - id: validate-cff
+  - repo: https://github.com/ComPWA/taplo-pre-commit
+    rev: v0.9.3
+    hooks:
+      - id: taplo-format
+        args: ["--option", "array_auto_collapse=false"]
+  - repo: https://github.com/abravalheri/validate-pyproject
+    rev: v0.23
+    hooks:
+      - id: validate-pyproject
+        additional_dependencies: ["validate-pyproject-schema-store[all]"]

.readthedocs.yaml (+4 −3)

@@ -1,5 +1,9 @@
 version: 2

+sphinx:
+  configuration: doc/conf.py
+  fail_on_warning: true
+
 build:
   os: ubuntu-lts-latest
   tools:
@@ -14,7 +18,4 @@ build:
 conda:
   environment: ci/requirements/doc.yml

-sphinx:
-  fail_on_warning: true
-
 formats: []

DATATREE_MIGRATION_GUIDE.md (+1)

@@ -45,6 +45,7 @@ A number of other API changes have been made, which should only require minor mo
 - The `DataTree.parent` property is now read-only. To assign ancestral relationships directly you must instead use the `.children` property on the parent node, which remains settable.
 - Similarly the `parent` kwarg has been removed from the `DataTree.__init__` constructor.
 - DataTree objects passed to the `children` kwarg in `DataTree.__init__` are now shallow-copied.
+- `DataTree.map_over_subtree` has been renamed to `DataTree.map_over_datasets`, and changed to no longer work like a decorator. Instead you use it to apply the function and arguments directly, more like how `xarray.apply_ufunc` works.
 - `DataTree.as_array` has been replaced by `DataTree.to_dataarray`.
 - A number of methods which were not well tested have been (temporarily) disabled. In general we have tried to only keep things that are known to work, with the plan to increase API surface incrementally after release.

asv_bench/benchmarks/__init__.py (+4 −4)

@@ -30,13 +30,13 @@ def requires_sparse():


 def randn(shape, frac_nan=None, chunks=None, seed=0):
-    rng = np.random.RandomState(seed)
+    rng = np.random.default_rng(seed)
     if chunks is None:
         x = rng.standard_normal(shape)
     else:
         import dask.array as da

-        rng = da.random.RandomState(seed)
+        rng = da.random.default_rng(seed)
         x = rng.standard_normal(shape, chunks=chunks)

     if frac_nan is not None:
@@ -47,8 +47,8 @@ def randn(shape, frac_nan=None, chunks=None, seed=0):


 def randint(low, high=None, size=None, frac_minus=None, seed=0):
-    rng = np.random.RandomState(seed)
-    x = rng.randint(low, high, size)
+    rng = np.random.default_rng(seed)
+    x = rng.integers(low, high, size)
     if frac_minus is not None:
         inds = rng.choice(range(x.size), int(x.size * frac_minus))
         x.flat[inds] = -1
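The benchmark helpers above migrate from NumPy's legacy `RandomState` to the `Generator` API, which renames the sampling methods. A minimal side-by-side sketch of the renames (assuming only NumPy):

```python
import numpy as np

# Legacy API:                       Generator API:
#   np.random.RandomState(seed)  ->   np.random.default_rng(seed)
#   rng.randn(3, 4)              ->   rng.standard_normal((3, 4))
#   rng.randint(0, 10, size=5)   ->   rng.integers(0, 10, size=5)
rng = np.random.default_rng(0)

normal = rng.standard_normal((3, 4))  # standard-normal draws, shape (3, 4)
ints = rng.integers(0, 10, size=5)    # integers in the half-open range [0, 10)
```

Note that seeds are not stream-compatible across the two APIs, so the fixtures now generate different (but still reproducible) data for the same seed.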

asv_bench/benchmarks/dataset_io.py (+1 −1)

@@ -305,7 +305,7 @@ def make_ds(self, nfiles=10):
         ds.attrs = {"history": "created for xarray benchmarking"}

         self.ds_list.append(ds)
-        self.filenames_list.append("test_netcdf_%i.nc" % i)
+        self.filenames_list.append(f"test_netcdf_{i}.nc")


 class IOWriteMultipleNetCDF3(IOMultipleNetCDF):

asv_bench/benchmarks/reindexing.py (+1 −1)

@@ -11,7 +11,7 @@

 class Reindex:
     def setup(self):
-        data = np.random.RandomState(0).randn(ntime, nx, ny)
+        data = np.random.default_rng(0).random((ntime, nx, ny))
         self.ds = xr.Dataset(
             {"temperature": (("time", "x", "y"), data)},
             coords={"time": np.arange(ntime), "x": np.arange(nx), "y": np.arange(ny)},

asv_bench/benchmarks/rolling.py (+4 −1)

@@ -3,7 +3,7 @@

 import xarray as xr

-from . import parameterized, randn, requires_dask
+from . import _skip_slow, parameterized, randn, requires_dask

 nx = 3000
 long_nx = 30000
@@ -80,6 +80,9 @@ def time_rolling_construct(self, center, stride, use_bottleneck):
 class RollingDask(Rolling):
     def setup(self, *args, **kwargs):
         requires_dask()
+        # TODO: Lazily skipped in CI as it is very demanding and slow.
+        # Improve times and remove errors.
+        _skip_slow()
         super().setup(**kwargs)
         self.ds = self.ds.chunk({"x": 100, "y": 50, "t": 50})
         self.da_long = self.da_long.chunk({"x": 10000})

asv_bench/benchmarks/unstacking.py (+1 −1)

@@ -8,7 +8,7 @@

 class Unstacking:
     def setup(self):
-        data = np.random.RandomState(0).randn(250, 500)
+        data = np.random.default_rng(0).random((250, 500))
         self.da_full = xr.DataArray(data, dims=list("ab")).stack(flat_dim=[...])
         self.da_missing = self.da_full[:-1]
         self.df_missing = self.da_missing.to_pandas()

ci/requirements/all-but-numba.yml (−1)

@@ -12,7 +12,6 @@ dependencies:
   - cartopy
   - cftime
   - dask-core
-  - dask-expr # dask raises a deprecation warning without this, breaking doctests
   - distributed
   - flox
   - fsspec

ci/requirements/doc.yml (+2 −1)

@@ -10,7 +10,6 @@ dependencies:
   - cfgrib
   - kerchunk
   - dask-core>=2022.1
-  - dask-expr
   - hypothesis>=6.75.8
   - h5netcdf>=0.13
   - ipykernel
@@ -20,6 +19,7 @@ dependencies:
   - jupyter_client
   - matplotlib-base
   - nbsphinx
+  - ncdata
   - netcdf4>=1.5
   - numba
   - numpy>=2
@@ -30,6 +30,7 @@ dependencies:
   - pre-commit
   - pyarrow
   - pyproj
+  - rich # for Zarr tree()
   - scipy!=1.10.0
   - seaborn
   - setuptools

ci/requirements/environment-3.13.yml (+2 −1)

@@ -10,7 +10,6 @@ dependencies:
   - cartopy
   - cftime
   - dask-core
-  - dask-expr
   - distributed
   - flox
   - fsspec
@@ -47,3 +46,5 @@ dependencies:
   - toolz
   - typing_extensions
   - zarr
+  - pip:
+      - jax # no way to get cpu-only jaxlib from conda if gpu is present

ci/requirements/environment-windows-3.13.yml (+1 −1)

@@ -8,7 +8,6 @@ dependencies:
   - cartopy
   - cftime
   - dask-core
-  - dask-expr
   - distributed
   - flox
   - fsspec
@@ -29,6 +28,7 @@ dependencies:
   # - pint>=0.22
   - pip
   - pre-commit
+  - pyarrow # importing dask.dataframe raises an ImportError without this
   - pydap
   - pytest
   - pytest-cov

ci/requirements/environment-windows.yml (+1 −1)

@@ -8,7 +8,6 @@ dependencies:
   - cartopy
   - cftime
   - dask-core
-  - dask-expr
   - distributed
   - flox
   - fsspec
@@ -29,6 +28,7 @@ dependencies:
   # - pint>=0.22
   - pip
   - pre-commit
+  - pyarrow # importing dask.dataframe raises an ImportError without this
   - pydap
   - pytest
   - pytest-cov

ci/requirements/environment.yml (+2 −1)

@@ -10,7 +10,6 @@ dependencies:
   - cartopy
   - cftime
   - dask-core
-  - dask-expr # dask raises a deprecation warning without this, breaking doctests
   - distributed
   - flox
   - fsspec
@@ -49,3 +48,5 @@ dependencies:
   - toolz
   - typing_extensions
   - zarr
+  - pip:
+      - jax # no way to get cpu-only jaxlib from conda if gpu is present

doc/api.rst (+16)

@@ -626,12 +626,14 @@ Attributes relating to the recursive tree-like structure of a ``DataTree``.
    DataTree.depth
    DataTree.width
    DataTree.subtree
+   DataTree.subtree_with_keys
    DataTree.descendants
    DataTree.siblings
    DataTree.lineage
    DataTree.parents
    DataTree.ancestors
    DataTree.groups
+   DataTree.xindexes

 Data Contents
 -------------
@@ -645,6 +647,7 @@ This interface echoes that of ``xarray.Dataset``.
    DataTree.dims
    DataTree.sizes
    DataTree.data_vars
+   DataTree.ds
    DataTree.coords
    DataTree.attrs
    DataTree.encoding
@@ -1093,6 +1096,17 @@ DataTree methods
 .. Missing:
 .. ``open_mfdatatree``

+Encoding/Decoding
+=================
+
+Coder objects
+-------------
+
+.. autosummary::
+   :toctree: generated/
+
+   coders.CFDatetimeCoder
+
 Coordinates objects
 ===================

@@ -1210,6 +1224,7 @@ Dataset
    DatasetGroupBy.var
    DatasetGroupBy.dims
    DatasetGroupBy.groups
+   DatasetGroupBy.shuffle_to_chunks

 DataArray
 ---------
@@ -1241,6 +1256,7 @@ DataArray
    DataArrayGroupBy.var
    DataArrayGroupBy.dims
    DataArrayGroupBy.groups
+   DataArrayGroupBy.shuffle_to_chunks

 Grouper Objects
 ---------------

doc/ecosystem.rst (+1)

@@ -38,6 +38,7 @@ Geosciences
 - `salem <https://salem.readthedocs.io>`_: Adds geolocalised subsetting, masking, and plotting operations to xarray's data structures via accessors.
 - `SatPy <https://satpy.readthedocs.io/>`_ : Library for reading and manipulating meteorological remote sensing data and writing it to various image and data file formats.
 - `SARXarray <https://tudelftgeodesy.github.io/sarxarray/>`_: xarray extension for reading and processing large Synthetic Aperture Radar (SAR) data stacks.
+- `shxarray <https://shxarray.wobbly.earth/>`_: Convert, filter, and map geodesy related spherical harmonic representations of gravity and terrestrial water storage through an xarray extension.
 - `Spyfit <https://spyfit.readthedocs.io/en/master/>`_: FTIR spectroscopy of the atmosphere
 - `windspharm <https://ajdawson.github.io/windspharm/index.html>`_: Spherical
   harmonic wind analysis in Python.

doc/getting-started-guide/faq.rst (+3 −3)

@@ -173,9 +173,9 @@ integration with Cartopy_.

 We think the design decisions we have made for xarray (namely, basing it on
 pandas) make it a faster and more flexible data analysis tool. That said, Iris
-has some great domain specific functionality, and xarray includes
-methods for converting back and forth between xarray and Iris. See
-:py:meth:`~xarray.DataArray.to_iris` for more details.
+has some great domain specific functionality, and there are dedicated methods for
+converting back and forth between xarray and Iris. See
+:ref:`Reading and Writing Iris data <io.iris>` for more details.

 What other projects leverage xarray?
 ------------------------------------

doc/internals/index.rst (+1)

@@ -26,3 +26,4 @@ The pages in this section are intended for:
    how-to-add-new-backend
    how-to-create-custom-index
    zarr-encoding-spec
+   time-coding
