Changes for Python 3.13 support #569

Merged: 7 commits, Jul 8, 2025
14 changes: 7 additions & 7 deletions .github/workflows/unit-tests-workflow.yml
@@ -6,7 +6,7 @@ jobs:
  test:
    strategy:
      matrix:
-        python-version: ['3.8', '3.9', '3.10', '3.11']
+        python-version: ['3.10', '3.11', '3.12', '3.13']

    runs-on: ubuntu-latest

@@ -23,13 +23,13 @@ jobs:
      with:
        python-version: ${{ matrix.python-version }}

-      - name: Install dependencies
-        run: python -m pip install --upgrade pip poetry
+      - name: Install uv
+        uses: astral-sh/setup-uv@v3
+        with:
+          version: "latest"

      - name: Set up project
-        run: |
-          cd $GITHUB_WORKSPACE
-          poetry install
+        run: uv sync --dev

      - name: Run unit tests
-        run: poetry run pytest
+        run: uv run pytest
6 changes: 2 additions & 4 deletions CONTRIBUTING.md
@@ -68,9 +68,7 @@ Follow these rules and you should succeed without a problem.
### Run the tests
Before you submit a pull request, please run the entire test suite via:

-`$ export NUMBA_DISABLE_JIT=1`
-`$ python setup.py test`
-`$ unset NUMBA_DISABLE_JIT`
+`$ uv run pytest`

The first thing the core committers will do is run this command. Any pull request that fails this test suite will be rejected.

@@ -114,7 +112,7 @@ First we pull the code into a local branch:

Then we run the tests:

-`$ python -m unittest tests/test_*.py`
+`$ uv run pytest`

We finish with a merge and push to GitHub:

47 changes: 47 additions & 0 deletions Dockerfile
@@ -0,0 +1,47 @@
FROM python:3.11-slim AS builder

LABEL authors="[email protected]"

# Install uv
COPY --from=ghcr.io/astral-sh/uv:latest /uv /bin/uv

# Set working directory
WORKDIR /app

# Copy dependency files
COPY pyproject.toml uv.lock ./

# Install dependencies
RUN uv sync --frozen --no-dev

# Production stage
FROM python:3.11-slim

# Install runtime dependencies
RUN apt-get update && apt-get --no-install-recommends install -y \
build-essential \
libhdf5-dev \
libnetcdf-dev \
&& rm -rf /var/lib/apt/lists/*

# Copy uv and virtual environment from builder
COPY --from=builder /app/.venv /app/.venv
COPY --from=builder /bin/uv /bin/uv

# Set working directory
WORKDIR /app

# Copy source code
COPY src/ ./src/
COPY pyproject.toml ./

# Create non-root user
RUN useradd --create-home --shell /bin/bash climate
USER climate

# Set environment variables
ENV PATH="/app/.venv/bin:$PATH"
ENV PYTHONPATH="/app/src:$PYTHONPATH"

# Set entrypoint
ENTRYPOINT ["python", "-m", "climate_indices"]
1 change: 0 additions & 1 deletion docs/conf.py
@@ -1,4 +1,3 @@
-# -*- coding: utf-8 -*-
#
# Configuration file for the Sphinx documentation builder.
#
27 changes: 14 additions & 13 deletions docs/index.rst
@@ -64,14 +64,17 @@ Quick Start
::

# create and activate a Python virtual environment with conda
-conda create -n myvenv poetry pytest
+conda create -n myvenv python=3.10
conda activate myvenv

+# install uv
+pip install uv
+
# install the package
-python -m poetry install
+uv sync --dev

# optionally run the unit tests suite
-python -m poetry run pytest
+uv run pytest


Installation
@@ -81,35 +84,33 @@ From PyPI

Install directly from PyPI::

-python -m pip install climate-indices
+uv add climate-indices

From source
^^^^^^^^^^^^^^

-In order to build and install the package from source we need to first install `poetry <https://python-poetry.org/>`__::
+In order to build and install the package from source we need to first install `uv <https://docs.astral.sh/uv/>`__::

-python -m pip install poetry
+pip install uv

Then install the package from source::

-python -m poetry install
+uv sync --dev

Next (optional) run the unit test suite to validate the installation::

-python -m pytest tests
+uv run pytest

the above should display output similar to this::

======================= 38 passed, 18 warnings in 12.19s =======================

Finally, show the package installed into the environment::

-conda list climate-indices
+uv pip list | grep climate-indices

-# packages in environment at /Users/jadams/miniconda3/envs/climate381:
-#
-# Name            Version    Build    Channel
-climate-indices   1.0.16     pypi_0   pypi
+# climate-indices v2.1.0 (editable)
+# + climate-indices==2.1.0 (from file:///path/to/climate_indices)



7 changes: 2 additions & 5 deletions notebooks/concurrent_shared_memory_example.ipynb
@@ -8,11 +8,8 @@
"source": [
"import concurrent.futures\n",
"from multiprocessing import cpu_count, shared_memory\n",
"import sys\n",
"from typing import Dict\n",
"\n",
"import numpy as np\n",
"import xarray as xr"
"import numpy as np"
]
},
{
@@ -59,7 +56,7 @@
"outputs": [],
"source": [
"def shm_add_average(\n",
" arguments: Dict,\n",
" arguments: dict,\n",
"):\n",
" existing_shm_input = shared_memory.SharedMemory(name=arguments[\"shm_name_input\"])\n",
" shm_ary_input = np.ndarray(\n",
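The notebook's `shm_add_average` worker attaches to an existing shared-memory block by name and modifies it in place. A minimal, self-contained sketch of the same pattern follows; the `"shm_name_input"` key matches the notebook, while the `"shape"`/`"dtype"` keys and the surrounding setup are illustrative assumptions:

```python
import numpy as np
from multiprocessing import shared_memory

def shm_add_average(arguments: dict) -> None:
    # attach to the shared-memory block created by the parent process
    existing = shared_memory.SharedMemory(name=arguments["shm_name_input"])
    values = np.ndarray(arguments["shape"], dtype=arguments["dtype"], buffer=existing.buf)
    values += values.mean()  # add the array's average to every element, in place
    existing.close()

# parent side: create a block, copy data in, run the worker, read the result
data = np.array([1.0, 2.0, 3.0])
shm = shared_memory.SharedMemory(create=True, size=data.nbytes)
view = np.ndarray(data.shape, dtype=data.dtype, buffer=shm.buf)
view[:] = data
shm_add_average({"shm_name_input": shm.name, "shape": data.shape, "dtype": data.dtype})
result = view.copy()  # mean of 2.0 added to each element
shm.close()
shm.unlink()
```

Because workers receive only the block's name and metadata, the array itself is never pickled, which is the point of the shared-memory approach for large grids.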
22 changes: 6 additions & 16 deletions notebooks/generate_fitting_parameters_nclimgrid.ipynb
@@ -7,16 +7,14 @@
"outputs": [],
"source": [
"import sys\n",
"from typing import Dict\n",
"\n",
"import matplotlib\n",
"import numpy as np\n",
"import xarray as xr\n",
"\n",
"climate_indices_home_path = \"/home/james/git/climate_indices\"\n",
"if climate_indices_home_path not in sys.path:\n",
" sys.path.insert(climate_indices_home_path)\n",
"from climate_indices import compute, indices, utils\n",
"from climate_indices import compute, indices\n",
"\n",
"%matplotlib inline"
]
@@ -115,7 +113,7 @@
"source": [
"def compute_gammas(\n",
" da_precip: xr.DataArray,\n",
" gamma_coords: Dict,\n",
" gamma_coords: dict,\n",
" scale: int,\n",
" calibration_year_initial,\n",
" calibration_year_final,\n",
@@ -139,9 +137,7 @@
" values = da_precip[lat_index, lon_index]\n",
"\n",
" # skip over this grid cell if all NaN values\n",
" if (np.ma.is_masked(values) and values.mask.all()) or np.all(\n",
" np.isnan(values)\n",
" ):\n",
" if (np.ma.is_masked(values) and values.mask.all()) or np.all(np.isnan(values)):\n",
" continue\n",
"\n",
" # convolve to scale\n",
@@ -220,9 +216,7 @@
" values = da_precip[lat_index, lon_index]\n",
"\n",
" # skip over this grid cell if all NaN values\n",
" if (np.ma.is_masked(values) and values.mask.all()) or np.all(\n",
" np.isnan(values)\n",
" ):\n",
" if (np.ma.is_masked(values) and values.mask.all()) or np.all(np.isnan(values)):\n",
" continue\n",
"\n",
" gamma_parameters = {\n",
@@ -289,9 +283,7 @@
" \"geospatial_lat_units\",\n",
" \"geospatial_lon_units\",\n",
"]\n",
"global_attrs = {\n",
" key: value for (key, value) in ds_prcp.attrs.items() if key in attrs_to_copy\n",
"}"
"global_attrs = {key: value for (key, value) in ds_prcp.attrs.items() if key in attrs_to_copy}"
]
},
{
@@ -437,9 +429,7 @@
" values = da_precip[lat_index, lon_index]\n",
"\n",
" # skip over this grid cell if all NaN values\n",
" if (np.ma.is_masked(values) and values.mask.all()) or np.all(\n",
" np.isnan(values)\n",
" ):\n",
" if (np.ma.is_masked(values) and values.mask.all()) or np.all(np.isnan(values)):\n",
" continue\n",
"\n",
" # compute the SPI\n",
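Several hunks in this notebook reflow the same guard that skips grid cells with no usable data. A standalone sketch of that condition (the helper name is illustrative; the notebook inlines the expression):

```python
import numpy as np

def all_values_missing(values) -> bool:
    # a grid cell is skipped when it is fully masked or contains only NaNs,
    # exactly the condition used in the notebook's per-cell loops
    return bool((np.ma.is_masked(values) and values.mask.all()) or np.all(np.isnan(values)))
```

Factoring the check out like this makes the three identical loop bodies easier to keep in sync.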
19 changes: 5 additions & 14 deletions notebooks/spi_simple.ipynb
@@ -170,7 +170,7 @@
"metadata": {},
"outputs": [],
"source": [
"from climate_indices.compute import scale_values, Periodicity"
"from climate_indices.compute import Periodicity, scale_values"
]
},
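The imported `scale_values` convolves a precipitation series to the requested scale (e.g. 3-month sums for SPI-3). The library's actual implementation is not shown in this diff; the following is a hypothetical stand-in illustrating the idea of a sliding sum with NaN padding at the start, where no full window exists:

```python
import numpy as np

def sliding_scale_sum(values: np.ndarray, scale: int) -> np.ndarray:
    # sum over each window of `scale` consecutive time steps
    sums = np.convolve(values, np.ones(scale), mode="valid")
    # the first scale-1 positions have no complete window, so pad with NaN
    return np.concatenate([np.full(scale - 1, np.nan), sums])
```

For a monthly series this turns raw precipitation into accumulations over the chosen time scale before distribution fitting.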
{
Expand Down Expand Up @@ -328,9 +328,7 @@
"# do it \"by hand\" by looping over each lat/lon point\n",
"for lat_index in range(len(da_one_more_lo_looper[\"lat\"])):\n",
" for lon_index in range(len(da_one_more_lo_looper[\"lon\"])):\n",
" da_one_more_lo_looper[lat_index, lon_index] = add_one(\n",
" da_one_more_lo_looper[lat_index, lon_index]\n",
" )"
" da_one_more_lo_looper[lat_index, lon_index] = add_one(da_one_more_lo_looper[lat_index, lon_index])"
]
},
{
@@ -363,9 +361,7 @@
"# do it \"by hand\" by looping over each lat/lon point\n",
"for lat_index in range(len(da_one_more_hi_looper[\"lat\"])):\n",
" for lon_index in range(len(da_one_more_hi_looper[\"lon\"])):\n",
" da_one_more_hi_looper[lat_index, lon_index] = add_one(\n",
" da_one_more_hi_looper[lat_index, lon_index]\n",
" )"
" da_one_more_hi_looper[lat_index, lon_index] = add_one(da_one_more_hi_looper[lat_index, lon_index])"
]
},
{
@@ -426,10 +422,7 @@
" if len(shape) == 2:\n",
" values = values.flatten()\n",
" elif len(shape) != 1:\n",
" message = (\n",
" \"Invalid shape of input array: {shape}\".format(shape=shape)\n",
" + \" -- only 1-D and 2-D arrays are supported\"\n",
" )\n",
" message = f\"Invalid shape of input array: {shape}\" + \" -- only 1-D and 2-D arrays are supported\"\n",
" _logger.error(message)\n",
" raise ValueError(message)\n",
"\n",
@@ -484,9 +477,7 @@
" )\n",
"\n",
" else:\n",
" message = \"Unsupported distribution argument: \" + \"{dist}\".format(\n",
" dist=distribution\n",
" )\n",
" message = \"Unsupported distribution argument: \" + f\"{distribution}\"\n",
" _logger.error(message)\n",
" raise ValueError(message)\n",
"\n",
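The shape-validation logic reformatted in the last hunk accepts 1-D input, flattens 2-D input, and rejects anything else. As a standalone sketch (the helper name is illustrative; the notebook embeds this check in a larger function):

```python
import numpy as np

def as_1d(values: np.ndarray) -> np.ndarray:
    # flatten 2-D input so downstream code can work on a single time series
    if values.ndim == 2:
        return values.flatten()
    # anything other than 1-D or 2-D is an error
    if values.ndim != 1:
        raise ValueError(
            f"Invalid shape of input array: {values.shape} -- only 1-D and 2-D arrays are supported"
        )
    return values
```

Validating and normalizing the shape up front keeps the distribution-fitting code free of dimensional special cases.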