Description
Problem description
To speed up Docker image builds, it is fairly typical to separate "things that change often" from "things that don't".
For dependency management, this usually means installing dependencies in an early Docker layer (as they do not change often), then installing the local package(s) (which change at nearly every build, so far more often).
A very simple python Dockerfile may look like
```Dockerfile
FROM python:3.12.2-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY sample.py .
```
where requirements are installed before the local code is copied, to maximise cache hits.
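For contrast, a hypothetical ordering that copies the code together with the requirements defeats the cache: any edit to `sample.py` invalidates the copy layer and every layer after it, so the dependency install reruns on every build:

```Dockerfile
FROM python:3.12.2-slim
WORKDIR /app
# Copying the source alongside the requirements means any change to
# sample.py invalidates this layer and all subsequent layers...
COPY requirements.txt sample.py ./
# ...so this (slow) dependency install is re-executed on every build.
RUN pip install -r requirements.txt
```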
It is not clear how something similar can currently be achieved with pixi. A couple of options come to mind:
- Not include the local code in an environment, and use this environment to install dependencies only. In that case, the local code would need to be installed by something other than pixi, which does not feel ideal.
- Have two environments, one 'dependencies-only' and one 'dependencies + local code': install the first, then copy the code, then install the second. This may be better, but feels like a bit of a hack: we end up with two environments when we actually only need one, and pixi will do extra work to solve both.
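As a sketch of the second option, assuming pixi's multi-environment feature (the feature and environment names, and the `sample-package` path entry, are illustrative), the manifest could declare one environment without the path dependency and one with it:

```toml
[project]
name = "sample-package"
channels = ["conda-forge"]
platforms = ["linux-64"]

# Third-party dependencies live in a feature of their own...
[feature.deps.dependencies]
python = "3.12.*"

# ...while the local path dependency lives in a separate feature.
[feature.local.pypi-dependencies]
sample-package = { path = ".", editable = true }

[environments]
deps-only = ["deps"]
default = ["deps", "local"]
```

The Dockerfile would then run `pixi install -e deps-only` before copying the code and `pixi install -e default` after, at the cost of solving and storing two environments.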
Another (and perhaps better?) option could be extra flags for `pixi install`, for instance `--no-path-dependencies` and `--only-path-dependencies`, to respectively exclude path dependencies from the solve & install, or deal only with them (ideally without doing any solve work, bypassing as many steps as possible to speed things up).
The Dockerfile may then look like
```Dockerfile
FROM python:3.12.2-slim
WORKDIR /app
COPY pixi.lock pixi.toml ./
RUN pip install --locked --no-path-dependencies
COPY sample-package .
RUN pixi install --locked --only-path-dependencies
```