
fix: slim container image#314

Open
roivanov wants to merge 7 commits into NVIDIA:main from
roivanov:fix-313

Conversation

@roivanov
Contributor

@roivanov roivanov commented Apr 1, 2026

This change switches from the rapidsai container image to the smaller miniconda image, removing a potential dependency on tools that are not needed for the build.

Summary by CodeRabbit

  • Chores
    • Simplified CI build matrix to reduce permutations and speed builds.
    • Added a job to fetch and inject standardized build tools and switched to a consistent base runtime for reproducible CI runs.
    • Introduced a pre-build bootstrap that installs/configures tooling and caching, enforces deterministic Python selection, and standardizes artifact paths for more stable package builds.

@coderabbitai
Contributor

coderabbitai bot commented Apr 1, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.

Use the checkboxes below for quick actions:

  • ▶️ Resume reviews
  • 🔍 Trigger review
📝 Walkthrough

The CI narrows the build matrix to ARCH and PY_VER, adds a fetch-gha-tools job that publishes a rapids-gha-tools artifact, and runs matrix builds in a fixed condaforge/miniforge3 container. The build script bootstraps apt packages, renders .condarc, installs/configures sccache, and installs an ABI-aware Python into conda base.

Changes

Cohort / File(s) Summary
Workflow Build Matrix & Job Orchestration
.github/workflows/conda-python-build.yaml
Matrix reduced to [ARCH, PY_VER]; added fetch-gha-tools job (checks out rapidsai/gha-tools@v0.0.198, uploads rapids-gha-tools artifact); build depends on compute-matrix and fetch-gha-tools; container fixed to condaforge/miniforge3:26.1.1-3; matrix jobs download artifact, add ${GITHUB_WORKSPACE}/tools to PATH, mark workspace safe for git, and set env vars RAPIDS_CONDA_BLD_ROOT_DIR/RAPIDS_CONDA_BLD_OUTPUT_DIR; SCCACHE_VER and PYTHON_VERSION set from matrix.
Build Script Bootstrap & Python ABI Handling
ci/build_conda_python.sh
Adds pre-build bootstrap: make tools/rapids-* executable; apt installs (jq, gh, unzip, curl, gettext); render remote condarc.tmpl to ~/.condarc via envsubst; install/configure sccache using SCCACHE_VER; update base conda env with rapids-mamba-retry and conda-build; compute ABI-aware Python selector (handles padded version and >=3.12), then install target Python into base before existing conda-build steps.
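The ABI-aware Python selection described above can be sketched as follows. This is a hedged reconstruction from the summary, not the script's actual contents: the variable names and the 3.12 threshold match the walkthrough, but everything else is assumed.

```shell
# Sketch of the ABI-aware Python selector (assumed implementation, not the PR's exact code).
PYTHON_VERSION="${PYTHON_VERSION:-3.13}"
PYTHON_MAJOR_VERSION="${PYTHON_VERSION%%.*}"
PYTHON_MINOR_VERSION="${PYTHON_VERSION#*.}"
# Numeric comparison avoids the lexicographic pitfalls of [[ "$a" > "$b" ]] on versions.
if (( PYTHON_MAJOR_VERSION > 3 || (PYTHON_MAJOR_VERSION == 3 && PYTHON_MINOR_VERSION > 12) )); then
    # Free-threaded-era builds carry an explicit cpXY ABI tag in the build string.
    PYTHON_ABI_TAG="cp${PYTHON_MAJOR_VERSION}${PYTHON_MINOR_VERSION}"
else
    PYTHON_ABI_TAG="cpython"
fi
echo "${PYTHON_ABI_TAG}"
```

For PYTHON_VERSION=3.13 this prints cp313; for 3.10 it prints cpython.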

Sequence Diagram(s)

sequenceDiagram
  participant GH as "GitHub Actions"
  participant Fetch as "fetch-gha-tools"
  participant Artifact as "Artifact Storage"
  participant Matrix as "build (matrix)"
  participant Container as "miniforge container"
  participant Bootstrap as "bootstrap script"

  GH->>Fetch: run fetch-gha-tools job
  Fetch->>Artifact: checkout gha-tools@v0.0.198\nupload rapids-gha-tools artifact
  GH->>Matrix: start matrix jobs (depend on fetch-gha-tools)
  Matrix->>Artifact: download rapids-gha-tools artifact
  Matrix->>Container: run in condaforge/miniforge3:26.1.1-3
  Container->>Bootstrap: make tools executable\napt installs\nrender .condarc\ninstall/configure sccache\ninstall ABI-aware Python
  Bootstrap->>Container: prepared conda base for conda-build
  Container->>GH: run conda-build steps\nupload build artifacts

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~50 minutes

Poem

🐰 I hopped to fetch tools, then painted .condarc bright,
Tuned sccache and Python to snug the ABI right,
Miniforge warmed the burrow, builds scampered in a line,
Artifacts packed tidy — workflows hum and shine!

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
  • Description Check — ✅ Passed: Check skipped; CodeRabbit’s high-level summary is enabled.
  • Title check — ✅ Passed: The title 'fix: slim container image' directly describes the main change: switching from rapidsai/ci-conda to condaforge/miniforge3 to reduce container size and dependencies.
  • Docstring Coverage — ✅ Passed: No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


Comment @coderabbitai help to get the list of available commands and usage tips.

@roivanov roivanov changed the title from "fix: misc and update license" to "fix: improve llvm caching" Apr 1, 2026
@github-actions

github-actions bot commented Apr 1, 2026

Doc Preview CI
🚀 View pre-built docs at
https://NVIDIA.github.io/numbast/pr-preview/pr-314/

Preview will be ready when GitHub Pages deployment finishes.

@roivanov roivanov force-pushed the fix-313 branch 2 times, most recently from 4df0d3f to 239fdab on April 2, 2026 03:04
@roivanov roivanov closed this Apr 2, 2026
@roivanov roivanov reopened this Apr 2, 2026
@roivanov roivanov changed the title from "fix: improve llvm caching" to "fix: slim container image" Apr 2, 2026
@roivanov roivanov marked this pull request as ready for review April 3, 2026 20:04
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 86-89: The checkout step that pulls rapidsai/gha-tools currently
uses an unstable branch ref ("ref: main"); update the actions/checkout
invocation that references repository: rapidsai/gha-tools to pin ref to an
immutable release tag or commit SHA (e.g., a specific tag like v0.0.198 or a
commit SHA) instead of "main" so the workflow reproducibly uses a fixed
gha-tools version.
- Around line 92-95: The workflow step that adds the tools directory to PATH is
pointing to the wrong location: replace the incorrect PATH entry
"${GITHUB_WORKSPACE}/tools" with the actual extracted location
"${GITHUB_WORKSPACE}/gha-tools/tools" in the step that modifies PATH (the step
that uses the PATH env or echo "::add-path::" pattern); ensure the same
correction is applied to the corresponding lines around the artifact extraction
(the step that downloads/extracts gha-tools) so all references consistently use
"${GITHUB_WORKSPACE}/gha-tools/tools".

In `@ci/build_conda_python.sh`:
- Around line 12-13: The curl command fetching the `.condarc` template from
rapidsai/ci-imgs/main is using the mutable main branch; update the script so the
raw URL is pinned to an immutable commit SHA (or replace the remote fetch by
reading a vendored local copy of the template) before running envsubst and tee.
Locate the curl pipeline in ci/build_conda_python.sh (the line using curl -fsSL
"https://raw.githubusercontent.com/rapidsai/ci-imgs/main/context/condarc.tmpl" |
envsubst | tee ~/.condarc) and change it to use a specific commit SHA in the
raw.githubusercontent.com URL or switch to reading a checked-in condarc.tmpl
file and then run envsubst | tee ~/.condarc.
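The pinning suggested above can be sketched as below. The SHA is a placeholder, not a real rapidsai/ci-imgs commit, and the actual fetch is left commented out because it needs network access and gettext's envsubst.

```shell
# Sketch only: pin the condarc template fetch to an immutable ref instead of "main".
# CI_IMGS_SHA is a placeholder; substitute a real commit from rapidsai/ci-imgs.
CI_IMGS_SHA="0123456789abcdef0123456789abcdef01234567"
CONDARC_URL="https://raw.githubusercontent.com/rapidsai/ci-imgs/${CI_IMGS_SHA}/context/condarc.tmpl"
echo "${CONDARC_URL}"
# Real usage (requires network and envsubst), kept commented here:
# curl -fsSL "${CONDARC_URL}" | envsubst | tee ~/.condarc
```

Pinning to a SHA (rather than a tag) guarantees the template cannot change underneath the build even if the upstream repo moves its tags.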

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: ASSERTIVE

Plan: Pro

Run ID: e5fca8e8-33a4-4d1a-bcc6-164c05198109

📥 Commits

Reviewing files that changed from the base of the PR and between fd5e8ad and 4a0bc51.

📒 Files selected for processing (2)
  • .github/workflows/conda-python-build.yaml
  • ci/build_conda_python.sh

Comment on lines +92 to +95
- uses: actions/upload-artifact@v6
with:
name: rapids-gha-tools
path: gha-tools
Contributor


⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

# First, let's check the actual workflow file
cat -n .github/workflows/conda-python-build.yaml | sed -n '85,135p'

Repository: NVIDIA/numbast

Length of output: 2325


🏁 Script executed:

# Check if pyproject.toml exists in the repo root
ls -la pyproject.toml 2>/dev/null || echo "No pyproject.toml in repo root"

Repository: NVIDIA/numbast

Length of output: 120


🏁 Script executed:

# Look for conda recipes that read pyproject.toml
find . -name "meta.yaml" -type f 2>/dev/null | head -20

Repository: NVIDIA/numbast

Length of output: 173


🏁 Script executed:

# Check the fetch-gha-tools job to understand what's being uploaded
cat -n .github/workflows/conda-python-build.yaml | grep -A 30 "fetch-gha-tools"

Repository: NVIDIA/numbast

Length of output: 3237


🌐 Web query:

GitHub Actions upload-artifact v6 download-artifact path extraction semantics

💡 Result:

In GitHub Actions, upload-artifact@v6 (released Dec 2025, Node.js 24 runtime) and download-artifact@v6 create and handle artifacts as immutable ZIP archives containing the specified files/directories, with the relative path hierarchy from upload preserved.

Key semantics for path extraction on download:
  • Artifacts are ZIP archives of the matched files/paths.
  • upload-artifact@v6 — single path: full relative structure preserved; wildcards: hierarchy preserved after the first wildcard; multiple paths: the least common ancestor becomes the ZIP root (excludes don't affect structure).
  • download-artifact@v6 — single artifact (by name or ID): contents extracted directly to the specified path (no extra artifact-named subdirectory); multiple artifacts (no name, or a pattern matching more than one): each extracts to path/<artifact-name>/ unless merge-multiple: true merges everything into path/; default path is $GITHUB_WORKSPACE; ~ expansion is supported.
  • In download-artifact v5, single-ID downloads nested under the artifact name (made consistent in v5+).
  • Permissions: ZIP strips executable bits (directories 755, files 644); tar your files first to preserve them.

Examples:

  # Upload preserves structure
  - uses: actions/upload-artifact@v6
    with:
      name: build
      path: |
        dist/
        !dist/*.tmp

  # Download single: extracts dist/ directly under path/
  - uses: actions/download-artifact@v6
    with:
      name: build
      path: ./output        # results in ./output/dist/...

  # Download all: extracts to path/<artifact-name>/, or merged with merge-multiple
  - uses: actions/download-artifact@v6
    with:
      path: ./output        # ./output/build/dist/...
      merge-multiple: true  # merges all to ./output/dist/...

Sources confirm v6 maintains v4+ behaviors with Node 24 updates; there are no path-semantics changes from v4 through v6.



Fix the PATH reference to point to the correct location of the tools directory.

The artifact download at lines 123-126 extracts the gha-tools repository contents to a subdirectory within the workspace. However, line 128 attempts to add ${GITHUB_WORKSPACE}/tools to PATH, when the tools are actually located at ${GITHUB_WORKSPACE}/gha-tools/tools. This will cause the build script to fail when trying to locate the tools.

Proposed fix
      - uses: actions/download-artifact@v6
        with:
          name: rapids-gha-tools
          path: ${{ github.workspace }}
      - name: Add RAPIDS gha-tools to PATH
-       run: echo "${GITHUB_WORKSPACE}/tools" | tee -a "${GITHUB_PATH}"
+       run: echo "${GITHUB_WORKSPACE}/gha-tools/tools" | tee -a "${GITHUB_PATH}"

Also applies to: 123-128
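The layout mismatch described above can be reproduced locally as a toy sketch, with no GitHub services involved. tar stands in for the artifact ZIP, and all paths here are made up for illustration.

```shell
# Toy reproduction: uploading the directory gha-tools/ and extracting at the
# workspace root leaves the scripts under gha-tools/tools/, not tools/.
demo=$(mktemp -d)
mkdir -p "${demo}/checkout/gha-tools/tools"
touch "${demo}/checkout/gha-tools/tools/rapids-retry"
# "upload": archive the gha-tools directory, preserving its top-level name
tar -C "${demo}/checkout" -cf "${demo}/artifact.tar" gha-tools
# "download": extract into the workspace root, as download-artifact does for a single artifact
mkdir -p "${demo}/workspace"
tar -C "${demo}/workspace" -xf "${demo}/artifact.tar"
ls "${demo}/workspace/gha-tools/tools"                         # the scripts are here
[ -d "${demo}/workspace/tools" ] || echo "no top-level tools/" # ...and not here
```

A PATH entry of ${GITHUB_WORKSPACE}/tools would therefore point at a directory that does not exist after extraction.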

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/conda-python-build.yaml around lines 92 - 95, The workflow
step that adds the tools directory to PATH is pointing to the wrong location:
replace the incorrect PATH entry "${GITHUB_WORKSPACE}/tools" with the actual
extracted location "${GITHUB_WORKSPACE}/gha-tools/tools" in the step that
modifies PATH (the step that uses the PATH env or echo "::add-path::" pattern);
ensure the same correction is applied to the corresponding lines around the
artifact extraction (the step that downloads/extracts gha-tools) so all
references consistently use "${GITHUB_WORKSPACE}/gha-tools/tools".

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/conda-python-build.yaml:
- Line 108: The workflow currently uses an unpinned base image via the image
field "condaforge/miniforge3:latest"; replace that with a specific, immutable
reference (either a released version tag like "condaforge/miniforge3:23.07-1" or
a SHA256 digest) so the CI run is reproducible—update the image value where
"image: condaforge/miniforge3:latest" appears to the chosen tag or digest and
run the workflow to verify the runner uses the pinned image.

In `@ci/build_conda_python.sh`:
- Around line 26-31: The conditional using string comparison (if [[
"$PYTHON_VERSION_PADDED" > "3.12" ]]) is fragile; change it to perform numeric
comparison by comparing integer major/minor parts: parse PYTHON_MAJOR_VERSION
and PYTHON_MINOR_VERSION (or split PYTHON_VERSION_PADDED) and compare
PYTHON_MAJOR_VERSION first then PYTHON_MINOR_VERSION against 3 and 12
respectively, then set PYTHON_ABI_TAG as before
(PYTHON_ABI_TAG="cp${PYTHON_MAJOR_VERSION}${PYTHON_MINOR_VERSION}" else
"cpython"); ensure the install line using PYTHON_ABI_TAG remains unchanged.
- Line 31: The conda MatchSpec in the rapids-mamba-retry install line is
malformed; update the package spec used in the command that references
PYTHON_VERSION, PYTHON_UPPER_BOUND and PYTHON_ABI_TAG to use valid conda syntax
— either separate the build string from the version range (e.g.,
"python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND} *_${PYTHON_ABI_TAG}") or use
bracket notation with explicit version and build fields (e.g.,
"python[version='>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}',build='*_${PYTHON_ABI_TAG}']"),
and replace the current
`"python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}=*_${PYTHON_ABI_TAG}"` token
with one of these valid forms in the rapids-mamba-retry install invocation.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: ASSERTIVE

Plan: Pro

Run ID: 542e83c2-38f3-432d-9cdd-1c3e426cd0d2

📥 Commits

Reviewing files that changed from the base of the PR and between 4a0bc51 and 9e8e741.

📒 Files selected for processing (2)
  • .github/workflows/conda-python-build.yaml
  • ci/build_conda_python.sh

Comment on lines +26 to +31
if [[ "$PYTHON_VERSION_PADDED" > "3.12" ]]; then
PYTHON_ABI_TAG="cp${PYTHON_MAJOR_VERSION}${PYTHON_MINOR_VERSION}"
else
PYTHON_ABI_TAG="cpython"
fi
rapids-mamba-retry install -y -n base "python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}=*_${PYTHON_ABI_TAG}"
Contributor


⚠️ Potential issue | 🟡 Minor

Bash string comparison for version numbers is fragile.

The > operator in [[ ... ]] performs lexicographic (string) comparison, not numeric comparison. While "3.12" > "3.13" happens to work correctly, this pattern will break for edge cases like comparing "3.9" (which would incorrectly sort after "3.12" lexicographically since "9" > "1").

Given that the matrix includes versions 3.10–3.14, this currently works, but the logic is error-prone if 3.9 or earlier is ever added.

🔧 Proposed fix using numeric comparison
-PYTHON_MINOR_PADDED=$(printf "%02d" "$PYTHON_MINOR_VERSION")
-PYTHON_VERSION_PADDED="${PYTHON_MAJOR_VERSION}.${PYTHON_MINOR_PADDED}"
-if [[ "$PYTHON_VERSION_PADDED" > "3.12" ]]; then
+if [[ "$PYTHON_MAJOR_VERSION" -gt 3 ]] || { [[ "$PYTHON_MAJOR_VERSION" -eq 3 ]] && [[ "$PYTHON_MINOR_VERSION" -gt 12 ]]; }; then
     PYTHON_ABI_TAG="cp${PYTHON_MAJOR_VERSION}${PYTHON_MINOR_VERSION}"
 else
     PYTHON_ABI_TAG="cpython"
 fi
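The pitfall can be demonstrated in a few lines. This snippet is an illustration, not code from the PR:

```shell
# Why string comparison misorders version numbers in bash.
a="3.9"; b="3.12"
if [[ "$a" > "$b" ]]; then
    echo "string compare: 3.9 sorts after 3.12"   # fires, because '9' > '1' character-wise
fi
if (( ${a#*.} > ${b#*.} )); then
    echo "numeric compare agrees"                  # does not fire: 9 < 12
fi
```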
🧰 Tools
🪛 Shellcheck (0.11.0)

[error] 26-26: Decimals are not supported. Either use integers only, or use bc or awk to compare.

(SC2072)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@ci/build_conda_python.sh` around lines 26 - 31, The conditional using string
comparison (if [[ "$PYTHON_VERSION_PADDED" > "3.12" ]]) is fragile; change it to
perform numeric comparison by comparing integer major/minor parts: parse
PYTHON_MAJOR_VERSION and PYTHON_MINOR_VERSION (or split PYTHON_VERSION_PADDED)
and compare PYTHON_MAJOR_VERSION first then PYTHON_MINOR_VERSION against 3 and
12 respectively, then set PYTHON_ABI_TAG as before
(PYTHON_ABI_TAG="cp${PYTHON_MAJOR_VERSION}${PYTHON_MINOR_VERSION}" else
"cpython"); ensure the install line using PYTHON_ABI_TAG remains unchanged.

else
PYTHON_ABI_TAG="cpython"
fi
rapids-mamba-retry install -y -n base "python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}=*_${PYTHON_ABI_TAG}"
Contributor


⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

conda install version constraint syntax with build string selector

💡 Result:

The syntax for conda install with a version constraint and build string selector uses the MatchSpec format: conda install package==version=build_string for exact matches, or variations like package=version=build_string (a single = before the build in some contexts). For patterns, use wildcards like package=version=py3* or bracket notation package[build=py3*]. Examples:
  • Exact: conda install numpy==1.23.1=py39hc836922_0
  • Version with build pattern: conda install numpy=1.23.1=py39*
  • Build selector: conda install numpy[build=py39*]
  • Exclude build: conda install numpy[build!=hc292b87_0]
This works in conda install, conda search, and environment files. Use quotes for complex specs with spaces or operators: conda install "numpy>=1.23=py39*".



🏁 Script executed:

cd ci && wc -l build_conda_python.sh

Repository: NVIDIA/numbast

Length of output: 82


🏁 Script executed:

cd ci && sed -n '25,35p' build_conda_python.sh

Repository: NVIDIA/numbast

Length of output: 453


🏁 Script executed:

rg "rapids-mamba-retry install" -A 2 -B 2

Repository: NVIDIA/numbast

Length of output: 356


🌐 Web query:

conda MatchSpec syntax multiple version constraints build string >= < operators

💡 Result:

Conda MatchSpec syntax supports multiple version constraints using the operators >=, <, <=, >, ==, and !=, combined with the logical operators , (AND) and | (OR). Build strings can be specified exactly or with wildcards/patterns. Key syntax elements:
  • Positional form: name version build — e.g. numpy==1.21.0=py39h1234567_0 (exact version and build); foo 1.0 py27_0 → foo==1.0=py27_0; ">=1.0 , < 2.0 py34_0" → version '>=1.0,<2.0' with build py34_0.
  • Bracket notation for explicit constraints: conda-forge::foo[version='>=1.0']; numpy[version='>=1.7,<2.0', build=py3*].
  • Version specifiers: single (>=1.0, <2.0, ==1.2.3, 1.2.* fuzzy) or multiple (>=1.0,<2.0 for AND; >1.8,<2|==1.7 for AND then OR). Spaces around operators are ignored or normalized; there are no parentheses for grouping (they may cause parsing issues).
  • Build string: exact like =py27_0 or a pattern like py2*, given after the version with = or in brackets [build=py2*].
Canonical form: (channel(/subdir):)name(version(build))[key=value,...]
Usage examples: conda install "numpy>=1.20,<1.22 py3*"; conda search conda-forge::python[version='>=3.9',build='h*']. This is based on conda's official MatchSpec documentation (conda 26.1.1) and source code, which confirm parsing of complex version+build specs like ">=1.0 , < 2.0 py34_0" and bracket syntax for operators.



🏁 Script executed:

# Search for similar patterns in the codebase
rg "python>=" --type sh -A 1 -B 1

Repository: NVIDIA/numbast

Length of output: 245


Fix the conda version constraint syntax to properly specify the build string.

The version constraint "python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}=*_${PYTHON_ABI_TAG}" does not follow standard conda MatchSpec format. The build pattern =*_${PYTHON_ABI_TAG} is ambiguously attached only to the upper bound. Use standard syntax instead:

"python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND} *_${PYTHON_ABI_TAG}"

Or use bracket notation for explicit constraints:

"python[version='>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}', build='*_${PYTHON_ABI_TAG}']"
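The two valid forms can be checked by expanding them with sample values. The version numbers below are arbitrary placeholders:

```shell
# Illustration only: the two valid MatchSpec shapes with sample values substituted.
PYTHON_VERSION="3.13"; PYTHON_UPPER_BOUND="3.14"; PYTHON_ABI_TAG="cp313"
spec_space="python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND} *_${PYTHON_ABI_TAG}"
spec_bracket="python[version='>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}',build='*_${PYTHON_ABI_TAG}']"
echo "${spec_space}"    # space separates the build pattern from the version range
echo "${spec_bracket}"  # bracket form names version and build fields explicitly
```

Either shape keeps the build pattern attached to the whole spec rather than dangling off the upper bound.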
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@ci/build_conda_python.sh` at line 31, The conda MatchSpec in the
rapids-mamba-retry install line is malformed; update the package spec used in
the command that references PYTHON_VERSION, PYTHON_UPPER_BOUND and
PYTHON_ABI_TAG to use valid conda syntax — either separate the build string from
the version range (e.g., "python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}
*_${PYTHON_ABI_TAG}") or use bracket notation with explicit version and build
fields (e.g.,
"python[version='>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}',build='*_${PYTHON_ABI_TAG}']"),
and replace the current
`"python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}=*_${PYTHON_ABI_TAG}"` token
with one of these valid forms in the rapids-mamba-retry install invocation.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (1)
.github/workflows/conda-python-build.yaml (1)

92-95: ⚠️ Potential issue | 🟠 Major

Upload only the tools/ subtree in this artifact.

Line 95 currently uploads the gha-tools repo root, but ci/build_conda_python.sh:7 and Line 128 both consume the helpers from tools/ at workspace root. With the artifact rooted at gha-tools, the downloaded layout no longer matches those paths, so the build job won't find the rapids-* scripts. Upload gha-tools/tools instead, or update every downstream reference consistently.

🛠️ Proposed fix
      - uses: actions/upload-artifact@v6
        with:
          name: rapids-gha-tools
-         path: gha-tools
+         path: gha-tools/tools
          if-no-files-found: error
🌐 Web query:

GitHub Actions official docs: when `actions/upload-artifact` uploads a directory path like `gha-tools`, does `actions/download-artifact` restore that top-level directory under the target path or flatten its contents? Please include official docs/examples.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/conda-python-build.yaml around lines 92 - 95, The artifact
upload currently publishes the repo root "gha-tools" which breaks consumers
expecting the tools/ subtree; change the upload action in the workflow (the
actions/upload-artifact step that sets name: rapids-gha-tools and path:
gha-tools) to upload the tools directory instead (path: gha-tools/tools) so
downstream consumers such as ci/build_conda_python.sh (referenced at
ci/build_conda_python.sh:7) and the job step that reads line 128 will find the
expected rapids-* scripts, or alternatively update all downstream references to
point into the restored gha-tools/<something> layout consistently.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 151-152: The RAPIDS_CONDA_BLD_OUTPUT_DIR env var is currently set
only inside the build step so ci/upload_conda.sh (invoked when
inputs.upload_to_conda is true) cannot see it during the publish step; fix by
promoting RAPIDS_CONDA_BLD_OUTPUT_DIR to the job-level environment
(jobs.build.env) so both the build and publish steps can access it, or
alternatively add the same RAPIDS_CONDA_BLD_OUTPUT_DIR entry to the publish
step's env to ensure ci/upload_conda.sh:7 can read the output directory during
upload.

---

Duplicate comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 92-95: The artifact upload currently publishes the repo root
"gha-tools" which breaks consumers expecting the tools/ subtree; change the
upload action in the workflow (the actions/upload-artifact step that sets name:
rapids-gha-tools and path: gha-tools) to upload the tools directory instead
(path: gha-tools/tools) so downstream consumers such as ci/build_conda_python.sh
(referenced at ci/build_conda_python.sh:7) and the job step that reads line 128
will find the expected rapids-* scripts, or alternatively update all downstream
references to point into the restored gha-tools/<something> layout consistently.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: ASSERTIVE

Plan: Pro

Run ID: 829daadd-e120-4070-9545-527c6bbcd67f

📥 Commits

Reviewing files that changed from the base of the PR and between 9e8e741 and c207957.

📒 Files selected for processing (1)
  • .github/workflows/conda-python-build.yaml

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
.github/workflows/conda-python-build.yaml (1)

139-141: 🧹 Nitpick | 🔵 Trivial

Consider pinning setup-proxy-cache to a specific ref.

nv-gha-runners/setup-proxy-cache@main uses an unpinned branch ref, inconsistent with the pinning strategy applied to gha-tools. While continue-on-error: true mitigates failure risk, pinning improves reproducibility.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/conda-python-build.yaml around lines 139 - 141, The
workflow step "Setup proxy cache" currently uses an unpinned ref
nv-gha-runners/setup-proxy-cache@main; change the uses value to a specific tag
or commit SHA (for example nv-gha-runners/setup-proxy-cache@v1.0.0 or
@<commit-sha>) to pin the action, keeping the step name "Setup proxy cache" and
the existing continue-on-error: true; update any documentation or dependabot
config if you choose to use a tag versus a SHA and ensure the chosen ref is
tested in CI before merging.
♻️ Duplicate comments (1)
.github/workflows/conda-python-build.yaml (1)

124-132: ⚠️ Potential issue | 🔴 Critical

PATH reference points to wrong location.

The artifact is uploaded from gha-tools/ (line 95), preserving directory structure. When downloaded to ${{ github.workspace }}, the tools directory is at ${GITHUB_WORKSPACE}/gha-tools/tools/, not ${GITHUB_WORKSPACE}/tools/. The build will fail because rapids-* scripts won't be found.

Proposed fix
       - name: Add RAPIDS gha-tools to PATH
-        run: echo "${GITHUB_WORKSPACE}/tools" | tee -a "${GITHUB_PATH}"
+        run: echo "${GITHUB_WORKSPACE}/gha-tools/tools" | tee -a "${GITHUB_PATH}"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/conda-python-build.yaml around lines 124 - 132, The PATH
step points to the wrong tools directory—after downloading the artifact with
actions/download-artifact (name: rapids-gha-tools) into ${{ github.workspace }},
the tools live under gha-tools/tools, not tools; update the "Add RAPIDS
gha-tools to PATH" step to append "${GITHUB_WORKSPACE}/gha-tools/tools" (instead
of "${GITHUB_WORKSPACE}/tools") to ${GITHUB_PATH} so the rapids-* scripts are
discoverable.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 157-162: The workflow uses two different upload-artifact versions;
update the earlier occurrence that reads "uses: actions/upload-artifact@v6" so
it matches the later "uses: actions/upload-artifact@v7" (or vice versa if you
prefer v6) so both upload steps use the same action version; locate both lines
containing "uses: actions/upload-artifact@v6" and "uses:
actions/upload-artifact@v7" and make them identical to ensure consistent
behavior.

---

Outside diff comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 139-141: The workflow step "Setup proxy cache" currently uses an
unpinned ref nv-gha-runners/setup-proxy-cache@main; change the uses value to a
specific tag or commit SHA (for example nv-gha-runners/setup-proxy-cache@v1.0.0
or @<commit-sha>) to pin the action, keeping the step name "Setup proxy cache"
and the existing continue-on-error: true; update any documentation or dependabot
config if you choose to use a tag versus a SHA and ensure the chosen ref is
tested in CI before merging.

---

Duplicate comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 124-132: The PATH step points to the wrong tools directory—after
downloading the artifact with actions/download-artifact (name: rapids-gha-tools)
into ${{ github.workspace }}, the tools live under gha-tools/tools, not tools;
update the "Add RAPIDS gha-tools to PATH" step to append
"${GITHUB_WORKSPACE}/gha-tools/tools" (instead of "${GITHUB_WORKSPACE}/tools")
to ${GITHUB_PATH} so the rapids-* scripts are discoverable.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: ASSERTIVE

Plan: Pro

Run ID: e5b8a9a9-162f-413c-9c21-17a374ee7d71

📥 Commits

Reviewing files that changed from the base of the PR and between c207957 and cea1693.

📒 Files selected for processing (1)
  • .github/workflows/conda-python-build.yaml

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 92-95: The workflow currently uploads the entire gha-tools
checkout artifact (name: rapids-gha-tools, path: gha-tools), which when
re-downloaded overwrites repo files and strips execute permissions for scripts
referenced on PATH; change the upload to only package the tools payload (e.g.,
tools/ subdirectory) so consumers add ${GITHUB_WORKSPACE}/tools to PATH safely
and scripts like rapids-find-anaconda-uploads.py and rapids-retry retain their
executable bits—update the actions/upload-artifact step to point its path to the
tools subdirectory and adjust any downstream download/restore steps accordingly.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: ASSERTIVE

Plan: Pro

Run ID: a67c01a0-8dce-4794-82ec-fac19792721b

📥 Commits

Reviewing files that changed from the base of the PR and between cea1693 and 1935931.

📒 Files selected for processing (1)
  • .github/workflows/conda-python-build.yaml

Comment on lines +92 to +95
- uses: actions/upload-artifact@v7
  with:
    name: rapids-gha-tools
    path: gha-tools

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "=== workflow excerpt ==="
sed -n '86,132p' .github/workflows/conda-python-build.yaml

echo
echo "=== direct RAPIDS tool invocations ==="
sed -n '1,14p' ci/upload_conda.sh

upload_doc=$(mktemp)
download_doc=$(mktemp)
gha_readme=$(mktemp)
trap 'rm -f "$upload_doc" "$download_doc" "$gha_readme"' EXIT

curl -fsSL https://raw.githubusercontent.com/actions/upload-artifact/main/README.md > "$upload_doc"
curl -fsSL https://raw.githubusercontent.com/actions/download-artifact/main/README.md > "$download_doc"
curl -fsSL https://raw.githubusercontent.com/rapidsai/gha-tools/v0.0.198/README.md > "$gha_readme"

echo
echo "=== upload-artifact permission semantics ==="
grep -nE 'Permission Loss|File permissions are not maintained|tar all of your files together' "$upload_doc"

echo
echo "=== download-artifact extraction semantics ==="
grep -nE 'Destination path|single artifact .*directly to the specified path|individual named directories' "$download_doc"

echo
echo "=== upstream gha-tools install guidance ==="
grep -nE 'git clone .*gha-tools|tools.tar.gz|GITHUB_PATH' "$gha_readme"

echo
echo "=== gha-tools repo root at v0.0.198 ==="
python - <<'PY'
import json, urllib.request
url = "https://api.github.com/repos/rapidsai/gha-tools/contents?ref=v0.0.198"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)
for item in data:
    print(item["name"])
PY

Repository: NVIDIA/numbast

Length of output: 6094


Package only the tools/ payload, not the whole gha-tools checkout.

GitHub artifacts do not preserve file permissions: single-artifact downloads extract directly to the requested path with all files set to 644 and directories to 755. With line 130 adding ${GITHUB_WORKSPACE}/tools to PATH, the current setup downloads the entire external repo into the workspace root, where rapidsai/gha-tools has top-level README.md, LICENSE, CONTRIBUTING.md, and other files that can overwrite the checked-out source tree. More critically, tools like rapids-find-anaconda-uploads.py and rapids-retry (invoked directly in ci/upload_conda.sh) lose their executable bit after the artifact round-trip, causing invocation failures.

🔧 Suggested fix
   fetch-gha-tools:
     runs-on: ubuntu-latest
     steps:
       - uses: actions/checkout@v6
         with:
           repository: rapidsai/gha-tools
           ref: v0.0.198
           path: gha-tools
           fetch-depth: 1
+      - name: Archive RAPIDS gha-tools
+        run: tar -C gha-tools -czf rapids-gha-tools-tools.tgz tools
       - uses: actions/upload-artifact@v7
         with:
           name: rapids-gha-tools
-          path: gha-tools
+          path: rapids-gha-tools-tools.tgz
           if-no-files-found: error
...
       - uses: actions/download-artifact@v6
         with:
           name: rapids-gha-tools
-          path: ${{ github.workspace }}
+          path: ${{ github.workspace }}/gha-tools
+      - name: Extract RAPIDS gha-tools
+        run: |
+          mkdir -p "${GITHUB_WORKSPACE}/gha-tools"
+          tar -xzf "${GITHUB_WORKSPACE}/gha-tools/rapids-gha-tools-tools.tgz" -C "${GITHUB_WORKSPACE}/gha-tools"
       - name: Add RAPIDS gha-tools to PATH
-        run: echo "${GITHUB_WORKSPACE}/tools" | tee -a "${GITHUB_PATH}"
+        run: echo "${GITHUB_WORKSPACE}/gha-tools/tools" | tee -a "${GITHUB_PATH}"
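The permission semantics behind this fix are easy to reproduce locally. The sketch below (hypothetical paths and tool name; only `mktemp`, `chmod`, and GNU `tar` are assumed) packs a tools/ payload the way the suggested fix does and confirms that the executable bit survives the tar round-trip, which is exactly what the zip-based artifact extraction would lose:

```shell
#!/bin/bash
# Sketch: a tar round-trip preserves the executable bit, which is why the
# suggested fix archives tools/ before upload instead of relying on the
# artifact store (whose zip extraction normalizes files to mode 644).
set -euo pipefail

work=$(mktemp -d)
trap 'rm -rf "$work"' EXIT

# Fake tools/ payload with one executable script (hypothetical name).
mkdir -p "$work/gha-tools/tools"
printf '#!/bin/sh\necho ok\n' > "$work/gha-tools/tools/rapids-demo"
chmod 755 "$work/gha-tools/tools/rapids-demo"

# Producer side: pack only the tools/ subdirectory, as the fix does.
tar -C "$work/gha-tools" -czf "$work/rapids-gha-tools-tools.tgz" tools

# Consumer side: extract into a fresh directory, as the download job would.
mkdir -p "$work/consumer"
tar -xzf "$work/rapids-gha-tools-tools.tgz" -C "$work/consumer"

# The script is still executable, so it can run directly off PATH.
result=$("$work/consumer/tools/rapids-demo")
echo "tool output: $result"
```

Running the same experiment through an actual upload/download artifact cycle instead of the local tar step is what drops the mode bits and breaks direct invocation of `rapids-retry` and friends.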
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/conda-python-build.yaml around lines 92-95: the workflow
currently uploads the entire gha-tools checkout artifact (name:
rapids-gha-tools, path: gha-tools), which when re-downloaded overwrites repo
files and strips execute permissions for scripts referenced on PATH; change the
upload to only package the tools payload (e.g., tools/ subdirectory) so
consumers add ${GITHUB_WORKSPACE}/tools to PATH safely and scripts like
rapids-find-anaconda-uploads.py and rapids-retry retain their executable
bits—update the actions/upload-artifact step to point its path to the tools
subdirectory and adjust any downstream download/restore steps accordingly.
