Conversation
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review.
📝 Walkthrough

The CI narrows the build matrix to ARCH and PY_VER, adds a fetch-gha-tools job that publishes a rapids-gha-tools artifact, and runs matrix builds in a fixed condaforge/miniforge3 container. The build script bootstraps apt packages, renders .condarc, installs/configures sccache, and installs an ABI-aware Python into conda base.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant GH as "GitHub Actions"
    participant Fetch as "fetch-gha-tools"
    participant Artifact as "Artifact Storage"
    participant Matrix as "build (matrix)"
    participant Container as "miniforge container"
    participant Bootstrap as "bootstrap script"
    GH->>Fetch: run fetch-gha-tools job
    Fetch->>Artifact: checkout [email protected]\nupload rapids-gha-tools artifact
    GH->>Matrix: start matrix jobs (depend on fetch-gha-tools)
    Matrix->>Artifact: download rapids-gha-tools artifact
    Matrix->>Container: run in condaforge/miniforge3:26.1.1-3
    Container->>Bootstrap: make tools executable\napt installs\nrender .condarc\ninstall/configure sccache\ninstall ABI-aware Python
    Bootstrap->>Container: prepared conda base for conda-build
    Container->>GH: run conda-build steps\nupload build artifacts
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes
🚥 Pre-merge checks: ✅ Passed checks (3 passed)
Force-pushed: 4df0d3f to 239fdab (compare)
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 86-89: The checkout step that pulls rapidsai/gha-tools currently
uses an unstable branch ref ("ref: main"); update the actions/checkout
invocation that references repository: rapidsai/gha-tools to pin ref to an
immutable release tag or commit SHA (e.g., a specific tag like v0.0.198 or a
commit SHA) instead of "main" so the workflow reproducibly uses a fixed
gha-tools version.
- Around line 92-95: The workflow step that adds the tools directory to PATH is
pointing to the wrong location: replace the incorrect PATH entry
"${GITHUB_WORKSPACE}/tools" with the actual extracted location
"${GITHUB_WORKSPACE}/gha-tools/tools" in the step that modifies PATH (the step
that uses the PATH env or echo "::add-path::" pattern); ensure the same
correction is applied to the corresponding lines around the artifact extraction
(the step that downloads/extracts gha-tools) so all references consistently use
"${GITHUB_WORKSPACE}/gha-tools/tools".
In `@ci/build_conda_python.sh`:
- Around line 12-13: The curl command fetching the `.condarc` template from
rapidsai/ci-imgs/main is using the mutable main branch; update the script so the
raw URL is pinned to an immutable commit SHA (or replace the remote fetch by
reading a vendored local copy of the template) before running envsubst and tee.
Locate the curl pipeline in ci/build_conda_python.sh (the line using curl -fsSL
"https://raw.githubusercontent.com/rapidsai/ci-imgs/main/context/condarc.tmpl" |
envsubst | tee ~/.condarc) and change it to use a specific commit SHA in the
raw.githubusercontent.com URL or switch to reading a checked-in condarc.tmpl
file and then run envsubst | tee ~/.condarc.
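A minimal sketch of the pinned fetch this comment asks for; the SHA below is a placeholder (not a real rapidsai/ci-imgs commit), and the curl invocation is left commented out because it requires network access.

```shell
# Placeholder SHA -- substitute an actual commit from rapidsai/ci-imgs.
CI_IMGS_SHA="0000000000000000000000000000000000000000"
CONDARC_URL="https://raw.githubusercontent.com/rapidsai/ci-imgs/${CI_IMGS_SHA}/context/condarc.tmpl"

# Pinned fetch (requires network, hence commented out in this sketch):
# curl -fsSL "${CONDARC_URL}" | envsubst | tee ~/.condarc

echo "${CONDARC_URL}"
```

Pinning to a SHA means a later change to the template on `main` cannot silently alter the rendered `.condarc`; vendoring a local copy of the template removes the network dependency entirely.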
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: e5fca8e8-33a4-4d1a-bcc6-164c05198109
📒 Files selected for processing (2)
- .github/workflows/conda-python-build.yaml
- ci/build_conda_python.sh
```yaml
- uses: actions/upload-artifact@v6
  with:
    name: rapids-gha-tools
    path: gha-tools
```
🧩 Analysis chain
🏁 Script executed:
```shell
# First, let's check the actual workflow file
cat -n .github/workflows/conda-python-build.yaml | sed -n '85,135p'
```
Repository: NVIDIA/numbast
Length of output: 2325
🏁 Script executed:
```shell
# Check if pyproject.toml exists in the repo root
ls -la pyproject.toml 2>/dev/null || echo "No pyproject.toml in repo root"
```
Repository: NVIDIA/numbast
Length of output: 120
🏁 Script executed:
```shell
# Look for conda recipes that read pyproject.toml
find . -name "meta.yaml" -type f 2>/dev/null | head -20
```
Repository: NVIDIA/numbast
Length of output: 173
🏁 Script executed:
```shell
# Check the fetch-gha-tools job to understand what's being uploaded
cat -n .github/workflows/conda-python-build.yaml | grep -A 30 "fetch-gha-tools"
```
Repository: NVIDIA/numbast
Length of output: 3237
🌐 Web query:
GitHub Actions upload-artifact v6 download-artifact path extraction semantics
💡 Result:
In GitHub Actions, upload-artifact@v6 (released Dec 2025, Node.js 24 runtime) and download-artifact@v6 create and handle artifacts as immutable ZIP archives containing the specified files/directories with preserved relative path hierarchy from upload. Key semantics for path extraction on download:
- Artifacts are ZIP archives of matched files/paths.
- For upload-artifact@v6:
  - Single path: full relative structure preserved.
  - Wildcards: hierarchy preserved after first wildcard.
  - Multiple paths: least common ancestor becomes ZIP root; excludes don't affect structure.
- For download-artifact@v6:
  - Single artifact (by name or ID): contents extracted directly to the specified path (no extra artifact-named subdir).
  - Multiple artifacts (no name, or pattern matching >1): each extracts to path/<artifact-name>/ unless merge-multiple: true (merges to path/).
  - Default path: $GITHUB_WORKSPACE. Supports ~ expansion.
- In v5 of download-artifact, single-ID downloads nested under the artifact name (fixed in v5+ for consistency).
- Permissions: ZIP strips executables (dirs 755, files 644); use tar to preserve.

Examples:

```yaml
# Upload preserves structure
- uses: actions/upload-artifact@v6
  with:
    name: build
    path: |
      dist/
      !dist/*.tmp

# Download single: extracts dist/ directly to path/
- uses: actions/download-artifact@v6
  with:
    name: build
    path: ./output          # results in ./output/dist/...

# Download all: extracts to path/<artifact-name>/
- uses: actions/download-artifact@v6
  with:
    path: ./output          # ./output/build/dist/...
    merge-multiple: true    # merges all to ./output/dist/...
```

Sources confirm v6 maintains v4+ behaviors with Node 24 updates; no path semantic changes from v4 to v6.
Citations:
- 1: https://github.com/actions/upload-artifact
- 2: https://github.com/marketplace/actions/upload-a-build-artifact
- 3: actions/download-artifact@v5...v6
- 4: https://github.com/actions/download-artifact
- 5: https://www.github.com/actions/upload-artifact
- 6: https://github.com/actions/upload-artifact/blob/main/docs/MIGRATION.md
- 7: https://linkedin.com/in/abdel-karim-labane-387208181
- 8: https://linkedin.com/in/sohaib-khan-0a95692a4
- 9: https://linkedin.com/in/ericsciple
- 10: https://github.com/actions/download-artifact/blob/d3f86a10/src/download-artifact.ts
Fix the PATH reference to point to the correct location of the tools directory.
The artifact download at lines 123-126 extracts the gha-tools repository contents to a subdirectory within the workspace. However, line 128 attempts to add ${GITHUB_WORKSPACE}/tools to PATH, when the tools are actually located at ${GITHUB_WORKSPACE}/gha-tools/tools. This will cause the build script to fail when trying to locate the tools.
Proposed fix

```diff
 - uses: actions/download-artifact@v6
   with:
     name: rapids-gha-tools
     path: ${{ github.workspace }}
 - name: Add RAPIDS gha-tools to PATH
-  run: echo "${GITHUB_WORKSPACE}/tools" | tee -a "${GITHUB_PATH}"
+  run: echo "${GITHUB_WORKSPACE}/gha-tools/tools" | tee -a "${GITHUB_PATH}"
```

Also applies to: 123-128
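A defensive variant of this fix can check the extracted layout before touching PATH. This is only a sketch, not part of the review's diff; the fallback scratch locations and the simulated directory exist solely so the snippet runs outside of GitHub Actions.

```shell
# Guarded PATH update: fail fast if the artifact layout does not match.
# Outside GitHub Actions, fall back to scratch locations (sketch only).
GITHUB_WORKSPACE="${GITHUB_WORKSPACE:-$(mktemp -d)}"
GITHUB_PATH="${GITHUB_PATH:-${GITHUB_WORKSPACE}/github_path.txt}"
mkdir -p "${GITHUB_WORKSPACE}/gha-tools/tools"   # simulate the extracted artifact

TOOLS_DIR="${GITHUB_WORKSPACE}/gha-tools/tools"
if [ -d "${TOOLS_DIR}" ]; then
  echo "${TOOLS_DIR}" | tee -a "${GITHUB_PATH}"
else
  echo "error: expected gha-tools at ${TOOLS_DIR}" >&2
  exit 1
fi
```

Failing fast here surfaces a layout mismatch in the PATH step itself, instead of letting a later `rapids-*` invocation die with a confusing "command not found".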
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.github/workflows/conda-python-build.yaml around lines 92 - 95, The workflow
step that adds the tools directory to PATH is pointing to the wrong location:
replace the incorrect PATH entry "${GITHUB_WORKSPACE}/tools" with the actual
extracted location "${GITHUB_WORKSPACE}/gha-tools/tools" in the step that
modifies PATH (the step that uses the PATH env or echo "::add-path::" pattern);
ensure the same correction is applied to the corresponding lines around the
artifact extraction (the step that downloads/extracts gha-tools) so all
references consistently use "${GITHUB_WORKSPACE}/gha-tools/tools".
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/conda-python-build.yaml:
- Line 108: The workflow currently uses an unpinned base image via the image
field "condaforge/miniforge3:latest"; replace that with a specific, immutable
reference (either a released version tag like "condaforge/miniforge3:23.07-1" or
a SHA256 digest) so the CI run is reproducible—update the image value where
"image: condaforge/miniforge3:latest" appears to the chosen tag or digest and
run the workflow to verify the runner uses the pinned image.
In `@ci/build_conda_python.sh`:
- Around line 26-31: The conditional using string comparison (if [[
"$PYTHON_VERSION_PADDED" > "3.12" ]]) is fragile; change it to perform numeric
comparison by comparing integer major/minor parts: parse PYTHON_MAJOR_VERSION
and PYTHON_MINOR_VERSION (or split PYTHON_VERSION_PADDED) and compare
PYTHON_MAJOR_VERSION first then PYTHON_MINOR_VERSION against 3 and 12
respectively, then set PYTHON_ABI_TAG as before
(PYTHON_ABI_TAG="cp${PYTHON_MAJOR_VERSION}${PYTHON_MINOR_VERSION}" else
"cpython"); ensure the install line using PYTHON_ABI_TAG remains unchanged.
- Line 31: The conda MatchSpec in the rapids-mamba-retry install line is
malformed; update the package spec used in the command that references
PYTHON_VERSION, PYTHON_UPPER_BOUND and PYTHON_ABI_TAG to use valid conda syntax
— either separate the build string from the version range (e.g.,
"python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND} *_${PYTHON_ABI_TAG}") or use
bracket notation with explicit version and build fields (e.g.,
"python[version='>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}',build='*_${PYTHON_ABI_TAG}']"),
and replace the current
`"python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}=*_${PYTHON_ABI_TAG}"` token
with one of these valid forms in the rapids-mamba-retry install invocation.
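The image-pinning comment above can be sketched as the following workflow fragment; the tag shown is the one the comment itself offers as an example and should be verified against Docker Hub before use (a `@sha256` digest pin is stricter still).

```yaml
jobs:
  build:
    container:
      # Example tag from the review comment -- verify it exists before pinning;
      # a @sha256:<digest> reference is even more reproducible than a tag.
      image: condaforge/miniforge3:23.07-1
```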
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 542e83c2-38f3-432d-9cdd-1c3e426cd0d2
📒 Files selected for processing (2)
- .github/workflows/conda-python-build.yaml
- ci/build_conda_python.sh
```bash
if [[ "$PYTHON_VERSION_PADDED" > "3.12" ]]; then
    PYTHON_ABI_TAG="cp${PYTHON_MAJOR_VERSION}${PYTHON_MINOR_VERSION}"
else
    PYTHON_ABI_TAG="cpython"
fi
rapids-mamba-retry install -y -n base "python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}=*_${PYTHON_ABI_TAG}"
```
Bash string comparison for version numbers is fragile.
The > operator in [[ ... ]] performs lexicographic (string) comparison, not numeric comparison. While comparisons like "3.13" > "3.12" happen to work correctly, this pattern will break for edge cases: an unpadded "3.9" would incorrectly sort after "3.12" lexicographically, since "9" > "1".
Given that the matrix includes versions 3.10–3.14, this currently works, but the logic is error-prone if 3.9 or earlier is ever added.
🔧 Proposed fix using numeric comparison

```diff
-PYTHON_MINOR_PADDED=$(printf "%02d" "$PYTHON_MINOR_VERSION")
-PYTHON_VERSION_PADDED="${PYTHON_MAJOR_VERSION}.${PYTHON_MINOR_PADDED}"
-if [[ "$PYTHON_VERSION_PADDED" > "3.12" ]]; then
+if [[ "$PYTHON_MAJOR_VERSION" -gt 3 ]] || { [[ "$PYTHON_MAJOR_VERSION" -eq 3 ]] && [[ "$PYTHON_MINOR_VERSION" -gt 12 ]]; }; then
   PYTHON_ABI_TAG="cp${PYTHON_MAJOR_VERSION}${PYTHON_MINOR_VERSION}"
 else
   PYTHON_ABI_TAG="cpython"
 fi
```

🧰 Tools
🪛 Shellcheck (0.11.0)
[error] 26-26: Decimals are not supported. Either use integers only, or use bc or awk to compare.
(SC2072)
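The numeric comparison above can also be wrapped in a small helper for testing. This is an illustrative sketch (the function name is ours, not from the script):

```shell
# python_abi_tag MAJOR MINOR -> ABI tag, using integer comparison so that
# e.g. minor version 9 is handled correctly where lexicographic comparison
# of "3.9" vs "3.12" would not be.
python_abi_tag() {
  local major="$1" minor="$2"
  if [ "$major" -gt 3 ] || { [ "$major" -eq 3 ] && [ "$minor" -gt 12 ]; }; then
    echo "cp${major}${minor}"
  else
    echo "cpython"
  fi
}

python_abi_tag 3 9    # -> cpython (string comparison would treat "3.9" > "3.12")
python_abi_tag 3 12   # -> cpython
python_abi_tag 3 13   # -> cp313
```

Comparing major first and minor second sidesteps all string-ordering pitfalls and keeps working if a 4.x Python ever appears in the matrix.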
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@ci/build_conda_python.sh` around lines 26 - 31, The conditional using string
comparison (if [[ "$PYTHON_VERSION_PADDED" > "3.12" ]]) is fragile; change it to
perform numeric comparison by comparing integer major/minor parts: parse
PYTHON_MAJOR_VERSION and PYTHON_MINOR_VERSION (or split PYTHON_VERSION_PADDED)
and compare PYTHON_MAJOR_VERSION first then PYTHON_MINOR_VERSION against 3 and
12 respectively, then set PYTHON_ABI_TAG as before
(PYTHON_ABI_TAG="cp${PYTHON_MAJOR_VERSION}${PYTHON_MINOR_VERSION}" else
"cpython"); ensure the install line using PYTHON_ABI_TAG remains unchanged.
```bash
else
    PYTHON_ABI_TAG="cpython"
fi
rapids-mamba-retry install -y -n base "python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}=*_${PYTHON_ABI_TAG}"
```
🧩 Analysis chain
🌐 Web query:
conda install version constraint syntax with build string selector
💡 Result:
The syntax for conda install with a version constraint and build string selector uses the MatchSpec format: `conda install package==version=build_string` for exact matches, or variations like `package=version=build_string` (single `=` before the build in some contexts). For patterns, use wildcards like `package=version=py3*` or bracket notation `package[build=py3*]`. Examples:
- Exact: `conda install numpy==1.23.1=py39hc836922_0`
- Version with build pattern: `conda install numpy=1.23.1=py39*`
- Build selector: `conda install numpy[build=py39*]`
- Exclude build: `conda install numpy[build!=hc292b87_0]`

This works in conda install, conda search, and environment files. Use quotes for complex specs with spaces or operators: `conda install "numpy>=1.23=py39*"`.
Citations:
- 1: https://stackoverflow.com/questions/73226853/conda-search-specify-version-and-build-string
- 2: https://docs.conda.io/projects/conda/en/stable/dev-guide/api/conda/models/match_spec/index.html
- 3: https://stackoverflow.com/questions/57538225/how-to-use-the-conda-matchspec
- 4: https://github.com/conda/conda/blob/main/conda/models/match_spec.py
- 5: https://conda.org/learn/ceps/cep-0029
- 6: https://deepwiki.com/conda/conda/5.2-package-specifications
- 7: https://stackoverflow.com/questions/38411942/install-a-specific-anaconda-package-version
- 8: https://stackoverflow.com/questions/48128029/installing-specific-build-of-an-anaconda-package
🏁 Script executed:
```shell
cd ci && wc -l build_conda_python.sh
```
Repository: NVIDIA/numbast
Length of output: 82
🏁 Script executed:
```shell
cd ci && sed -n '25,35p' build_conda_python.sh
```
Repository: NVIDIA/numbast
Length of output: 453
🏁 Script executed:
```shell
rg "rapids-mamba-retry install" -A 2 -B 2
```
Repository: NVIDIA/numbast
Length of output: 356
🌐 Web query:
conda MatchSpec syntax multiple version constraints build string >= < operators
💡 Result:
Conda MatchSpec syntax supports multiple version constraints using operators like >=, <, <=, >, ==, !=, combined with the logical operators , (AND) and | (OR). Build strings can be specified exactly or with wildcards/patterns. Key syntax elements:
- Positional form: name version build. Examples:
  - `numpy==1.21.0=py39h1234567_0` (exact version and build)
  - `foo 1.0 py27_0` → `foo==1.0=py27_0`
  - `>=1.0 , < 2.0 py34_0` → version '>=1.0,<2.0' with build py34_0
- Bracket notation for explicit constraints:
  - `conda-forge::foo[version='>=1.0']`
  - `numpy[version='>=1.7,<2.0', build=py3*]`
- Version specifiers support:
  - Single: >=1.0, <2.0, ==1.2.3, 1.2.* (fuzzy)
  - Multiple: >=1.0,<2.0 (AND), >1.8,<2|==1.7 (AND then OR)
  - Note: spaces around operators are ignored or normalized; no parentheses for grouping (may cause parsing issues).
- Build string: exact like =py27_0 or patterns like py2*, specified after the version with = or in brackets [build=py2*].

Canonical form: (channel(/subdir):)name(version(build))[key=value,...]

Usage examples:
- `conda install "numpy>=1.20,<1.22 py3*"`
- `conda search conda-forge::python[version='>=3.9',build='h*']`

This is based on conda's official MatchSpec documentation (conda 26.1.1) and source code, which confirm parsing of complex version+build specs like ">=1.0 , < 2.0 py34_0" and bracket syntax for operators.
Citations:
- 1: https://docs.conda.io/projects/conda/en/stable/dev-guide/api/conda/models/match_spec/index.html
- 2: https://deepwiki.com/conda/conda/5.2-package-specifications
- 3: https://github.com/conda/conda/blob/main/conda/models/match_spec.py
- 4: https://stackoverflow.com/questions/57538225/how-to-use-the-conda-matchspec
🏁 Script executed:
```shell
# Search for similar patterns in the codebase
rg "python>=" --type sh -A 1 -B 1
```
Repository: NVIDIA/numbast
Length of output: 245
Fix the conda version constraint syntax to properly specify the build string.
The version constraint "python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}=*_${PYTHON_ABI_TAG}" does not follow standard conda MatchSpec format. The build pattern =*_${PYTHON_ABI_TAG} is ambiguously attached only to the upper bound. Use standard syntax instead:
"python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND} *_${PYTHON_ABI_TAG}"
Or use bracket notation for explicit constraints:
"python[version='${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}', build='*_${PYTHON_ABI_TAG}']"
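Either corrected form can be assembled in the build script. A sketch with illustrative values (in the real script the variables come from the CI matrix):

```shell
# Illustrative values; in ci/build_conda_python.sh these are derived from the matrix.
PYTHON_VERSION="3.13"
PYTHON_UPPER_BOUND="3.14"
PYTHON_ABI_TAG="cp313"

# Build string separated from the version range by a space:
SPEC="python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND} *_${PYTHON_ABI_TAG}"
echo "${SPEC}"

# Equivalent bracket notation:
SPEC_BRACKET="python[version='>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}',build='*_${PYTHON_ABI_TAG}']"
echo "${SPEC_BRACKET}"

# Usage (requires conda/mamba, so not executed in this sketch):
# rapids-mamba-retry install -y -n base "${SPEC}"
```

In both forms the build pattern unambiguously applies to the package as a whole rather than being glued onto the upper version bound.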
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@ci/build_conda_python.sh` at line 31, The conda MatchSpec in the
rapids-mamba-retry install line is malformed; update the package spec used in
the command that references PYTHON_VERSION, PYTHON_UPPER_BOUND and
PYTHON_ABI_TAG to use valid conda syntax — either separate the build string from
the version range (e.g., "python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}
*_${PYTHON_ABI_TAG}") or use bracket notation with explicit version and build
fields (e.g.,
"python[version='>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}',build='*_${PYTHON_ABI_TAG}']"),
and replace the current
`"python>=${PYTHON_VERSION},<${PYTHON_UPPER_BOUND}=*_${PYTHON_ABI_TAG}"` token
with one of these valid forms in the rapids-mamba-retry install invocation.
Actionable comments posted: 1
♻️ Duplicate comments (1)
.github/workflows/conda-python-build.yaml (1)
92-95: ⚠️ Potential issue | 🟠 Major

Artifact only the `tools/` subtree here. Line 95 currently uploads the `gha-tools` repo root, but `ci/build_conda_python.sh:7` and line 128 both consume the helpers from `tools/` at workspace root. With the artifact rooted at `gha-tools`, the downloaded layout no longer matches those paths, so the build job won't find the `rapids-*` scripts. Upload `gha-tools/tools` instead, or update every downstream reference consistently.

🛠️ Proposed fix

```diff
 - uses: actions/upload-artifact@v6
   with:
     name: rapids-gha-tools
-    path: gha-tools
+    path: gha-tools/tools
     if-no-files-found: error
```

🌐 Web query: GitHub Actions official docs: when `actions/upload-artifact` uploads a directory path like `gha-tools`, does `actions/download-artifact` restore that top-level directory under the target path or flatten its contents? Please include official docs/examples.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/conda-python-build.yaml around lines 92 - 95, The artifact upload currently publishes the repo root "gha-tools" which breaks consumers expecting the tools/ subtree; change the upload action in the workflow (the actions/upload-artifact step that sets name: rapids-gha-tools and path: gha-tools) to upload the tools directory instead (path: gha-tools/tools) so downstream consumers such as ci/build_conda_python.sh (referenced at ci/build_conda_python.sh:7) and the job step that reads line 128 will find the expected rapids-* scripts, or alternatively update all downstream references to point into the restored gha-tools/<something> layout consistently.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 151-152: The RAPIDS_CONDA_BLD_OUTPUT_DIR env var is currently set
only inside the build step so ci/upload_conda.sh (invoked when
inputs.upload_to_conda is true) cannot see it during the publish step; fix by
promoting RAPIDS_CONDA_BLD_OUTPUT_DIR to the job-level environment
(jobs.build.env) so both the build and publish steps can access it, or
alternatively add the same RAPIDS_CONDA_BLD_OUTPUT_DIR entry to the publish
step's env to ensure ci/upload_conda.sh:7 can read the output directory during
upload.
---
Duplicate comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 92-95: The artifact upload currently publishes the repo root
"gha-tools" which breaks consumers expecting the tools/ subtree; change the
upload action in the workflow (the actions/upload-artifact step that sets name:
rapids-gha-tools and path: gha-tools) to upload the tools directory instead
(path: gha-tools/tools) so downstream consumers such as ci/build_conda_python.sh
(referenced at ci/build_conda_python.sh:7) and the job step that reads line 128
will find the expected rapids-* scripts, or alternatively update all downstream
references to point into the restored gha-tools/<something> layout consistently.
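The job-level promotion of RAPIDS_CONDA_BLD_OUTPUT_DIR described above could look like the fragment below; the directory value shown is an assumption for illustration, not taken from the actual workflow.

```yaml
jobs:
  build:
    env:
      # Job-level, so both the build and publish steps can read it.
      # (Directory value is an example, not from the real workflow.)
      RAPIDS_CONDA_BLD_OUTPUT_DIR: ${{ github.workspace }}/conda-bld-output
```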
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 829daadd-e120-4070-9545-527c6bbcd67f
📒 Files selected for processing (1)
.github/workflows/conda-python-build.yaml
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
.github/workflows/conda-python-build.yaml (1)
139-141: 🧹 Nitpick | 🔵 Trivial

Consider pinning `setup-proxy-cache` to a specific ref. `nv-gha-runners/setup-proxy-cache@main` uses an unpinned branch ref, inconsistent with the pinning strategy applied to `gha-tools`. While `continue-on-error: true` mitigates failure risk, pinning improves reproducibility.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/conda-python-build.yaml around lines 139 - 141, The workflow step "Setup proxy cache" currently uses an unpinned ref nv-gha-runners/setup-proxy-cache@main; change the uses value to a specific tag or commit SHA (for example nv-gha-runners/[email protected] or @<commit-sha>) to pin the action, keeping the step name "Setup proxy cache" and the existing continue-on-error: true; update any documentation or dependabot config if you choose to use a tag versus a SHA and ensure the chosen ref is tested in CI before merging.
♻️ Duplicate comments (1)
.github/workflows/conda-python-build.yaml (1)
124-132: ⚠️ Potential issue | 🔴 Critical

PATH reference points to wrong location. The artifact is uploaded from `gha-tools/` (line 95), preserving directory structure. When downloaded to `${{ github.workspace }}`, the tools directory is at `${GITHUB_WORKSPACE}/gha-tools/tools/`, not `${GITHUB_WORKSPACE}/tools/`. The build will fail because `rapids-*` scripts won't be found.

Proposed fix

```diff
 - name: Add RAPIDS gha-tools to PATH
-  run: echo "${GITHUB_WORKSPACE}/tools" | tee -a "${GITHUB_PATH}"
+  run: echo "${GITHUB_WORKSPACE}/gha-tools/tools" | tee -a "${GITHUB_PATH}"
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/conda-python-build.yaml around lines 124 - 132, The PATH step points to the wrong tools directory—after downloading the artifact with actions/download-artifact (name: rapids-gha-tools) into ${{ github.workspace }}, the tools live under gha-tools/tools, not tools; update the "Add RAPIDS gha-tools to PATH" step to append "${GITHUB_WORKSPACE}/gha-tools/tools" (instead of "${GITHUB_WORKSPACE}/tools") to ${GITHUB_PATH} so the rapids-* scripts are discoverable.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 157-162: The workflow uses two different upload-artifact versions;
update the earlier occurrence that reads "uses: actions/upload-artifact@v6" so
it matches the later "uses: actions/upload-artifact@v7" (or vice versa if you
prefer v6) so both upload steps use the same action version; locate both lines
containing "uses: actions/upload-artifact@v6" and "uses:
actions/upload-artifact@v7" and make them identical to ensure consistent
behavior.
---
Outside diff comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 139-141: The workflow step "Setup proxy cache" currently uses an
unpinned ref nv-gha-runners/setup-proxy-cache@main; change the uses value to a
specific tag or commit SHA (for example nv-gha-runners/[email protected]
or @<commit-sha>) to pin the action, keeping the step name "Setup proxy cache"
and the existing continue-on-error: true; update any documentation or dependabot
config if you choose to use a tag versus a SHA and ensure the chosen ref is
tested in CI before merging.
---
Duplicate comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 124-132: The PATH step points to the wrong tools directory—after
downloading the artifact with actions/download-artifact (name: rapids-gha-tools)
into ${{ github.workspace }}, the tools live under gha-tools/tools, not tools;
update the "Add RAPIDS gha-tools to PATH" step to append
"${GITHUB_WORKSPACE}/gha-tools/tools" (instead of "${GITHUB_WORKSPACE}/tools")
to ${GITHUB_PATH} so the rapids-* scripts are discoverable.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: e5b8a9a9-162f-413c-9c21-17a374ee7d71
📒 Files selected for processing (1)
.github/workflows/conda-python-build.yaml
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/conda-python-build.yaml:
- Around line 92-95: The workflow currently uploads the entire gha-tools
checkout artifact (name: rapids-gha-tools, path: gha-tools), which when
re-downloaded overwrites repo files and strips execute permissions for scripts
referenced on PATH; change the upload to only package the tools payload (e.g.,
tools/ subdirectory) so consumers add ${GITHUB_WORKSPACE}/tools to PATH safely
and scripts like rapids-find-anaconda-uploads.py and rapids-retry retain their
executable bits—update the actions/upload-artifact step to point its path to the
tools subdirectory and adjust any downstream download/restore steps accordingly.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: a67c01a0-8dce-4794-82ec-fac19792721b
📒 Files selected for processing (1)
.github/workflows/conda-python-build.yaml
```yaml
- uses: actions/upload-artifact@v7
  with:
    name: rapids-gha-tools
    path: gha-tools
```
🧩 Analysis chain
🏁 Script executed:
```shell
#!/bin/bash
set -euo pipefail
echo "=== workflow excerpt ==="
sed -n '86,132p' .github/workflows/conda-python-build.yaml
echo
echo "=== direct RAPIDS tool invocations ==="
sed -n '1,14p' ci/upload_conda.sh
upload_doc=$(mktemp)
download_doc=$(mktemp)
gha_readme=$(mktemp)
trap 'rm -f "$upload_doc" "$download_doc" "$gha_readme"' EXIT
curl -fsSL https://raw.githubusercontent.com/actions/upload-artifact/main/README.md > "$upload_doc"
curl -fsSL https://raw.githubusercontent.com/actions/download-artifact/main/README.md > "$download_doc"
curl -fsSL https://raw.githubusercontent.com/rapidsai/gha-tools/v0.0.198/README.md > "$gha_readme"
echo
echo "=== upload-artifact permission semantics ==="
grep -nE 'Permission Loss|File permissions are not maintained|tar all of your files together' "$upload_doc"
echo
echo "=== download-artifact extraction semantics ==="
grep -nE 'Destination path|single artifact .*directly to the specified path|individual named directories' "$download_doc"
echo
echo "=== upstream gha-tools install guidance ==="
grep -nE 'git clone .*gha-tools|tools.tar.gz|GITHUB_PATH' "$gha_readme"
echo
echo "=== gha-tools repo root at v0.0.198 ==="
python - <<'PY'
import json, urllib.request
url = "https://api.github.com/repos/rapidsai/gha-tools/contents?ref=v0.0.198"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)
for item in data:
    print(item["name"])
PY
```
Repository: NVIDIA/numbast
Length of output: 6094
Package only the tools/ payload, not the whole gha-tools checkout.
GitHub artifacts do not preserve file permissions: single-artifact downloads extract directly to the requested path with all files set to 644 and directories to 755. With line 130 adding ${GITHUB_WORKSPACE}/tools to PATH, the current setup downloads the entire external repo into the workspace root, where rapidsai/gha-tools has top-level README.md, LICENSE, CONTRIBUTING.md, and other files that can overwrite the checked-out source tree. More critically, tools like rapids-find-anaconda-uploads.py and rapids-retry (invoked directly in ci/upload_conda.sh) lose their executable bit after the artifact round-trip, causing invocation failures.
🔧 Suggested fix

```diff
 fetch-gha-tools:
   runs-on: ubuntu-latest
   steps:
     - uses: actions/checkout@v6
       with:
         repository: rapidsai/gha-tools
         ref: v0.0.198
         path: gha-tools
         fetch-depth: 1
+    - name: Archive RAPIDS gha-tools
+      run: tar -C gha-tools -czf rapids-gha-tools-tools.tgz tools
     - uses: actions/upload-artifact@v7
       with:
         name: rapids-gha-tools
-        path: gha-tools
+        path: rapids-gha-tools-tools.tgz
         if-no-files-found: error
 ...
     - uses: actions/download-artifact@v6
       with:
         name: rapids-gha-tools
-        path: ${{ github.workspace }}
+        path: ${{ github.workspace }}/gha-tools
+    - name: Extract RAPIDS gha-tools
+      run: |
+        mkdir -p "${GITHUB_WORKSPACE}/gha-tools"
+        tar -xzf "${GITHUB_WORKSPACE}/gha-tools/rapids-gha-tools-tools.tgz" -C "${GITHUB_WORKSPACE}/gha-tools"
     - name: Add RAPIDS gha-tools to PATH
-      run: echo "${GITHUB_WORKSPACE}/tools" | tee -a "${GITHUB_PATH}"
+      run: echo "${GITHUB_WORKSPACE}/gha-tools/tools" | tee -a "${GITHUB_PATH}"
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.github/workflows/conda-python-build.yaml around lines 92 - 95, The workflow
currently uploads the entire gha-tools checkout artifact (name:
rapids-gha-tools, path: gha-tools), which when re-downloaded overwrites repo
files and strips execute permissions for scripts referenced on PATH; change the
upload to only package the tools payload (e.g., tools/ subdirectory) so
consumers add ${GITHUB_WORKSPACE}/tools to PATH safely and scripts like
rapids-find-anaconda-uploads.py and rapids-retry retain their executable
bits—update the actions/upload-artifact step to point its path to the tools
subdirectory and adjust any downstream download/restore steps accordingly.
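The permission-stripping issue discussed above is easy to reproduce locally: a tar round-trip keeps the executable bit that the artifact ZIP format would drop. A self-contained sketch (file and directory names here are scratch examples, not the workflow's real paths):

```shell
# Demonstrate that tar preserves the executable bit across archive/extract.
workdir="$(mktemp -d)"
mkdir -p "${workdir}/tools"
printf '#!/bin/sh\necho ok\n' > "${workdir}/tools/rapids-demo"
chmod +x "${workdir}/tools/rapids-demo"

# Archive and extract, as the proposed workflow steps would do.
tar -C "${workdir}" -czf "${workdir}/tools.tgz" tools
mkdir "${workdir}/extracted"
tar -xzf "${workdir}/tools.tgz" -C "${workdir}/extracted"

if [ -x "${workdir}/extracted/tools/rapids-demo" ]; then
  echo "exec bit preserved"
fi
rm -rf "${workdir}"
```

A ZIP-based round-trip (as the artifact actions use internally) would reset the file to mode 644, which is exactly why the suggested fix tars the tools before uploading.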
This change switches from the rapidsai container image to the smaller miniforge image to remove a potential dependency on tools not needed for the build.