38 changes: 38 additions & 0 deletions images/runtime/ray/cpu/2.52.1-py311-cpu/Dockerfile
@@ -0,0 +1,38 @@
ARG PYTHON_VERSION=311
ARG IMAGE_TAG=9.7-1764607342

FROM registry.access.redhat.com/ubi9/python-${PYTHON_VERSION}:${IMAGE_TAG}

ARG CONSTRAINTS_FILE=requirements_compiled.txt

LABEL name="ray-ubi9-py311-cpu" \
summary="Python 3.11 image based on UBI9 for Ray (CPU only)" \
description="CPU-only Python 3.11 image based on UBI9 for Ray" \
io.k8s.display-name="Python 3.11 base image for Ray (CPU)" \
io.k8s.description="CPU-only Python 3.11 image based on UBI9 for Ray" \
authoritative-source-url="https://github.com/opendatahub-io/distributed-workloads"

USER 0
WORKDIR /opt/app-root/bin

# Update base image and install basic build tools
RUN yum upgrade -y && \
yum install -y \
openblas \
openblas-devel \
make \
findutils \
&& yum clean all && \
rm -rf /var/cache/yum/*

# Copy constraints file
COPY ${CONSTRAINTS_FILE} .

RUN pip install --no-cache-dir \
-c ${CONSTRAINTS_FILE} \
ray[default,data,train,tune,serve]==2.52.1 \
--extra-index-url https://wheels.developerfirst.ibm.com/ppc64le/linux/+simple/

# Restore user workspace
USER 1001
WORKDIR /opt/app-root/src
10 changes: 10 additions & 0 deletions images/runtime/ray/cpu/2.52.1-py311-cpu/README.md
@@ -0,0 +1,10 @@
# Ray Runtime Container Image

CPU enabled container image for Ray in OpenShift AI.

⚠️ Potential issue | 🟡 Minor

Fix hyphenation in user-facing text on Line 3.

Use CPU-enabled instead of CPU enabled for correct compound adjective formatting.

🧰 Tools
🪛 LanguageTool

[grammar] ~3-~3: Use a hyphen to join words.
Context: # Ray Runtime Container Image CPU enabled container image for Ray in OpenS...

(QB_NEW_EN_HYPHEN)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@images/runtime/ray/cpu/2.52.1-py311-cpu/README.md` at line 3, Update the
user-facing description text in README.md where it currently reads "CPU enabled
container image for Ray in OpenShift AI." — change "CPU enabled" to the
hyphenated compound "CPU-enabled" so the sentence reads "CPU-enabled container
image for Ray in OpenShift AI." Ensure the edited line in the README.md is the
only change.


It includes the following layers:

* UBI 9
* Python 3.11
* CPU (no GPU acceleration)
* Ray 2.52.1
222 changes: 222 additions & 0 deletions images/runtime/ray/cpu/2.52.1-py311-cpu/requirements_compiled.txt
@@ -0,0 +1,222 @@
aiohappyeyeballs==2.6.1
# via aiohttp
aiohttp==3.13.3
# via
# aiohttp-cors
# ray
aiohttp-cors==0.8.1
# via ray
aiosignal==1.4.0
# via aiohttp
annotated-doc==0.0.4
# via fastapi
annotated-types==0.7.0
# via pydantic
anyio==4.12.1
# via
# starlette
# watchfiles
attrs==25.4.0
# via
# aiohttp
# jsonschema
# referencing
certifi==2026.2.25
# via requests
cffi==2.0.0
# via cryptography
charset-normalizer==3.4.4
# via requests
click==8.2.1
# via
# ray
# uvicorn
colorful==0.5.8
# via ray
cryptography==46.0.5
# via google-auth
distlib==0.4.0
# via virtualenv
fastapi==0.133.0
# via ray
filelock==3.24.3
# via
# ray
# virtualenv
frozenlist==1.8.0
# via
# aiohttp
# aiosignal
fsspec==2026.2.0
# via ray
google-api-core==2.30.0
# via opencensus
google-auth==2.48.0
# via google-api-core
googleapis-common-protos==1.72.0
# via google-api-core
grpcio==1.75.0
# via ray
h11==0.16.0
# via uvicorn
httptools==0.7.1
# via uvicorn
idna==3.11
# via
# anyio
# requests
# yarl
importlib-metadata==8.7.1
# via opentelemetry-api
jsonschema==4.26.0
# via ray
jsonschema-specifications==2025.9.1
# via jsonschema
msgpack==1.1.2
# via ray
multidict==6.7.1
# via
# aiohttp
# yarl
numpy==2.4.2
# via
# pandas
# ray
# tensorboardx
opencensus==0.11.4
# via ray
opencensus-context==0.1.3
# via opencensus
opentelemetry-api==1.39.1
# via
# opentelemetry-exporter-prometheus
# opentelemetry-sdk
# opentelemetry-semantic-conventions
opentelemetry-exporter-prometheus==0.60b1
# via ray
opentelemetry-proto==1.39.1
# via ray
opentelemetry-sdk==1.39.1
# via
# opentelemetry-exporter-prometheus
# ray
opentelemetry-semantic-conventions==0.60b1
# via opentelemetry-sdk
packaging==26.0
# via
# ray
# tensorboardx
pandas==3.0.1
# via ray
platformdirs==4.9.2
# via virtualenv
prometheus-client==0.24.1
# via
# opentelemetry-exporter-prometheus
# ray
propcache==0.4.1
# via
# aiohttp
# yarl
proto-plus==1.27.1
# via google-api-core
protobuf==6.33.5
# via
# google-api-core
# googleapis-common-protos
# opentelemetry-proto
# proto-plus
# ray
# tensorboardx
py-spy==0.4.1
# via ray
pyarrow==20.0.0
# via ray
pyasn1==0.6.2
# via
# pyasn1-modules
# rsa
pyasn1-modules==0.4.2
# via google-auth
pycparser==3.0
# via cffi
pydantic==2.12.5
# via
# fastapi
# ray
pydantic-core==2.41.5
# via pydantic
python-dateutil==2.9.0.post0
# via pandas
python-dotenv==1.2.1
# via uvicorn
pyyaml==6.0.3
# via
# ray
# uvicorn
ray==2.52.1
# via -r req.txt
referencing==0.37.0
# via
# jsonschema
# jsonschema-specifications
requests==2.32.5
# via
# google-api-core
# ray
rpds-py==0.30.0
# via
# jsonschema
# referencing
rsa==4.9.1
# via google-auth
six==1.17.0
# via
# opencensus
# python-dateutil
smart-open==7.5.1
# via ray
starlette==0.52.1
# via
# fastapi
# ray
tensorboardx==2.6.4
# via ray
typing-extensions==4.15.0
# via
# aiosignal
# anyio
# fastapi
# grpcio
# opentelemetry-api
# opentelemetry-sdk
# opentelemetry-semantic-conventions
# pydantic
# pydantic-core
# referencing
# starlette
# typing-inspection
typing-inspection==0.4.2
# via
# fastapi
# pydantic
urllib3==2.6.3
# via requests
uvicorn==0.41.0
# via ray
uvloop==0.22.1
# via uvicorn
virtualenv==20.39.0
# via ray
watchfiles==1.1.1
# via
# ray
# uvicorn
websockets==16.0
# via uvicorn
wrapt==2.1.1
# via smart-open
yarl==1.22.0
# via aiohttp
zipp==3.23.0
# via importlib-metadata
Comment on lines +1 to +222

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

# First, check the file itself and understand its format
head -20 images/runtime/ray/cpu/2.52.1-py311-cpu/requirements_compiled.txt
wc -l images/runtime/ray/cpu/2.52.1-py311-cpu/requirements_compiled.txt

# Check if there are any .in files (pip-compile source)
find images -name "*.in" -type f | head -20

# Check if any other requirements files in the repo use hashes
find images -name "requirements*.txt" -type f | head -20

Repository: opendatahub-io/distributed-workloads

Length of output: 970


🏁 Script executed:

# Check the git history or comments to understand how this file is generated
head -50 images/runtime/ray/cpu/2.52.1-py311-cpu/requirements_compiled.txt | cat -n

# Look for pip-compile directives or generation comments
grep -i "pip-compile\|generated\|hash\|--generate-hashes" images/runtime/ray/cpu/2.52.1-py311-cpu/requirements_compiled.txt || echo "No pip-compile metadata found"

Repository: opendatahub-io/distributed-workloads

Length of output: 1296


🏁 Script executed:

# Check if there are any Dockerfile references that show how this file is used
find images -name "Dockerfile*" -type f | xargs grep -l "requirements" 2>/dev/null | head -5

Repository: opendatahub-io/distributed-workloads

Length of output: 397


🏁 Script executed:

# Look for any documentation about supply-chain security or hash verification
find . -name "*.md" -o -name "*.rst" | xargs grep -l "hash\|pip.*require" 2>/dev/null | head -10

Repository: opendatahub-io/distributed-workloads

Length of output: 117


🏁 Script executed:

cat -n images/runtime/ray/cpu/2.52.1-py311-cpu/Dockerfile | head -100

Repository: opendatahub-io/distributed-workloads

Length of output: 1502


🏁 Script executed:

# Search for pip install commands in the Dockerfile
grep -A 5 -B 5 "pip install.*requirements" images/runtime/ray/cpu/2.52.1-py311-cpu/Dockerfile

Repository: opendatahub-io/distributed-workloads

Length of output: 62


🏁 Script executed:

# Check if there's a .in file (source for pip-compile)
find images/runtime/ray/cpu/2.52.1-py311-cpu -name "*.in" -o -name "*.txt" | sort

Repository: opendatahub-io/distributed-workloads

Length of output: 145


🏁 Script executed:

# Check if pip-compile was used with --generate-hashes flag by looking for any build/generate scripts
find images -name "*.sh" -o -name "Makefile" -o -name "pyproject.toml" | xargs grep -l "pip-compile\|generate.*hash" 2>/dev/null || echo "No build scripts found"

# Check if there's a requirements.in file elsewhere or documentation
find . -name "*.in" -type f | head -20

Repository: opendatahub-io/distributed-workloads

Length of output: 220


🏁 Script executed:

# Verify pip's hash support with constraints vs requirements
# Let's check if there are any other constraint files in the repo
find . -name "*constraint*" -type f | head -10

Repository: opendatahub-io/distributed-workloads

Length of output: 62


🌐 Web query:

pip install --require-hashes constraint files CWE-494 artifact integrity verification

💡 Result:

The command pip install --require-hashes -r requirements.txt (or with constraint files via -c constraints.txt) enables pip's hash-checking mode to mitigate CWE-494 (Download of Code Without Integrity Check). This prevents installation of tampered packages by verifying cryptographic hashes (preferably SHA256) of all downloaded artifacts against pre-computed values listed in the file. Key requirements for success: - All packages and dependencies must be explicitly listed and pinned with == (exact versions). - Each must include at least one --hash=sha256:... (multiple hashes for different wheels/sdists). - Automatically activates if any hash is present; --require-hashes forces it. Example requirements.txt: FooProject==1.2 \ --hash=sha256:2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824 \ --hash=sha256:486ea46224d1bb4fb680f34f7c9ad96a8f24ec88be73ea8e5a6c65260e9cb8a7 Generate hashes: 1. Run pip install -r requirements.in --require-hashes (fails but outputs missing hashes). 2. Add them manually or use pip-tools (pip-compile --generate-hashes). With constraints: Hashes in constraints.txt are not used by pip for verification; generate fully hashed requirements.txt instead. Combine with --no-deps for project installs to avoid unhashed downloads. This protects against supply chain attacks, PyPI compromise, or transit tampering beyond TLS. Common pitfalls: - Unpinned deps or extras trigger "all requirements must have their versions pinned with ==". - Resolver issues (fixed in modern pip); use --no-deps if needed. - Network errors mimic hash mismatch; retry with --no-cache-dir. Official pip docs (v26.0.1, current as of 2026) confirm this workflow for secure, repeatable installs.


Lockfile missing artifact hashes (CWE-494, High severity).

Exploit scenario: a malicious or compromised PyPI index can return a substituted artifact for a pinned version; install-time code runs during image build.

Remediation: regenerate with pip-compile --generate-hashes to add SHA256 hashes to each package. Then in the Dockerfile, change from -c ${CONSTRAINTS_FILE} to -r ${CONSTRAINTS_FILE} and add --require-hashes to the pip install command. Constraint files (-c) do not enforce hash verification; only requirements files (-r) support the --require-hashes flag.
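The remediation above could be sketched as the following Dockerfile fragment. This is a hypothetical hardened install step, not the PR's actual code: it assumes requirements_compiled.txt has been regenerated with `pip-compile --generate-hashes`, and the `--hash` digest shown in the comment is a placeholder, not a real value.

```dockerfile
# Assumes requirements_compiled.txt was regenerated with
# `pip-compile --generate-hashes`, so every pinned package carries
# one or more --hash lines, e.g.:
#   ray==2.52.1 \
#       --hash=sha256:<digest-for-each-published-artifact>
COPY requirements_compiled.txt .

# -r (not -c) plus --require-hashes makes pip refuse any downloaded
# artifact whose SHA-256 does not match a pinned digest.
RUN pip install --no-cache-dir --require-hashes \
    -r requirements_compiled.txt \
    --extra-index-url https://wheels.developerfirst.ibm.com/ppc64le/linux/+simple/
```

Note that in hash-checking mode every transitive dependency must also be pinned and hashed in the same file, which pip-compile handles when it resolves the full tree.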

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@images/runtime/ray/cpu/2.52.1-py311-cpu/requirements_compiled.txt` around
lines 1 - 222, The lockfile requirements_compiled.txt lacks artifact hashes;
regenerate it with pip-compile --generate-hashes to produce SHA256 entries for
every package, replace the existing requirements_compiled.txt with the hashed
output, and then update the image build step that currently uses a constraints
install flag (-c ${CONSTRAINTS_FILE}) to use a requirements install (-r
${CONSTRAINTS_FILE}) and add the --require-hashes option to the pip install
command in the Dockerfile/build script so pip enforces the hashes at install
time.

38 changes: 38 additions & 0 deletions images/runtime/ray/cpu/2.52.1-py312-cpu/Dockerfile
@@ -0,0 +1,38 @@
ARG PYTHON_VERSION=312
ARG IMAGE_TAG=9.7-1764607356

FROM registry.access.redhat.com/ubi9/python-${PYTHON_VERSION}:${IMAGE_TAG}

ARG CONSTRAINTS_FILE=requirements_compiled.txt

LABEL name="ray-ubi9-py312-cpu" \
summary="Python 3.12 image based on UBI9 for Ray (CPU only)" \
description="CPU-only Python 3.12 image based on UBI9 for Ray" \
io.k8s.display-name="Python 3.12 base image for Ray (CPU)" \
io.k8s.description="CPU-only Python 3.12 image based on UBI9 for Ray" \
authoritative-source-url="https://github.com/opendatahub-io/distributed-workloads"

USER 0
WORKDIR /opt/app-root/bin

# Update base image and install basic build tools
RUN yum upgrade -y && \
yum install -y \
openblas \
openblas-devel \
make \
findutils \
&& yum clean all && \
rm -rf /var/cache/yum/*

# Copy constraints file
COPY ${CONSTRAINTS_FILE} .

RUN pip install --no-cache-dir \
-c ${CONSTRAINTS_FILE} \
ray[default,data,train,tune,serve]==2.52.1 \
--extra-index-url https://wheels.developerfirst.ibm.com/ppc64le/linux/+simple/
Comment on lines +31 to +34

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

find . -type f -name "Dockerfile" | grep -E "ray.*cpu.*2\.52\.1" | head -20

Repository: opendatahub-io/distributed-workloads

Length of output: 185


🏁 Script executed:

cd images/runtime/ray/cpu/2.52.1-py312-cpu 2>/dev/null && pwd && ls -la

Repository: opendatahub-io/distributed-workloads

Length of output: 444


🏁 Script executed:

cat -n images/runtime/ray/cpu/2.52.1-py312-cpu/Dockerfile | head -50

Repository: opendatahub-io/distributed-workloads

Length of output: 1502


🏁 Script executed:

cat -n images/runtime/ray/cpu/2.52.1-py312-cpu/requirements_compiled.txt | head -30

Repository: opendatahub-io/distributed-workloads

Length of output: 811


🏁 Script executed:

wc -l images/runtime/ray/cpu/2.52.1-py312-cpu/requirements_compiled.txt

Repository: opendatahub-io/distributed-workloads

Length of output: 149


🏁 Script executed:

grep -E "^[a-zA-Z0-9].*==.*#" images/runtime/ray/cpu/2.52.1-py312-cpu/requirements_compiled.txt | head -5

Repository: opendatahub-io/distributed-workloads

Length of output: 62


🏁 Script executed:

grep -E "sha256|hash" images/runtime/ray/cpu/2.52.1-py312-cpu/requirements_compiled.txt | head -5

Repository: opendatahub-io/distributed-workloads

Length of output: 62


🏁 Script executed:

cat images/runtime/ray/cpu/2.52.1-py312-cpu/requirements_compiled.txt | head -40

Repository: opendatahub-io/distributed-workloads

Length of output: 765


🌐 Web query:

pip install -c constraints flag hash verification behavior

💡 Result:

When using pip install -c constraints.txt (or -c constraints_file), if the constraints file contains entries with --hash options (e.g., package==1.0 --hash=sha256:...), pip activates hash-checking mode automatically, just as it does for requirements files with hashes. This is because hash-checking is triggered by the presence of any --hash in the input files processed by pip. In this mode: - pip verifies downloaded packages against the specified hashes in the constraints. - All top-level requirements (from command line, -r files, etc.) must be pinned with == (exact version). - All dependencies resolved must also have matching entries with hashes in the constraints file (or requirements), as hash-checking is all-or-nothing: every installed package requires a hash. - Constraints files can provide version pins and hashes for dependencies without installing them directly. Example constraints.txt with hashes: numpy==1.21.0 \ --hash=sha256:... pandas==1.3.0 \ --hash=sha256:... \ --hash=sha256:... # multiple hashes for different wheels Command: pip install some-package -c constraints.txt - some-package must be == pinned if --require-hashes explicit or mode active. - Dependencies like numpy/pandas will use the versions/hashes from constraints.txt. - If a dependency lacks a hash entry, pip errors: "Hashes are required in --require-hashes mode, but they are missing..." This behavior is documented in pip's secure installs guide (hash-checking mode applies to requirements files, but extends to constraints as they are parsed similarly). Since pip 22.2 (PR #10962 merged), the new resolver properly filters candidates using hashes from constraints files. Earlier versions (pre-22.x new resolver) had bugs where hashes in constraints were ignored or mishandled, leading to errors like unresolved pins. Note: Constraints syntax supports --hash per official requirements file format (pip docs confirm --hash as per-requirement option). 
However, combining unpinned top-level reqs with hashed constraints often fails due to the pinning requirement in hash mode. Best practice: generate full pinned+hashed requirements.txt via pip-compile --generate-hashes for secure repeatable installs, use constraints for shared dep pinning across projects.


🌐 Web query:

CWE-610 CWE-494 package integrity supply chain attacks

💡 Result:

CWE-610 and CWE-494 are weaknesses that enable package integrity issues in software supply chain attacks. CWE-610: Externally Controlled Reference to a Resource in Another Sphere is a class weakness where a product uses an externally controlled name or reference that resolves to a resource outside the intended control sphere. This can allow adversaries to access unintended resources, such as directing a package manager to fetch from a malicious repository via manipulated URLs or references. Consequences include reading/modifying data and privilege escalation. It is discouraged for direct vulnerability mapping, as it is abstract; more specific children like CWE-829 (Inclusion of Functionality from Untrusted Control Sphere) apply to supply chain scenarios. CWE-494: Download of Code Without Integrity Check is a base weakness where a product downloads and executes code from a remote location without verifying origin and integrity (e.g., no hashes, signatures). Attackers can compromise servers, spoof DNS, or alter code in transit, leading to unauthorized code execution. Examples include firmware updates without validation (CVE-2023-5984) and unverified auto-updates. Relation to Package Integrity and Supply Chain Attacks: These CWEs are key enablers in supply chain compromises (MITRE ATT&CK T1195), where attackers tamper with dependencies or updates. CWE-494 directly causes execution of tampered packages (e.g., malicious npm/PyPI uploads, dependency confusion CVE-2021-24105). CWE-610 facilitates pulling malicious packages via controlled references. OWASP links CWE-494 to supply chain risks (e.g., AST02, CICD-SEC-3), and broader contexts like OWASP Top 10 2025 A03: Software Supply Chain Failures emphasize such integrity failures. Mitigations: Verify signatures/hashes (avoid CWE-295/320), use trusted proxies/registries, SBOMs, and pinned versions.


🏁 Script executed:

cat images/runtime/ray/cpu/2.52.1-py311-cpu/Dockerfile | grep -A 5 "pip install"

Repository: opendatahub-io/distributed-workloads

Length of output: 299


🏁 Script executed:

grep "requires-hashes\|require-hashes\|--hash" images/runtime/ray/cpu/2.52.1-py311-cpu/requirements_compiled.txt

Repository: opendatahub-io/distributed-workloads

Length of output: 62


🌐 Web query:

IBM Developer First ppc64le wheels source repository trustworthiness

💡 Result:

The IBM Developer First ppc64le wheels source repository at https://wheels.developerfirst.ibm.com/ppc64le/linux is a trustworthy, official IBM-hosted DevPi server providing pre-built, optimized Python wheels for the ppc64le (IBM Power) architecture. IBM's Power team builds these open-source packages natively on Power hardware, focusing on AI/ML and data science libraries (e.g., torch, vllm, numpy), supporting Python 3.10-3.12. Usage: pip install --prefer-binary --extra-index-url=https://wheels.developerfirst.ibm.com/ppc64le/linux. Supporting resources include IBM community blogs by developers like Nikhil Kalbande and Gerrit Huizenga (2025 posts), GitHub ppc64le/pyeco repo for demos/issues, and ppc64le/build-scripts for related open-source builds. No security issues specific to this repo found; wheels carry original OSS licenses (e.g., Apache-2.0, MIT per IBM license doc). As an official IBM initiative under "Developer First" for Power, with active maintenance and community collaboration, it is highly trustworthy for production use on ppc64le systems.


Add hash verification to dependency resolution on lines 31-34 (CWE-494).

Exploit scenario: Version pinning in the constraints file provides no integrity protection; a compromised extra-index-url source or MITM attack can inject malicious wheels with matching versions.

Remediation: Generate a hash-locked requirements file using pip-compile --generate-hashes and use pip install --require-hashes -r requirements.txt to enforce cryptographic verification of all downloaded packages.
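Conceptually, what `--require-hashes` enforces per downloaded artifact is the check in this illustrative Python sketch (`verify_artifact` is a hypothetical name for exposition, not a pip API):

```python
import hashlib


def verify_artifact(data: bytes, expected_sha256: str) -> bool:
    # Mirrors pip's hash-checking mode for a single artifact:
    # compute the SHA-256 of the downloaded bytes and compare it
    # to the digest pinned in the requirements file.
    return hashlib.sha256(data).hexdigest() == expected_sha256


wheel = b"fake wheel bytes"
pinned = hashlib.sha256(wheel).hexdigest()

assert verify_artifact(wheel, pinned)          # untampered artifact passes
assert not verify_artifact(b"tampered", pinned)  # substituted artifact fails
```

With version pinning alone, the "tampered" case would still install; the hash check is what closes the CWE-494 gap.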

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@images/runtime/ray/cpu/2.52.1-py312-cpu/Dockerfile` around lines 31 - 34, The
current Dockerfile RUN pip install using -c ${CONSTRAINTS_FILE} and an
extra-index-url lacks hash verification; generate a hash-locked requirements
file (e.g., run pip-compile --generate-hashes against your constraints and
explicit dependencies like ray[default,data,train,tune,serve]==2.52.1 and
include any extra-index-url configuration) and update the Dockerfile to COPY
that generated requirements.txt into the image and replace the pip install line
with pip install --require-hashes -r requirements.txt (removing the insecure -c
${CONSTRAINTS_FILE} usage), ensuring all packages are installed only if their
cryptographic hashes match.


# Restore user workspace
USER 1001
WORKDIR /opt/app-root/src
10 changes: 10 additions & 0 deletions images/runtime/ray/cpu/2.52.1-py312-cpu/README.md
@@ -0,0 +1,10 @@
# Ray Runtime Container Image

CPU enabled container image for Ray in OpenShift AI.

⚠️ Potential issue | 🟡 Minor

Fix hyphenation in user-facing text on Line 3.

Use CPU-enabled instead of CPU enabled for correct compound adjective formatting.

🧰 Tools
🪛 LanguageTool

[grammar] ~3-~3: Use a hyphen to join words.
Context: # Ray Runtime Container Image CPU enabled container image for Ray in OpenS...

(QB_NEW_EN_HYPHEN)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@images/runtime/ray/cpu/2.52.1-py312-cpu/README.md` at line 3, Update the
user-facing text in README.md by replacing the phrase "CPU enabled container
image for Ray in OpenShift AI." with "CPU-enabled container image for Ray in
OpenShift AI." so the compound adjective is hyphenated correctly; locate the
sentence in the README.md (the line containing "CPU enabled container image for
Ray in OpenShift AI.") and apply the hyphenation change.


It includes the following layers:

* UBI 9
* Python 3.12
* CPU (no GPU acceleration)
* Ray 2.52.1