
Commit d83a193

Merge branch 'main' into update_agents
2 parents 6a01154 + a84ef19 commit d83a193

File tree: 207 files changed (+1500 −297 lines)


.github/workflows/scripts/pr_workflow.py

Lines changed: 2 additions & 2 deletions

```diff
@@ -1,3 +1,4 @@
+import logging
 import os
 import re
 import sys
@@ -23,9 +24,8 @@
 from github.PullRequest import PullRequest
 from github.Repository import Repository
 from github.Team import Team
-from simple_logger.logger import get_logger
 
-LOGGER = get_logger(name="pr_labeler")
+LOGGER = logging.getLogger("pr_labeler")
 
 
 class PrBaseClass:
```
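The swap from `simple_logger` to the standard library can be sketched as follows; the `basicConfig` call and format string are illustrative assumptions, not taken from this commit:

```python
import logging

# Module-level logger, mirroring the new pr_workflow.py pattern.
# The handler/format configuration below is an assumed example; the
# actual workflow script may configure logging elsewhere.
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
)
LOGGER = logging.getLogger("pr_labeler")

LOGGER.info("labeling pull request")
```

Unlike `simple_logger.logger.get_logger`, `logging.getLogger` takes the name positionally and returns an unconfigured logger until a handler is attached somewhere in the process.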

CONSTITUTION.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -79,6 +79,7 @@ All code MUST consider security implications.
 - Avoid running destructive commands without explicit user confirmation
 - Use detect-secrets and gitleaks pre-commit hooks to prevent secret leakage
 - Test code MUST NOT introduce vulnerabilities into the tested systems
+- Use `utilities.path_utils.resolve_repo_path` to resolve and validate any user-supplied or parameterized file paths, preventing path-traversal and symlink-escape outside the repository root
 - JIRA ticket links are allowed in PRs and commit messages (our Jira is public)
 - Do NOT reference internal-only resources (Jenkins, Confluence, Slack threads) in code, PRs, or commit messages
 - Do NOT link embargoed or security-restricted (RH-employee-only) tickets
```
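The implementation of `utilities.path_utils.resolve_repo_path` is not shown in this commit; a minimal sketch of what such a helper could look like (the signature and behavior below are assumptions, not the repository's actual code):

```python
from pathlib import Path


def resolve_repo_path(repo_root: str, user_path: str) -> Path:
    """Resolve user_path under repo_root, rejecting escapes.

    Hypothetical sketch; the real helper's signature may differ.
    """
    root = Path(repo_root).resolve()
    # resolve() follows symlinks, so a symlink pointing outside the
    # repository is also caught by the containment check below.
    candidate = (root / user_path).resolve()
    if not candidate.is_relative_to(root):  # Python 3.9+
        raise ValueError(f"path escapes repository root: {user_path}")
    return candidate
```

The key point is resolving *before* the containment check, so both `..` segments and symlink targets are compared against the real repository root.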

pyproject.toml

Lines changed: 2 additions & 2 deletions

```diff
@@ -6,7 +6,7 @@ output-format = "grouped"
 extend-exclude = ["utilities/manifests"]
 
 [tool.ruff.lint]
-external = ["E501"]
+external = ["E501", "FCN001"]
 
 [tool.ruff.format]
 exclude = [".git", ".venv", ".mypy_cache", ".tox", "__pycache__", "utilities/manifests"]
@@ -50,7 +50,7 @@ dependencies = [
     "openshift-python-utilities>=5.0.71",
     "pytest-dependency>=0.6.0",
     "pytest-progress",
-    "python-simple-logger",
+    "structlog>=24.1.0",
     "pyyaml",
     "tenacity",
     "types-requests>=2.32.0.20241016",
```

pytest.ini

Lines changed: 0 additions & 2 deletions

```diff
@@ -4,8 +4,6 @@ testpaths = tests
 
 markers =
     # General
-    polarion: Store polarion test ID
-
     skip_on_disconnected: Mark tests that can only be run in deployments with Internet access i.e. not on disconnected clusters.
     parallel: marks tests that can run in parallel along with pytest-xdist
```

tests/cluster_health/README.md (new file)

Lines changed: 56 additions & 0 deletions

# Cluster Health Tests

This directory contains foundational health check tests for OpenDataHub/RHOAI clusters. These tests serve as prerequisites to ensure the cluster and operators are in a healthy state before running more complex integration tests.

## Directory Structure

```text
cluster_health/
├── test_cluster_health.py    # Cluster node health validation
└── test_operator_health.py   # Operator and pod health validation
```

### Current Test Suites

- **`test_cluster_health.py`** - Validates that all cluster nodes are healthy and schedulable
- **`test_operator_health.py`** - Validates that the DSCInitialization and DataScienceCluster resources are ready and that all pods in the operator/application namespaces are running

## Test Markers

Tests use the following markers, defined in `pytest.ini`:

- `@pytest.mark.cluster_health` - Tests that verify the cluster is healthy enough to begin testing
- `@pytest.mark.operator_health` - Tests that verify the OpenDataHub/RHOAI operators are healthy and functioning correctly

## Test Details

### Cluster Node Health (`test_cluster_health.py`)

- **`test_cluster_node_healthy`** - Asserts that all cluster nodes have the `KubeletReady: True` condition and are schedulable (not cordoned)

### Operator Health (`test_operator_health.py`)

- **`test_data_science_cluster_initialization_healthy`** - Validates that the DSCInitialization resource reaches `READY` status (120s timeout)
- **`test_data_science_cluster_healthy`** - Validates that the DataScienceCluster resource reaches `READY` status (120s timeout)
- **`test_pods_cluster_healthy`** - Validates that all pods in the operator and application namespaces reach Running/Completed state (180s timeout). Parametrized across `operator_namespace` and `applications_namespace` from the global config

## Running Tests

### Run All Cluster Health Tests

```bash
uv run pytest tests/cluster_health/
```

### Run by Marker

```bash
# Run cluster node health tests
uv run pytest -m cluster_health

# Run operator health tests
uv run pytest -m operator_health

# Run both
uv run pytest -m "cluster_health or operator_health"
```
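The readiness checks described above all follow the same poll-until-ready pattern; a generic sketch of that pattern (`wait_for_ready` and its parameters are illustrative stand-ins, not the project's actual `wait_for_dsc_status_ready`/`wait_for_dsci_status_ready` helpers):

```python
import time


def wait_for_ready(get_status, timeout: float = 120.0, interval: float = 1.0) -> None:
    """Poll get_status() until it returns 'READY' or the timeout elapses.

    Illustrative stand-in for the repository's wait helpers; their real
    signatures are not shown in this commit.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if get_status() == "READY":
            return
        time.sleep(interval)
    raise TimeoutError(f"resource did not reach READY within {timeout}s")
```

The 120s/180s figures in the test descriptions would simply be different `timeout` arguments to such a helper.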

tests/cluster_health/test_cluster_health.py

Lines changed: 2 additions & 1 deletion

```diff
@@ -1,7 +1,8 @@
 import pytest
 from ocp_resources.node import Node
 from ocp_utilities.infra import assert_nodes_in_healthy_condition, assert_nodes_schedulable
-from simple_logger.logger import get_logger
+
+from utilities.opendatahub_logger import get_logger
 
 LOGGER = get_logger(name=__name__)
 
```
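The new `utilities/opendatahub_logger.py` module itself is not part of this diff; given the `structlog` dependency added in `pyproject.toml`, it plausibly wraps structlog or stdlib logging. A purely illustrative stdlib-based sketch of a drop-in `get_logger`:

```python
import logging


def get_logger(name: str) -> logging.Logger:
    """Return a configured stdlib logger.

    Hypothetical sketch of utilities.opendatahub_logger.get_logger;
    the real module (possibly structlog-based) is not in this diff.
    """
    logger = logging.getLogger(name)
    if not logger.handlers:  # avoid duplicate handlers on repeated calls
        handler = logging.StreamHandler()
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
        )
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger
```

Keeping the `get_logger(name=...)` keyword signature lets call sites like `LOGGER = get_logger(name=__name__)` migrate from `simple_logger` unchanged.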

tests/cluster_health/test_operator_health.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -3,10 +3,10 @@
 from ocp_resources.data_science_cluster import DataScienceCluster
 from ocp_resources.dsc_initialization import DSCInitialization
 from pytest_testconfig import config as py_config
-from simple_logger.logger import get_logger
 
 from utilities.general import wait_for_pods_running
 from utilities.infra import wait_for_dsc_status_ready, wait_for_dsci_status_ready
+from utilities.opendatahub_logger import get_logger
 
 LOGGER = get_logger(name=__name__)
 
```
tests/conftest.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -37,7 +37,6 @@
 from pytest import Config, FixtureRequest
 from pytest_testconfig import config as py_config
 from semver import Version
-from simple_logger.logger import get_logger
 
 from utilities.certificates_utils import create_ca_bundle_file
 from utilities.constants import (
@@ -65,6 +64,7 @@
 from utilities.logger import RedactedString
 from utilities.mariadb_utils import wait_for_mariadb_operator_deployments
 from utilities.minio import create_minio_data_connection_secret
+from utilities.opendatahub_logger import get_logger
 from utilities.operator_utils import get_cluster_service_version, get_csv_related_images
 from utilities.serving_runtime import get_runtime_image_from_template
 from utilities.user_utils import get_byoidc_issuer_url, get_oidc_tokens
```

tests/fixtures/README.md (new file)

Lines changed: 74 additions & 0 deletions

# Shared Test Fixtures

This directory contains shared pytest fixtures that are used across multiple test modules. These fixtures are automatically loaded via pytest's plugin mechanism, registered in `/tests/conftest.py`.

## Directory Structure

```text
fixtures/
├── files.py        # File storage provider fixtures
├── guardrails.py   # Guardrails orchestrator infrastructure fixtures
├── inference.py    # Inference service and serving runtime fixtures
├── trustyai.py     # TrustyAI operator and DSC configuration fixtures
└── vector_io.py    # Vector database provider deployment fixtures
```

### Fixture Modules

- **`files.py`** - Factory fixture for configuring file storage providers (local, S3/MinIO)
- **`guardrails.py`** - Fixtures for deploying and configuring the Guardrails Orchestrator, including pods, routes, health checks, and gateway configuration
- **`inference.py`** - Fixtures for vLLM CPU serving runtimes, InferenceServices (Qwen), the LLM-d inference simulator, and KServe controller configuration
- **`trustyai.py`** - Fixtures for TrustyAI operator deployment and DataScienceCluster LMEval configuration
- **`vector_io.py`** - Factory fixture for deploying vector database providers (Milvus, Faiss, PGVector, Qdrant) with their backing services and configuration

## Registration

All fixture modules are registered as pytest plugins in `/tests/conftest.py`:

```python
pytest_plugins = [
    "tests.fixtures.inference",
    "tests.fixtures.guardrails",
    "tests.fixtures.trustyai",
    "tests.fixtures.vector_io",
    "tests.fixtures.files",
]
```

## Usage

Fixtures are automatically available to all tests. Factory fixtures accept parameters via `pytest.mark.parametrize` with `indirect=True`.

### Vector I/O Provider Example

```python
@pytest.mark.parametrize(
    "vector_io_provider_deployment_config_factory",
    ["milvus", "pgvector", "qdrant-remote"],
    indirect=True,
)
def test_with_vector_db(vector_io_provider_deployment_config_factory):
    # Fixture deploys the provider and returns env var configuration
    ...
```

### Supported Vector I/O Providers

| Provider        | Type   | Description                                 |
| --------------- | ------ | ------------------------------------------- |
| `milvus`        | Local  | In-memory Milvus (no external dependencies) |
| `milvus-remote` | Remote | Milvus standalone with etcd backend         |
| `faiss`         | Local  | Facebook AI Similarity Search (in-memory)   |
| `pgvector`      | Local  | PostgreSQL with pgvector extension          |
| `qdrant-remote` | Remote | Qdrant vector database                      |

### Supported File Providers

| Provider | Description                        |
| -------- | ---------------------------------- |
| `local`  | Local filesystem storage (default) |
| `s3`     | S3/MinIO remote object storage     |

## Adding New Fixtures

When adding shared fixtures, place them in the appropriate module file (or create a new one), and register the new module in `/tests/conftest.py` under `pytest_plugins`. Follow the project's fixture conventions: use noun-based names, the narrowest appropriate scope, and context managers for resource lifecycle.

tests/fixtures/inference.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -13,7 +13,6 @@
 from ocp_resources.service import Service
 from ocp_resources.serving_runtime import ServingRuntime
 from pytest_testconfig import py_config
-from simple_logger.logger import get_logger
 from timeout_sampler import retry
 
 from utilities.constants import (
@@ -24,6 +23,7 @@
 )
 from utilities.inference_utils import create_isvc
 from utilities.infra import get_data_science_cluster, wait_for_dsc_status_ready
+from utilities.opendatahub_logger import get_logger
 from utilities.serving_runtime import ServingRuntimeFromTemplate
 
 LOGGER = get_logger(name=__name__)
```
