
Commit fa94225

Merge pull request opendatahub-io#7 from zdtsw-forking/chore_fix_lint
update: konflux build
2 parents: 05dfa52 + 9c641c4

File tree: 4 files changed (+1539 −16 lines)

services/uds_tokenizer/Dockerfile.konflux

Lines changed: 10 additions & 1 deletion
```diff
@@ -24,7 +24,7 @@ WORKDIR /app
 # Run as root for build stage
 USER root
 
-# use requirments.txt for konflux
+# use requirements.txt for konflux
 COPY requirements.txt /app/requirements.txt
 RUN pip install --no-cache-dir -r requirements.txt
 
@@ -72,3 +72,12 @@ EXPOSE ${HEALTH_PORT}
 
 # Startup command: run direct gRPC server
 CMD ["python", "/app/run_grpc_server.py"]
+
+LABEL com.redhat.component="odh-llm-d-kv-cache-rhel9" \
+      name="odh-llm-d-kv-cache-rhel9" \
+      description="odh-llm-d-kv-cache" \
+      summary="odh-llm-d-kv-cache" \
+      maintainer="['managed-open-data-hub@redhat.com']" \
+      io.k8s.display-name="odh-llm-d-kv-cache" \
+      io.k8s.description="odh-llm-d-kv-cache" \
+      com.redhat.license_terms="https://www.redhat.com/licenses/Red_Hat_Standard_EULA_20191108.pdf"
```
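The LABEL instruction above spans multiple lines via backslash continuations: every line except the last must end in `\`, or Docker parses the remainder as a separate (invalid) instruction. A minimal sketch of that line-joining behavior (the `logical_lines` helper is hypothetical, not part of this commit):

```python
# Hypothetical helper (not part of this commit): join backslash-continued
# Dockerfile lines into single logical instructions, so a multi-line LABEL
# can be sanity-checked for a missing continuation backslash.
def logical_lines(dockerfile_text):
    """Merge lines ending in '\\' into single logical instructions."""
    merged, buffer = [], ""
    for raw in dockerfile_text.splitlines():
        line = raw.rstrip()
        if line.endswith("\\"):
            buffer += line[:-1] + " "   # continuation: keep accumulating
        else:
            merged.append((buffer + line).strip())
            buffer = ""
    if buffer:
        merged.append(buffer.strip())
    return [l for l in merged if l]

snippet = """\
LABEL com.redhat.component="odh-llm-d-kv-cache-rhel9" \\
      name="odh-llm-d-kv-cache-rhel9" \\
      io.k8s.description="odh-llm-d-kv-cache" \\
      com.redhat.license_terms="https://www.redhat.com/licenses/Red_Hat_Standard_EULA_20191108.pdf"
"""
# All four key=value pairs collapse into one LABEL instruction.
print(len(logical_lines(snippet)))  # -> 1
```

Dropping the `\` on any interior line splits the instruction in two, which is exactly the failure mode a lint pass on this Dockerfile would catch.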

services/uds_tokenizer/README.md

Lines changed: 43 additions & 5 deletions
````diff
@@ -246,6 +246,40 @@ From the repository root:
 make uds-tokenizer-service-test
 ```
 
+## Building Container Images
+
+The project provides two Dockerfiles for different deployment targets:
+
+### ODH (Open Data Hub) Build
+
+Uses standard Python base image and `pyproject.toml`:
+
+```bash
+podman build -f Dockerfile .
+```
+
+### RHDS Build
+
+Uses Red Hat UBI base image and `requirements.txt`:
+
+```bash
+podman build -f Dockerfile.konflux .
+```
+
+### Dependency Management
+
+- **pyproject.toml**: Defines direct dependencies for the project
+- **requirements.txt**: Generated from `uv.lock` for Konflux builds only
+
+To regenerate `requirements.txt` after updating dependencies:
+
+```bash
+# Update dependencies in pyproject.toml
+# Then regenerate the lock file and requirements.txt
+uv lock
+uv export --format requirements-txt --no-hashes --output-file requirements.txt
+```
+
 ## Kubernetes Deployment
 
 The service is designed to run in Kubernetes with:
@@ -273,21 +307,25 @@ See [tokenizers/README.md](tokenizers/README.md) for detailed information about
 ```
 ├── run_grpc_server.py          # Main gRPC server entry point
 ├── tokenizer_grpc_service.py   # gRPC service implementation
-├── pyproject.toml              # Dependencies and package config
+├── Dockerfile                  # Container build file (ODH)
+├── Dockerfile.konflux          # Container build file (RHDS/Konflux)
+├── pyproject.toml              # Direct dependencies (used by Dockerfile)
+├── requirements.txt            # Generated from uv.lock (used by Dockerfile.konflux)
 ├── tokenizer_service/          # Core tokenizer service implementation
 │   ├── __init__.py
 │   ├── tokenizer.py            # Tokenizer service implementation
 │   └── exceptions.py           # Custom exceptions
-├── tokenizerpb/                # gRPC service definition
+├── tokenizerpb/                # gRPC service definition
 │   ├── tokenizer_pb2_grpc.py
 │   └── tokenizer_pb2.py
 ├── utils/                      # Utility functions
 │   ├── __init__.py
-│   └── logger.py               # Logger functionality
+│   ├── logger.py               # Logger functionality
+│   └── thread_pool_utils.py    # Thread pool utilities
 ├── tests/                      # Test files
 │   ├── __init__.py
-│   ├── conftest.py             # Shared fixtures (in-process gRPC server)
-│   └── test_integration.py     # Integration tests (pytest)
+│   ├── conftest.py             # Shared fixtures (in-process gRPC server)
+│   └── test_integration.py     # Integration tests (pytest)
 ├── tokenizers/                 # Tokenizer files (downloaded automatically)
 └── README.md                   # This file
 ```
````
services/uds_tokenizer/requirements.txt

Lines changed: 122 additions & 10 deletions
```diff
@@ -1,13 +1,125 @@
-pydantic==2.11.7
-shortuuid==1.0.13
-transformers==4.53.0
-safetensors==0.5.3
-Jinja2==3.1.6
-modelscope
-huggingface-hub
+# This file was autogenerated by uv via the following command:
+#    uv export --format requirements-txt --no-hashes --output-file requirements.txt
 aiohttp==3.9.5
-protobuf==6.31.1
-tiktoken>=0.7.0
+    # via uds-tokenizer
+aiosignal==1.4.0
+    # via aiohttp
+annotated-types==0.7.0
+    # via pydantic
+attrs==25.4.0
+    # via aiohttp
+certifi==2026.2.25
+    # via requests
+charset-normalizer==3.4.4
+    # via requests
+colorama==0.4.6 ; sys_platform == 'win32'
+    # via tqdm
+filelock==3.25.0
+    # via
+    #   huggingface-hub
+    #   modelscope
+    #   transformers
+frozenlist==1.8.0
+    # via
+    #   aiohttp
+    #   aiosignal
+fsspec==2026.2.0
+    # via huggingface-hub
 grpcio==1.76.0
+    # via
+    #   grpcio-reflection
+    #   grpcio-tools
+    #   uds-tokenizer
+grpcio-reflection==1.76.0
+    # via uds-tokenizer
 grpcio-tools==1.76.0
-grpcio-reflection==1.76.0
+    # via uds-tokenizer
+hf-xet==1.3.2 ; platform_machine == 'aarch64' or platform_machine == 'amd64' or platform_machine == 'arm64' or platform_machine == 'x86_64'
+    # via huggingface-hub
+huggingface-hub==0.36.2
+    # via
+    #   tokenizers
+    #   transformers
+    #   uds-tokenizer
+idna==3.11
+    # via
+    #   requests
+    #   yarl
+jinja2==3.1.6
+    # via uds-tokenizer
+markupsafe==3.0.3
+    # via jinja2
+modelscope==1.34.0
+    # via uds-tokenizer
+multidict==6.7.1
+    # via
+    #   aiohttp
+    #   yarl
+numpy==2.4.2
+    # via transformers
+packaging==26.0
+    # via
+    #   huggingface-hub
+    #   transformers
+propcache==0.4.1
+    # via yarl
+protobuf==6.31.1
+    # via
+    #   grpcio-reflection
+    #   grpcio-tools
+    #   uds-tokenizer
+pydantic==2.11.7
+    # via uds-tokenizer
+pydantic-core==2.33.2
+    # via pydantic
+pyyaml==6.0.3
+    # via
+    #   huggingface-hub
+    #   transformers
+regex==2026.2.28
+    # via
+    #   tiktoken
+    #   transformers
+requests==2.32.5
+    # via
+    #   huggingface-hub
+    #   modelscope
+    #   tiktoken
+    #   transformers
+safetensors==0.5.3
+    # via
+    #   transformers
+    #   uds-tokenizer
+setuptools==82.0.0
+    # via
+    #   grpcio-tools
+    #   modelscope
+shortuuid==1.0.13
+    # via uds-tokenizer
+tiktoken==0.12.0
+    # via uds-tokenizer
+tokenizers==0.21.4
+    # via transformers
+tqdm==4.67.3
+    # via
+    #   huggingface-hub
+    #   modelscope
+    #   transformers
+transformers==4.53.0
+    # via uds-tokenizer
+typing-extensions==4.15.0
+    # via
+    #   aiosignal
+    #   grpcio
+    #   huggingface-hub
+    #   pydantic
+    #   pydantic-core
+    #   typing-inspection
+typing-inspection==0.4.2
+    # via pydantic
+urllib3==2.6.3
+    # via
+    #   modelscope
+    #   requests
+yarl==1.23.0
+    # via aiohttp
```
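`uv export` annotates every pin with `# via` comments recording why it is present, which makes it possible to tell direct dependencies of the project apart from transitive ones. A rough sketch of reading that provenance (the `direct_requirements` helper and `SAMPLE` text are illustrative, not part of this commit):

```python
# Hypothetical sketch (not part of this commit): given uv-export-style
# requirements text, collect the pins whose "# via" provenance names the
# project itself -- i.e. the direct dependencies.
SAMPLE = """\
aiohttp==3.9.5
    # via uds-tokenizer
aiosignal==1.4.0
    # via aiohttp
grpcio==1.76.0
    # via
    #   grpcio-reflection
    #   uds-tokenizer
"""

def direct_requirements(text, project="uds-tokenizer"):
    direct, current = [], None
    for line in text.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("#"):
            current = stripped.split(";")[0].strip()  # drop env markers
        elif current and stripped in (f"# via {project}", f"#   {project}"):
            direct.append(current)
    return direct

print(direct_requirements(SAMPLE))  # -> ['aiohttp==3.9.5', 'grpcio==1.76.0']
```

This kind of check is one way to confirm that the pins `pyproject.toml` declares directly all survive a regeneration of `requirements.txt`.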
