
Commit c33f74d

Bump to 0.5.18
- Fix image token counting (base64 no longer counted as text tokens)
- Disable CacheAligner (was inflating tokens and breaking prefix caching)
- Fix WebSocket SSL on Windows (native websockets SSL handling)
- Add transforms_summary to API responses
- Extract ProxyConfig to proxy/models.py (server.py refactor step 1)
1 parent f40dc2f commit c33f74d
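The commit bumps the version string in two places (`headroom/__init__.py` and `pyproject.toml`), and the two must stay in sync since different tools read one or the other. A minimal sketch (not part of the repo) of comparing such dotted version strings numerically rather than lexically:

```python
def parse_version(v: str) -> tuple[int, ...]:
    # Split "0.5.18" into (0, 5, 18) so comparison is numeric,
    # not lexicographic ("0.5.9" < "0.5.18" holds correctly).
    return tuple(int(part) for part in v.split("."))

old, new = "0.5.17", "0.5.18"
assert parse_version(new) > parse_version(old)
```

For real projects, the `packaging.version.Version` class handles pre-releases and other PEP 440 forms that this sketch does not.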

File tree

2 files changed: +2 additions, -2 deletions


headroom/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -153,7 +153,7 @@
     TransformPipeline,
 )
 
-__version__ = "0.5.17"
+__version__ = "0.5.18"
 
 __all__ = [
     # Main client

pyproject.toml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ build-backend = "hatchling.build"
 
 [project]
 name = "headroom-ai"
-version = "0.5.17"
+version = "0.5.18"
 description = "The Context Optimization Layer for LLM Applications - Cut costs by 50-90%"
 readme = "README.md"
 license = "Apache-2.0"

0 commit comments
