This file provides guidance to agents or humans when working with code in this repository.
This is an OpenAI API client library written in Rust, designed with a modular architecture that separates concerns between request managers and type definitions. The library implements a chain-style API pattern similar to modern cloud SDKs.
openai-rust/src/
├── client.rs # Main Client with builder pattern
├── completion.rs # Chat completions request manager (singular)
├── model.rs # Models API request manager (singular)
├── response.rs # Responses API request manager (singular)
├── types/ # Type definitions (plural naming)
│ ├── completions.rs # Chat completion types
│ ├── models.rs # Models API types
│ └── responses.rs # Responses API types
Key Design Patterns:
- Business logic files use singular naming (`completion.rs`, `model.rs`, `response.rs`)
- Type definition files use plural naming (`completions.rs`, `models.rs`, `responses.rs`)
- Chain-style API: `client.completions().create(&request).await`
- Builder pattern: `Client::builder().api_key(key).base_url(url).build()?`
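The builder pattern above can be sketched roughly as follows. This is a minimal, self-contained illustration with stand-in types, not the crate's actual definitions; the default base URL and error type are assumptions.

```rust
// Sketch of the builder pattern: stand-in types, not the real crate API.
#[derive(Debug)]
pub struct Client {
    api_key: String,
    base_url: String,
}

#[derive(Default)]
pub struct ClientBuilder {
    api_key: Option<String>,
    base_url: Option<String>,
}

impl Client {
    pub fn builder() -> ClientBuilder {
        ClientBuilder::default()
    }
}

impl ClientBuilder {
    pub fn api_key(mut self, key: impl Into<String>) -> Self {
        self.api_key = Some(key.into());
        self
    }

    pub fn base_url(mut self, url: impl Into<String>) -> Self {
        self.base_url = Some(url.into());
        self
    }

    // `build()` is fallible (hence the trailing `?` in the usage above):
    // here a missing API key is the one error case.
    pub fn build(self) -> Result<Client, String> {
        Ok(Client {
            api_key: self.api_key.ok_or("api_key is required")?,
            base_url: self
                .base_url
                .unwrap_or_else(|| "https://api.openai.com/v1".to_string()),
        })
    }
}

fn main() {
    let client = Client::builder()
        .api_key("sk-test")
        .build()
        .unwrap();
    println!("base_url = {}, key length = {}", client.base_url, client.api_key.len());
}
```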
Each API endpoint has its own manager struct:
- `Completion<'a>` - handles chat completions (both regular and streaming)
- `Model<'a>` - handles model listing
- `Response<'a>` - handles the new OpenAI Responses API
Managers access client fields through `pub(crate)` getter methods: `api_key()`, `base_url()`, `http_client()`.
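The manager-plus-getters shape can be sketched like this. The types are simplified stand-ins (no HTTP client, and the real `create()` is async); only the names `Completion<'a>`, `api_key()`, and `base_url()` come from this document.

```rust
// Sketch of the manager pattern: Completion<'a> borrows the Client and
// reads its fields through pub(crate) getters. Stand-in types only.
pub struct Client {
    api_key: String,
    base_url: String,
}

impl Client {
    pub(crate) fn api_key(&self) -> &str {
        &self.api_key
    }

    pub(crate) fn base_url(&self) -> &str {
        &self.base_url
    }

    // Chain-style entry point: each call hands out a short-lived manager.
    pub fn completions(&self) -> Completion<'_> {
        Completion::new(self)
    }
}

pub struct Completion<'a> {
    client: &'a Client,
}

impl<'a> Completion<'a> {
    pub fn new(client: &'a Client) -> Self {
        Self { client }
    }

    // The real create() is async and performs the HTTP POST; this stub
    // just shows the endpoint URL the manager would target.
    pub fn endpoint(&self) -> String {
        format!("{}/chat/completions", self.client.base_url())
    }
}

fn main() {
    let client = Client {
        api_key: "sk-test".into(),
        base_url: "https://api.openai.com/v1".into(),
    };
    println!("{} (key length {})", client.completions().endpoint(), client.api_key().len());
}
```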
- All types are re-exported from `types::*` in `lib.rs`
- Reasoning field support: both `reasoning` and `reasoning_content` fields are supported via serde aliases
- Stream types use `async-stream` for Server-Sent Events parsing
# Build the library
cargo build
# Check compilation (faster)
cargo check
# Run specific examples
# Chat Completions examples
cargo run --bin completions-chat
cargo run --bin completions-streaming
# Models example
cargo run --bin models-example
# Responses API examples
cargo run --bin responses-chat
cargo run --bin responses-streaming
# Run example from its directory
cd examples/chat-completions && cargo run --bin completions-chat

This is a Cargo workspace with:
- Main library: `openai-rust/`
- Examples: `examples/*/` (each is its own crate)
To add a new API endpoint:
- Create type definitions in `types/[endpoint]s.rs` (plural)
- Create the request manager in `[endpoint].rs` (singular)
- Add a manager struct with `new()` and request methods
- Add a manager accessor method to `Client`
- Update `lib.rs` module declarations
- Create an example in `examples/[endpoint]/`
The library transparently handles both the `reasoning` and `reasoning_content` field names emitted by different OpenAI-compatible providers using serde field aliases; users only need to read the unified `reasoning` field.
Examples expect:
- `OPENAI_API_KEY` - API key
- `OPENAI_BASE_URL` - Base URL (optional; defaults vary by example)

Load these via a `.env` file using `dotenvy::dotenv()`.
Core Reference Materials:
- `openapi.documented.yml` - Official OpenAI API specification (primary reference)
- OpenAI official documentation at platform.openai.com
- DeepSeek and OpenRouter documentation for compatibility verification
Development Workflow:
- Compilation: always compile after code changes with `cargo check` or `cargo build`
- Testing: write unit tests for new functionality; run with `cargo test`
- Examples: create an example for each API endpoint in `examples/[name]/`
- Documentation: keep docs concise and current; ask before modifying documentation
API Coverage Status: Chat Completions, Models List, and the Responses API are implemented. Remaining APIs from openapi.yml: Assistants, Audio, Embeddings, Files, Images, Fine-tuning, Batch, Moderations, etc.
Compatibility Notes: Ensure compatibility with OpenAI-compatible providers (DeepSeek, OpenRouter) by supporting variant field names and response formats.