A Rust client library for the OpenAI API. This library provides a native Rust interface for interacting with OpenAI's services, including support for OpenAI-compatible providers like DeepSeek and OpenRouter.
- Chain-style API for easy interaction
- Builder pattern for client configuration
- Streaming support using async streams
- Support for the completions, models, and responses APIs
- Compatible with OpenAI, DeepSeek, OpenRouter, and more
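The chain-style API and builder-pattern configuration mentioned above follow a common Rust idiom. As an illustrative sketch only (the `MyClient`/`MyClientBuilder` names below are made up for demonstration and are not this crate's actual types), a consuming builder with chained setters looks like this:

```rust
// Illustrative sketch of builder-pattern client configuration.
// `MyClient` and `MyClientBuilder` are hypothetical names, not this crate's real API.
#[derive(Debug)]
struct MyClient {
    api_key: String,
    base_url: String,
}

struct MyClientBuilder {
    api_key: Option<String>,
    base_url: String,
}

impl MyClient {
    fn builder() -> MyClientBuilder {
        MyClientBuilder {
            api_key: None,
            // Defaulting the base URL is what makes OpenAI-compatible
            // providers a one-line override.
            base_url: "https://api.openai.com/v1".to_string(),
        }
    }
}

impl MyClientBuilder {
    // Each setter consumes and returns the builder, enabling method chaining.
    fn api_key(mut self, key: &str) -> Self {
        self.api_key = Some(key.to_string());
        self
    }

    fn base_url(mut self, url: &str) -> Self {
        self.base_url = url.to_string();
        self
    }

    // `build` validates required fields and produces the configured client.
    fn build(self) -> Result<MyClient, String> {
        Ok(MyClient {
            api_key: self.api_key.ok_or("api_key is required")?,
            base_url: self.base_url,
        })
    }
}

fn main() {
    // Chained configuration, e.g. pointing at an OpenAI-compatible provider.
    let client = MyClient::builder()
        .api_key("sk-test")
        .base_url("https://api.deepseek.com/v1")
        .build()
        .unwrap();
    println!("{:?}", client);
}
```

Consuming `self` in each setter (rather than taking `&mut self`) is what allows the calls to be chained in a single expression ending in `build()`.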
Add this to your `Cargo.toml`:

```toml
[dependencies]
openai-rust = { git = "https://github.com/wanggang316/openai-rust", tag = "v0.1.0" }
```

Basic usage:

```rust
use openai_rust::Client;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::builder()
        .api_key("your-api-key")
        .build()?;

    // `request` is a completion request you have built beforehand.
    let response = client.completions()
        .create(&request)
        .await?;

    println!("{:?}", response);
    Ok(())
}
```

Requirements:

- Rust 1.85 or later
- Cargo (comes with Rust)
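The streaming feature listed above delivers the response as incremental chunks rather than one final value. As a rough std-only sketch of the consumption pattern (a plain iterator stands in for the async `Stream`, and `Chunk` is a made-up type, not this crate's actual response type):

```rust
// Std-only sketch of consuming a streamed completion.
// `Chunk` is hypothetical; the real API yields typed chunks from an async stream.
#[derive(Debug)]
struct Chunk {
    delta: String,
}

fn main() {
    // Stand-in for chunks arriving incrementally from the server.
    let stream = vec![
        Chunk { delta: "Hello".to_string() },
        Chunk { delta: ", ".to_string() },
        Chunk { delta: "world!".to_string() },
    ];

    let mut full_text = String::new();
    for chunk in stream {
        // With an async stream this loop would be
        // `while let Some(chunk) = stream.next().await { ... }`.
        print!("{}", chunk.delta); // render tokens as they arrive
        full_text.push_str(&chunk.delta);
    }
    println!();
    assert_eq!(full_text, "Hello, world!");
}
```

The point of the pattern is that partial output can be shown to the user immediately while the full text is accumulated on the side.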
Build the project:

```sh
cargo build --release
```

Run the examples:

```sh
cargo run --example chat-completions
cargo run --example streaming
cargo run --example models
```

Run the tests:

```sh
cargo test
```

See CONTRIBUTING.md for contribution guidelines and AGENTS.md for project-specific rules (for both humans and agents).
This project is licensed under the MIT License.
Built against the official OpenAI API specification.