asimov is a Rust library for building high-performance, LLM-based applications. The crate is divided into the following modules:
| Module       | Description                                              |
|--------------|----------------------------------------------------------|
| `io`         | Structured input and output parsing, stream processing. |
| `db`         | Abstractions to interact with vector databases.          |
| `models`     | Abstractions to interact with LLMs.                      |
| `tokenizers` | Tokenizer utilities.                                     |
Here's a simple example using asimov to generate a response from an LLM. See the examples directory for more examples.
```rust
use asimov::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Fail fast if the OpenAI API key is not configured.
    std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY must be set");

    // Default OpenAI-backed model.
    let gpt35 = OpenAiLlm::default();

    // Generate a raw, unstructured response; RawString wraps the
    // returned text, so the inner value is accessed via `.0`.
    let response: RawString = gpt35.generate("How are you doing?").await?;
    println!("Response: {}", response.0);

    Ok(())
}
```

Other examples and the commands to run them:

- Structured parsing from streams:
  ```sh
  cargo run --example structured_streaming --features openai -- --nocapture
  ```

- Basic streaming (a minimal sketch follows this list):

  ```sh
  cargo run --example streaming --features openai -- --nocapture
  ```

- Chaining two LLMs:

  ```sh
  cargo run --example simple_chain --features openai
  ```

- Few-shot generation:

  ```sh
  cargo run --example simple_fewshot --features openai
  ```
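To give a feel for what the streaming example does, here is a minimal sketch. The `generate_stream` method and its chunk type are assumptions for illustration, not the confirmed API; see `examples/streaming.rs` for the real code.

```rust
use asimov::prelude::*;
use futures::StreamExt;
use std::io::Write;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY must be set");
    let llm = OpenAiLlm::default();

    // Hypothetical streaming call: the actual method name and chunk type
    // may differ; consult examples/streaming.rs for the real API.
    let mut stream = llm.generate_stream("Tell me a short story").await?;
    while let Some(chunk) = stream.next().await {
        print!("{}", chunk?);
        std::io::stdout().flush()?; // print tokens as they arrive
    }
    Ok(())
}
```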
Add asimov as a dependency in your Cargo.toml:

```sh
cargo add asimov
```

or

```toml
[dependencies]
asimov = "0.1.0"
```

To build from source:

```sh
git clone https://github.com/yourusername/asimov.git
cd asimov
cargo build
```

Tests can be run per module:

- I/O module:
  ```sh
  cargo test io
  ```

- Vector DB module:

  ```sh
  cargo test db
  ```

- Tokenizers module:

  ```sh
  cargo test tokenizers
  ```

- Models module (requires the `openai` feature):

  ```sh
  cargo test models --features openai
  ```

The following optional features can be enabled:
| Feature  | Description                               |
|----------|-------------------------------------------|
| `openai` | Enables use of the `async-openai` crate.  |
| `qdrant` | Enables use of the `qdrant-client` crate. |
To enable a feature, pass the `--features` flag when building or running:

```sh
cargo build --features "openai qdrant"
```
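Features can also be enabled directly in Cargo.toml. A minimal sketch using the feature names above (adjust the version to the release you depend on):

```toml
[dependencies]
asimov = { version = "0.1.0", features = ["openai", "qdrant"] }
```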