Liminal is a work in progress. The documentation may not reflect the current implementation.
Liminal is a toolkit for composing conversation trees with language models using TypeScript iterators. It provides building blocks for appending messages, inferring replies, focusing language models, emitting events to observers, and branching conversation strands.
- Documentation → Usage guide intended for human readers.
- Examples → Examples illustrating common use cases.
- llms.txt → Chunks of truth to be fed into LLMs.
Model a conversation as a generator function. Yield Liminal "runes" to interact with the underlying state of the conversation strand.
```ts
import { L } from "liminal"
import { adapter } from "liminal-ollama"

await L.run(
  function*() {
    // Kick off the conversation.
    yield* L.focus(adapter("gemma3:1b"))
    yield* L.user`Decide on a topic for us to discuss.`
    yield* L.assistant
    // Loop through some conversation turns.
    let i = 0
    while (i++ < 3) {
      // Have the language model respond to itself in an isolated copy of the current "strand."
      const reply = yield* L.strand(function*() {
        yield* L.user`Please reply to the last message on my behalf.`
        return yield* L.assistant
      })
      // Use the child strand's return value to append a user message within the root "strand."
      yield* L.user(reply)
      yield* L.assistant
    }
    yield* L.user`Summarize key points from our conversation.`
    return yield* L.assistant
  },
  { handler: console.log },
)
```
Note: `async function*() { /* ... */ }` is perfectly valid if you wish to await promises.
- Clone and build Liminal.

  ```sh
  git clone [email protected]:harrysolovay/liminal.git
  cd liminal
  bun i
  bun run build
  ```

- Configure any environment variables used by the example.

- Run the example script.

  ```sh
  bun examples/<path-to-example>
  ```
Please ensure you adhere to our code of conduct when interacting in this repository.
Contributions are welcome and appreciated! Check out the contributing guide before you dive in.
Liminal is Apache-licensed.