I've installed this plugin successfully using lazy.nvim, and I'm trying to point it at a locally hosted OpenAI-compatible API served by LM Studio. This is the configuration I'm using:
```lua
return {
  {
    'huggingface/llm.nvim',
    opts = {
      {
        api_token = nil,
        backend = "openai",
        model = "qwen2.5-coder-32b-instruct",
        url = "http://localhost:1234/v1/completions",
        tokenizer = {
          path = "/Users/[USERNAME]/.tokenizer.json"
        },
        request_body = {
          temperature = 0.2,
          top_p = 0.95,
        }
      }
    }
  }
}
```

The plugin is not communicating with my self-hosted API at all; instead it is (successfully) using the Hugging Face Inference API.
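One detail worth checking (this is my assumption, not confirmed behavior): lazy.nvim passes `opts` directly to the plugin's `setup()`, and llm.nvim expects a flat settings table there. In the configuration above, all the settings sit inside an extra anonymous table (`opts = { { ... } }`), so `setup()` may see none of them and silently fall back to its defaults, including the Hugging Face Inference API backend. A sketch with the extra nesting removed:

```lua
-- Sketch: same settings, but as a flat `opts` table, assuming
-- lazy.nvim forwards `opts` straight to require('llm').setup().
return {
  {
    'huggingface/llm.nvim',
    opts = {
      api_token = nil,
      backend = "openai",
      model = "qwen2.5-coder-32b-instruct",
      url = "http://localhost:1234/v1/completions",
      tokenizer = {
        path = "/Users/[USERNAME]/.tokenizer.json"
      },
      request_body = {
        temperature = 0.2,
        top_p = 0.95,
      }
    }
  }
}
```

If the nesting is intentional, an alternative is to call `require('llm').setup({ ... })` explicitly in a `config` function instead of relying on `opts`.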