Bug Description
The Ollama Embeddings component does not list the installed Ollama models in its model dropdown.
Reproduction
The Ollama Embeddings component shows no models in its dropdown even though Ollama is running and reports installed models:
curl http://localhost:11434/
Ollama is running
curl http://localhost:11434/api/tags
{
  "models": [
    {
      "name": "gemma3:4b",
      "model": "gemma3:4b",
      "modified_at": "2025-12-21T15:43:44.3658294+05:30",
      "size": 3338801804,
      "digest": "a2af6cc3eb7fa8be8504abaf9b04e88f17a119ec3f04a3addf55f92841195f5a",
      "details": {
        "parent_model": "",
        "format": "gguf",
        "family": "gemma3",
        "families": [
          "gemma3"
        ],
        "parameter_size": "4.3B",
        "quantization_level": "Q4_K_M"
      }
    },
    {
      "name": "deepseek-r1:8b",
      "model": "deepseek-r1:8b",
      "modified_at": "2025-12-21T11:44:46.4384762+05:30",
      "size": 5225376047,
      "digest": "6995872bfe4c521a67b32da386cd21d5c6e819b6e0d62f79f64ec83be99f5763",
      "details": {
        "parent_model": "",
        "format": "gguf",
        "family": "qwen3",
        "families": [
          "qwen3"
        ],
        "parameter_size": "8.2B",
        "quantization_level": "Q4_K_M"
      }
    }
  ]
}
There are two models installed, yet the embedding component does not show them. I refreshed multiple times, but it still does not work.
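For comparison, here is a minimal Python sketch (not Langflow code, just an illustration against the same endpoint) that reads /api/tags and prints the model names the dropdown should be populated from; the base URL is assumed to be the default http://localhost:11434 used in the curl commands above.

# Minimal sketch: query the same Ollama /api/tags endpoint the component is
# expected to read and print the installed model names. BASE_URL is assumed
# to match the default local Ollama address shown in the curl output above.
import json
from urllib.request import urlopen

BASE_URL = "http://localhost:11434"

with urlopen(f"{BASE_URL}/api/tags") as resp:
    tags = json.load(resp)

# Each entry in "models" carries the name that should appear in the dropdown.
for model in tags.get("models", []):
    print(model["name"])

# Output with the installation above:
#   gemma3:4b
#   deepseek-r1:8b

Both installed models are returned by the endpoint, so the data the dropdown should be populated from is available.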
Expected behavior
The installed models should appear in the component's model dropdown.
Who can help?
No response
Operating System
Windows
Langflow Version
latest
Python Version
None
Screenshot
No response
Flow File
No response