Labels: bug (Something isn't working)
Description
Describe the bug
Ollama doesn't see the AMD plugin even though it is installed, so the GPU is not used.
Workaround
If I manually copy the folder
/var/lib/flatpak/runtime/com.jeffser.Alpaca.Plugins.AMD/x86_64/stable/active/files/lib/ollama/rocm
into
/var/lib/flatpak/runtime/com.jeffser.Alpaca.Plugins.Ollama/x86_64/stable/active/files/lib/ollama/
Ollama detects the GPU again.
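For reference, the manual copy described above can be done like this (a sketch, not an official fix; the paths are the ones from this report, and root is assumed because /var/lib/flatpak is system-owned):

```shell
# Copy the ROCm libraries from the AMD plugin runtime into the Ollama plugin
# runtime so Ollama can find them (workaround from this report, not a fix).
sudo cp -r \
  /var/lib/flatpak/runtime/com.jeffser.Alpaca.Plugins.AMD/x86_64/stable/active/files/lib/ollama/rocm \
  /var/lib/flatpak/runtime/com.jeffser.Alpaca.Plugins.Ollama/x86_64/stable/active/files/lib/ollama/
```

Note that this copy lives inside the runtime's files and will likely be wiped by the next `flatpak update`, so it would need to be repeated after updates.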
There also seems to be another issue with the ROCm libraries: Ollama crashes with large models at the point where the model should overflow into RAM:
rocblaslt error: Could not load /app/plugins/Ollama/lib/ollama/rocm/hipblaslt/library/TensileLibrary_lazy_gfx1201.dat
hipBLASLt error: Heuristic Fetch Failed!
This message will be only be displayed once, unless the ROCBLAS_VERBOSE_HIPBLASLT_ERROR environment variable is set.
rocBLAS warning: hipBlasLT failed, falling back to tensile.
This message will be only be displayed once, unless the ROCBLAS_VERBOSE_TENSILE_ERROR environment variable is set.
graph_reserve: failed to allocate compute buffers
SIGSEGV: segmentation violation
PC=0x7f5813988faa m=9 sigcode=1 addr=0x7f5a02823ce8
signal arrived during cgo execution
barstown