brew config AND brew doctor output OR brew gist-logs <formula> link
brew config
HOMEBREW_VERSION: 5.0.8
ORIGIN: https://github.com/Homebrew/brew
HEAD: b473452cf25e5f70522229d1472948584b74736b
Last commit: 4 days ago
Branch: stable
Core tap JSON: 01 Jan 04:26 UTC
HOMEBREW_PREFIX: /home/linuxbrew/.linuxbrew
HOMEBREW_CASK_OPTS: []
HOMEBREW_DISPLAY: :0
HOMEBREW_DOWNLOAD_CONCURRENCY: 32
HOMEBREW_EDITOR: vim
HOMEBREW_FORBID_PACKAGES_FROM_PATHS: set
HOMEBREW_MAKE_JOBS: 16
HOMEBREW_NO_ANALYTICS: set
HOMEBREW_NO_AUTO_UPDATE: set
HOMEBREW_NO_ENV_HINTS: set
Homebrew Ruby: 3.4.8 => /home/linuxbrew/.linuxbrew/Homebrew/Library/Homebrew/vendor/portable-ruby/3.4.8/bin/ruby
CPU: 16-core 64-bit rocketlake
Clang: N/A
Git: 2.43.0 => /bin/git
Curl: 8.5.0 => /bin/curl
Kernel: Linux 6.6.87.2-microsoft-standard-WSL2 x86_64 GNU/Linux
OS: Ubuntu 24.04.3 LTS (noble)
WSL: 2 (Microsoft Store)
Host glibc: 2.39
Host libstdc++: 6.0.33
/usr/bin/gcc: 13.3.0
/usr/bin/ruby: N/A
glibc: N/A
gcc@12: N/A
gcc: N/A
xorg: N/A
brew doctor
Your system is ready to brew.
brew gist-logs ollama
Error: No logs.
Verification
- My brew doctor output says Your system is ready to brew. and am still able to reproduce my issue.
- I ran brew update and am still able to reproduce my issue.
- I have resolved all warnings from brew doctor and that did not fix my problem.
- I searched for recent similar issues at https://github.com/Homebrew/homebrew-core/issues?q=is%3Aissue and found no duplicates.
- My issue is not about a failure to build a formula from source.
What were you trying to do (and why)?
I want to use ollama to run a local LLM.
What happened (include all command output)?
/home/linuxbrew/.linuxbrew/bin/ollama serve 2>&1 | grep "inference compute"
time=2026-01-02T06:12:04.010-03:00 level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="15.5 GiB" available="14.9 GiB"
The GPU is not detected.
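For reference, the GPU passthrough itself can be checked from inside WSL2 with the usual NVIDIA tools; the paths below are the standard WSL2 locations and may differ on other setups.
# confirm the Windows NVIDIA driver is visible inside WSL2
nvidia-smi
ls /usr/lib/wsl/lib/libcuda.so*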
What did you expect to happen?
/usr/local/bin/ollama serve 2>&1 | grep "inference compute"
time=2026-01-02T06:12:49.162-03:00 level=INFO source=types.go:42 msg="inference compute" id=GPU-49b4f306-b610-0f3c-ff52-e2c0a86020bd filter_id="" library=CUDA compute=7.5 name=CUDA0 description="NVIDIA GeForce RTX 2080 Ti" libdirs=ollama,cuda_v13 driver=13.0 pci_id=0000:01:00.0 type=discrete total="11.0 GiB" available="9.8 GiB"
Using ollama from https://ollama.com, the GPU is correctly detected.
Step-by-step reproduction instructions (by running brew commands)
# install
brew install ollama
# use the version installed by Homebrew
/home/linuxbrew/.linuxbrew/bin/ollama serve 2>&1 | grep "inference compute"

# install
curl -fsSL https://ollama.com/install.sh | sh
# use the version installed by the script
/usr/local/bin/ollama serve 2>&1 | grep "inference compute"

From what I can gather, the version of ollama provided by Homebrew does not include any GGML backends.
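One way to sanity-check that is to compare what the two installs ship on disk. This is only a sketch: it assumes the bundled backends show up as libggml-* shared objects (as in recent upstream ollama releases) and that the install script keeps its libraries under /usr/local/lib/ollama.
# backends shipped by the Homebrew-installed ollama
find "$(brew --prefix ollama)" -name 'libggml*'
# backends shipped by the ollama.com install script
find /usr/local/lib/ollama -name 'libggml*'
If the Homebrew tree turns up no libggml-cuda* while /usr/local/lib/ollama does, that would support the missing-GGML-backends theory.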