Local LLM on iOS? #1032
Is it possible to use a local LLM on an iOS/iPadOS device?
Answered by logancyang on Jan 10, 2025
You can run Ollama or LM Studio on a desktop and serve it over the local network. That's the only option I know of, but there may be other ways.
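As a minimal sketch of that setup with Ollama: by default it only listens on localhost, so you need to bind it to all interfaces before devices on the same Wi-Fi network can reach it. The IP address and model name below are placeholders for illustration; substitute your desktop's actual LAN IP and whichever model you have pulled.

```shell
# On the desktop: bind Ollama to all network interfaces
# (the default is 127.0.0.1, which iOS devices cannot reach)
OLLAMA_HOST=0.0.0.0 ollama serve

# From the iOS/iPadOS side, any app that lets you configure a custom
# API endpoint (or a Shortcuts "Get Contents of URL" action) can then
# call the desktop's LAN IP on Ollama's default port 11434.
# Example request (192.168.1.50 and llama3 are placeholders):
curl http://192.168.1.50:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

LM Studio offers a similar option: its local server can be started from the app and configured to accept connections from the network rather than localhost only.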