Do I need to deploy an LLM as an accessory (to build a simple chatbot)? #600

Answered by charnould
charnould asked this question in Q&A

Just create an accessory (and boot it):

# in deploy.yml
accessories:
  ollama:
    service: ollama      # container name, so the app can reach it at http://ollama:11434
    image: ollama/ollama
    host: your-ip        # IP of the server that should run Ollama

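Then boot it on the server (the accessory name matches the key in deploy.yml):

kamal accessory boot ollama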
Then in app code:

export const my_function = async () => {
  // Pull (download) the model onto the Ollama accessory before first use
  const res = await fetch('http://ollama:11434/api/pull', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: '...' })
  })
  if (!res.ok) throw new Error(`model pull failed: ${res.status}`)
}
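
Once the model is pulled, the chatbot itself can call Ollama's /api/chat endpoint the same way. A minimal non-streaming sketch (the chat function name and prompt parameter are illustrative; the model placeholder is whatever you pulled above):

export const chat = async (prompt: string) => {
  const res = await fetch('http://ollama:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: '...',                                  // same model as pulled above
      messages: [{ role: 'user', content: prompt }], // prior turns can be appended here
      stream: false                                  // single JSON response instead of a stream
    })
  })
  const data = await res.json()
  return data.message.content                        // the assistant's reply
}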
