Text generation with an LLM via Ollama, using Spring AI to simplify AI integration in a Java Spring application.

Running the application

The application needs Ollama to provide LLMs. You can run Ollama locally on your laptop (macOS or Linux) by installing it as a native application.

First, make sure Ollama is installed on your laptop (macOS or Linux). Then use Ollama to pull and run the wizard-vicuna-uncensored:7b large language model:

```shell
ollama run wizard-vicuna-uncensored:7b
```
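The Spring AI Ollama starter reads its connection settings from the application configuration. A minimal sketch of what that configuration might look like, assuming the standard spring.ai.ollama.* property names and the port 9001 used in the call examples below (this file is illustrative, not taken from the repository):

```properties
# Hypothetical application.properties sketch; property names follow
# the Spring AI Ollama starter's conventions.
server.port=9001

# Where the local Ollama server listens (11434 is Ollama's default port).
spring.ai.ollama.base-url=http://localhost:11434

# The model the application asks Ollama to use.
spring.ai.ollama.chat.options.model=wizard-vicuna-uncensored:7b
```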

Finally, run the Spring Boot application.
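Assuming the project includes the standard Maven wrapper (adjust accordingly if it uses Gradle), the application can be started with:

```shell
./mvnw spring-boot:run
```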

Calling the application

Call the endpoints with httpie; GET is its default method, so no method flag is needed. Quote the second URL because the prompt contains spaces:

```shell
http :9001/ollama/promt

http ':9001/ollama/promt/Why is the sky blue?'
```
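For reference, here is a minimal sketch of a controller that could back these endpoints, assuming Spring AI's fluent ChatClient API; the class name, package, and builder injection are assumptions for illustration, not code from this repository:

```java
package com.example.ollama;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical controller sketch; the real repository may differ.
@RestController
@RequestMapping("/ollama")
public class OllamaController {

    private final ChatClient chatClient;

    // Spring AI auto-configures a ChatClient.Builder bound to the
    // Ollama model declared in the application configuration.
    public OllamaController(ChatClient.Builder chatClientBuilder) {
        this.chatClient = chatClientBuilder.build();
    }

    // GET /ollama/promt/{message} sends the message to the model
    // and returns the generated text as the response body.
    @GetMapping("/promt/{message}")
    public String prompt(@PathVariable String message) {
        return chatClient.prompt()
                .user(message)
                .call()
                .content();
    }
}
```

With this shape, the httpie calls above map directly onto the path variable: everything after /ollama/promt/ is passed to the model as the user message.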
