AI chat with Ollama model using Spring AI

Java CI with Gradle

Prerequisites

  • JDK 21
  • Gradle isn't required: the project ships the Gradle wrapper
  • Docker (used by unit tests and as the preferred way to start Ollama)
  • Ollama (install it from the official site or run it inside a Docker container)
  • NVIDIA GPU (recommended; tested on a GeForce RTX 3060 12 GB)
    • On Linux, install nvidia-container-toolkit to use the GPU inside Docker, as described in the linked article

How to build the application

./gradlew clean build

How to start Docker containers with the Ollama model and the Spring AI application

docker compose -f docker-compose-prod.yml up

or use the run-all.bat script
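For reference, here is a minimal sketch of what a compose file like docker-compose-prod.yml could contain. The service names, build context, and port mappings below are assumptions for illustration, not the repository's actual file; only the Ollama image, the Ollama port 11434, and the application port 8090 come from this README.

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama   # persist pulled models between restarts
    ports:
      - "11434:11434"

  app:
    build: .                   # assumed: Spring AI application built from a local Dockerfile
    ports:
      - "8090:8090"            # port used by the curl examples in this README
    depends_on:
      - ollama

volumes:
  ollama:
```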

How to ask the AI model a question

Send a POST request to the /api/generate endpoint exposed by the service, with the question in the prompt field of the request body. For example:

curl -i -H 'Content-Type: application/json' \
  -d '{ "prompt": "Tell me about Belarus" }' \
  -X POST http://localhost:8090/api/generate
curl -i -H 'Content-Type: application/json' \
  -d '{ "prompt": "Describe primitive types in Java" }' \
  -X POST http://localhost:8090/api/generate
curl -i -H 'Content-Type: application/json' \
  -d '{ "prompt": "Write code of bubble sort using Java" }' \
  -X POST http://localhost:8090/api/generate

Or use the prepared collection of Postman requests from the postman folder: just import it into your Postman
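The same request can be sent from Java with the standard java.net.http client. This is a sketch, not part of the project: the class name and the buildGenerateRequest helper are hypothetical, and the endpoint URL and JSON body shape are taken from the curl examples above. Running main requires the service started via docker-compose-prod.yml.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical client for the /api/generate endpoint shown in the curl examples.
public class GenerateClient {

    static final String ENDPOINT = "http://localhost:8090/api/generate";

    // Builds the same POST request the curl examples send.
    static HttpRequest buildGenerateRequest(String prompt) {
        String body = "{ \"prompt\": \"" + prompt + "\" }";
        return HttpRequest.newBuilder()
                .uri(URI.create(ENDPOINT))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }

    public static void main(String[] args) throws Exception {
        HttpRequest request = buildGenerateRequest("Tell me about Belarus");
        // Requires the Spring AI service from docker-compose-prod.yml to be running:
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```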

Video with description of the project

YouTube link

Appendix: communicating with Ollama running inside a Docker container

According to the linked instruction

Run Ollama inside a Docker container (using NVIDIA GPU) (preferred)

docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama_container ollama/ollama

or

Run Ollama inside a Docker container (using CPU)

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama_container ollama/ollama

Pull an AI model into the Ollama Docker container

docker exec -it ollama_container ollama pull gemma3:4b

List AI models inside the Ollama Docker container

docker exec -it ollama_container ollama list

Stop and remove the Docker container

docker stop ollama_container && docker rm ollama_container
