⚾ Player Service

Player Service is a backend application that serves baseball player data. It also integrates with Ollama, which allows us to run the tinyllama LLM locally.

Dependencies

🛠️ Setup Instructions

  1. Verify system dependencies (a combined check follows this list)

    1. Java 17
      • Verify installation: java -version
    2. Maven
      • Download and install from maven.apache.org
      • Verify installation: mvn --version
      • Confirm Maven is linked to Java 17 (the output should show Java version: 17.x.x)
    3. Container Manager
      • Download and install from docker.com (recommended) or podman (alternative)
      • Verify installation: docker --version
  2. Clone this repository or download the code as a zip

    • Run git clone https://github.com/Intuit-A4A/backend-java-player-service.git
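
The combined dependency check mentioned in step 1; exact version output will vary with your installation, and the last line assumes you chose Docker rather than Podman:

    # Confirm Java 17 is on the PATH
    java -version

    # Confirm Maven is installed and linked to Java 17 (look for "Java version: 17.x.x")
    mvn --version

    # Confirm the container manager is available (use podman --version if you installed Podman)
    docker --version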

Run the application

Part 1: Application Dependencies

  1. Install application dependencies
    • From the project's root directory, run: mvn clean install -DskipTests

Part 2: Run Player Service (without LLM)

  1. Start the Player service

    mvn spring-boot:run
  2. Verify the Player service is running

    1. Open your browser and visit http://localhost:8080/v1/players
    2. If the application is running successfully, you will see player data appear in the browser
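
    If you prefer the terminal, the same check works with curl (the endpoint and port come from the step above):

    # Fetch player data from the running Player Service
    curl http://localhost:8080/v1/players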

Part 3: Start LLM Docker Container

Player Service integrates with Ollama 🦙, which allows us to run LLMs locally. This app runs the tinyllama model.

  1. Pull the Ollama docker image

    docker pull ollama/ollama
  2. Run the Ollama container on port 11434 as a background process

    docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
  3. Download and run the tinyllama model

    docker exec -it ollama ollama run tinyllama
  4. Test Ollama API server

    curl -v --location 'http://localhost:11434/api/generate' --header 'Content-Type: application/json' --data '{"model": "tinyllama","prompt": "why is the sky blue?", "stream": false}'
    

Having trouble with Docker? Try using Podman as an alternative. Instructions here
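
If the curl call fails, it can also help to confirm that the container is up and that the tinyllama model was actually downloaded. The commands below assume the container name ollama from the docker run step above:

    # Check that the Ollama container is running
    docker ps --filter "name=ollama"

    # List the models available inside the container; tinyllama should appear
    docker exec -it ollama ollama list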

Part 4: Verify Player Service and LLM Integration

  1. Ensure the Player Service is running from the previous instructions. If it is not, start it:

    mvn spring-boot:run
  2. Open your browser and visit http://localhost:8080/v1/chat/list-models

    • If the application is running successfully, you will see a JSON response that includes information about the tinyllama model
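
    The same check from the terminal; the endpoint is the one listed above, and the exact shape of the JSON depends on the service's response format:

    # Ask the Player Service which models Ollama has available
    curl http://localhost:8080/v1/chat/list-models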
