inter-service-communication.adoc


Inter-Service Communication

Now that our To Do list is feature-complete, we want to add some new functionality: a new use case that adds a task based on an activity suggestion generated by the locally running Ollama LLM.

Decide whether you implement the logic or the tests first.

Add endpoint and use case for requesting a random activity

Implement the endpoint /task/list/{id}/random-activity in our service as a POST request. When it is called, a new task should be added to our To Do list by calling a new use case. For now, implement the new use case with a hardcoded response.

You can test this now with the Swagger UI. After calling the endpoint for a given task list id you should see a new entry when reloading the To Do list.

See the code snippets below for suggestions on how to implement the new REST API and use case.

Interaction with Ollama

Verify that the Ollama container which you started with docker-compose up is still running. Connect to its terminal and verify with the following commands that everything is running fine:

  • ollama list → shows a list of the available models (e.g. llama3.2:latest)

  • ollama ps → shows the currently running models (e.g. llama3.2:latest)

In case the model is not running anymore, you can start it with:

ollama run llama3.2 --keepalive 30m

Type /bye to exit the interactive mode; the model will remain running.

From outside the container you can send a request to Ollama with curl and let it generate a response for your prompt, e.g.

curl -X POST http://localhost:11434/api/generate -d '{
	"model": "llama3.2",
	"prompt": "Give me a random item (containing maximal 5-6 words) which I can add to my To-Do list.",
	"stream": false,
	"keep_alive": 1200
}'

If everything is running correctly, you will receive an answer from the AI after a short time.
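The same request can also be issued from plain Java, e.g. to experiment before wiring up the generated client in the next step. A minimal sketch using java.net.http (the endpoint path and payload fields mirror the curl example above; actually sending the request requires a running Ollama instance, so only the request construction is shown):

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Builds the same /api/generate request as the curl example above.
// Sending it (via HttpClient.send) requires a running Ollama instance.
public class OllamaRequestExample {

  static HttpRequest buildGenerateRequest(String baseUrl, String model, String prompt) {
    // JSON body matching the Ollama /api/generate API
    String body = """
        {"model": "%s", "prompt": "%s", "stream": false, "keep_alive": 1200}""".formatted(model, prompt);
    return HttpRequest.newBuilder(URI.create(baseUrl + "/api/generate"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build();
  }

  public static void main(String[] args) {
    HttpRequest request = buildGenerateRequest("http://localhost:11434", "llama3.2",
        "Give me a random item (containing maximal 5-6 words) which I can add to my To-Do list.");
    // To actually send it: HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
    System.out.println(request.method() + " " + request.uri());
  }
}
```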

Generate and use REST client

In src/main/openapi/ollama_api.yaml you find the OpenAPI specification (OAS) of the /api/generate endpoint of Ollama. With the Quarkus OpenAPI generator it is possible to generate a REST client and corresponding classes from it during the Maven build process.

Build the backend project with Maven, then inspect and familiarize yourself with the generated classes in target/generated-sources/open-api-yaml/org/openapi/quarkus/ollama_api_yaml.
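The generated client is typically wired up via configuration. As a sketch of what this could look like in application.properties, assuming the Quarkus OpenAPI generator derives the config key ollama_api_yaml from the spec file name (which matches the generated package above; adjust if your setup differs):

```properties
# Assumption: the config key "ollama_api_yaml" is derived from ollama_api.yaml
quarkus.rest-client.ollama_api_yaml.url=http://localhost:11434
```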

Change the use case now to call the Ollama endpoint /api/generate using the generated client to get a random activity instead of returning the hardcoded response and check it again with Swagger UI. Make sure that the AI model is running inside the Ollama container, otherwise the requests will time out. Do you get a new random activity each time you call the new endpoint?

See the code snippets below for suggestions on how to implement the REST request to Ollama.

Add Logging

The communication with the external service in the REST client can be logged automatically for easier insight during development. Just add the following lines to your application.properties:

%test.quarkus.rest-client.logging.scope=request-response
%dev.quarkus.rest-client.logging.scope=request-response
quarkus.log.category."org.jboss.resteasy.reactive.client.logging".level=DEBUG

Add tests

Implement two types of tests for the use case:

  1. Add a test for the use case which mocks the REST client and checks the use case logic. You may want to use @InjectMock.

  2. Add a test which tests the integration with the REST API. For this, stub the Ollama API with WireMock at the network level. The communication should be handled transparently by WireMock, and the test must not talk to the real service. Refer to the relevant Quarkus guide linked below.

    Note: WireMock needs to listen on port 11434, since this is the port used by the Ollama container and targeted by the generated REST client.
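As a starting point for the WireMock stub, a JSON stub mapping like the following could be used (a sketch; the response body mirrors the shape of the Ollama /api/generate response, and the exact setup depends on how you start WireMock in your test):

```json
{
  "request": {
    "method": "POST",
    "url": "/api/generate"
  },
  "response": {
    "status": 200,
    "headers": { "Content-Type": "application/json" },
    "jsonBody": {
      "model": "llama3.2",
      "response": "Water the plants",
      "done": true
    }
  }
}
```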

Code Snippets (hints)

TaskService.java (REST layer)
// snippet
@POST
@Path("/list/{id}/random-activity")
@Operation(summary = "Add random activity", description = "Add a random activity to this task list")
@APIResponse(responseCode = "201", description = "Task item successfully added")
@APIResponse(responseCode = "500", description = "Server unavailable or a server-side error occurred")
public Response addRandomActivity(
    @Parameter(description = "The id of the task list for which to add the task", required = true, example = "1", schema = @Schema(type = SchemaType.INTEGER)) @PathParam("id") Long id) {

  Long taskItemId = this.ucAddRandomActivityTask.addRandom(id);
  return Response.created(URI.create("/task/item/" + taskItemId)).build();
}
UcAddRandomActivityTaskItem.java (Use Case)
/**
 * Use-Case to add one or multiple {@link org.example.app.task.common.TaskItem task items} with a random activity that
 * is generated by a locally running Ollama.
 *
 * @see <a href="https://ollama.com/">Ollama LLM</a>
 */
@ApplicationScoped
@Named
@Transactional
public class UcAddRandomActivityTaskItem {

  // add your injections/dependencies here

  /**
   * @param taskListId the {@link org.example.app.task.dataaccess.TaskListEntity#getId() primary key} of the
   *        {@link org.example.app.task.dataaccess.TaskListEntity} for which to add a random task.
   * @return the {@link TaskItemEntity#getId() primary key} of the newly added {@link TaskItemEntity}.
   */
  public Long addRandom(Long taskListId) {

    // add your implementation here and return the id of the new task item
  }

}
REST request to Ollama
// snippet
public String getRandomActivity() {
    QueryLlmRequest request = new QueryLlmRequest();
    request.setModel("llama3.2");
    request.setStream(false);
    request.setKeepAlive(1200);
    request.setPrompt("Give me exactly one random item (containing maximal 5-6 words) which I can add to my ToDo list and return only this item without any additional text.");

    return defaultApi.queryLlm(request).getResponse();
}

Optional: Add further functionality

When you have finished implementing the functionality above and still have time, you can continue to add further functionality to the To-Do app using Ollama.

Adding multiple items at once

Instead of letting Ollama generate only one random item, we want it to generate multiple items (e.g. 5-10) at once, all related to one topic that is specified by the list title. The response should be returned as structured data using JSON and saved to a newly created task list.

  • implement a new endpoint in our service as a POST request

    • create a new task list with the provided list title

    • generate multiple list items and add them to the new list by calling the use case

  • implement a new method in the use case class to take care of generating multiple items

    • create a new request to Ollama with a corresponding prompt

    • specify the following JSON schema as format in the Ollama request

      {
        "type": "array",
        "items": {
          "type": "object",
          "properties": {
            "title": {
              "type": "string"
            }
          },
          "required": [
            "title"
          ]
        }
      }
    • parse the Ollama JSON response into a list of task items using Jackson ObjectMapper

  • test with the Swagger UI that the new endpoint functions as intended

  • extend the tests which you have created previously

  • See the code snippets below for suggestions on how to implement the feature.
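Putting the steps above together, the Ollama request body could look roughly like this (a sketch; the prompt wording is an assumption, and the JSON schema from the exercise is passed via the request's format field, which Ollama uses to constrain the output to structured JSON):

```json
{
  "model": "llama3.2",
  "prompt": "Give me 5 to 10 To-Do items about the topic 'Gardening'. Respond only with JSON.",
  "stream": false,
  "keep_alive": 1200,
  "format": {
    "type": "array",
    "items": {
      "type": "object",
      "properties": { "title": { "type": "string" } },
      "required": ["title"]
    }
  }
}
```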

Extracting ingredients from a recipe

For this functionality the idea is to provide a recipe to Ollama and let it extract all required ingredients. Again, Ollama should respond with structured data in JSON format and the items should be added to a new task list.

Follow similar steps for implementing this feature as for the previous functionality. Think about how to best provide list title and recipe as input for the new REST endpoint.
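One option, purely as a sketch (the field names are assumptions, not given by the exercise), is to accept both inputs in the request body of the new endpoint:

```json
{
  "title": "Pancakes",
  "recipe": "Mix 200 g flour, 2 eggs and 300 ml milk, then fry in butter."
}
```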

Code Snippets (hints)

Parse JSON schema from file
// snippet
if (schemaPath != null) {
  try (InputStream schemaStream = getClass().getClassLoader().getResourceAsStream(schemaPath)) {
    if (schemaStream == null) {
      throw new FileNotFoundException("Schema file not found: " + schemaPath);
    }
    Map<String, Object> schemaMap = new ObjectMapper().readValue(schemaStream, new TypeReference<>() {});
    // set format for Ollama request
  } catch (IOException e) {
    throw new RuntimeException("Error loading schema from " + schemaPath, e);
  }
}
Parse Ollama JSON response to task items
// snippet
QueryLlm200Response response = defaultApi.queryLlm(request);
try {
  return new ObjectMapper().readValue(response.getResponse(), typeReference);
} catch (JsonProcessingException e) {
  throw new RuntimeException("Error parsing JSON response from Ollama.", e);
}