Commit da87838

Fix typo (#16)

* Update README.md
* Update README.md
* change LangChain4J to LangChain4j
* typo
* Update README.md

Co-authored-by: Bruno Borges <[email protected]>
1 parent 480ccc6 · commit da87838

2 files changed, +3 -3 lines changed

README.md (+1 -1)
@@ -37,7 +37,7 @@ description: A Java sample app that chats with your data using OpenAI.
</div>

-This sample shows how to build an AI chat experience with Retrieval-Augmented Generation (RAG) using LangChain4J and OpenAI language models. The application is hosted on [Azure Static Web Apps](https://learn.microsoft.com/azure/static-web-apps/overview) and [Azure Container Apps](https://learn.microsoft.com/azure/container-apps/overview), with [Qdrant](https://qdrant.tech/) as the vector database. You can use it as a starting point for building more complex AI applications.
+This sample shows how to build an AI chat experience with Retrieval-Augmented Generation (RAG) using LangChain4j and OpenAI language models. The application is hosted on [Azure Static Web Apps](https://learn.microsoft.com/azure/static-web-apps/overview) and [Azure Container Apps](https://learn.microsoft.com/azure/container-apps/overview), with [Qdrant](https://qdrant.tech/) as the vector database. You can use it as a starting point for building more complex AI applications.

> [!IMPORTANT]
> 👉 **Follow the [full-length workshop](https://aka.ms/ws/openai-rag-quarkus)** to learn how we built this sample and how you can run and deploy it.

docs/sections/06-chat-api.md (+2 -2)
@@ -14,7 +14,7 @@ We're going to use [Quarkus' Context And Dependency Injection (CDI) mechanism](h
- The `ai.azure.openai.rag.workshop.backend.configuration.EmbeddingModelProducer` will be responsible for configuring the embedding model.
- The `ai.azure.openai.rag.workshop.backend.configuration.EmbeddingStoreProducer` will be responsible for configuring the Qdrant embedding store.

-As those producers are configured in separate files, and use the LangChain4J API, they can later be switched easily to use other implementations: this will be useful for example to use a more powerful language or embedding model, or for running tests locally.
+As those producers are configured in separate files, and use the LangChain4j API, they can later be switched easily to use other implementations: this will be useful for example to use a more powerful language or embedding model, or for running tests locally.

Let's start by configuring `ChatLanguageModelAzureOpenAiProducer`, using the Azure OpenAI API.
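
As a rough illustration of the producer pattern this diff touches, here is a minimal sketch of what a `ChatLanguageModelAzureOpenAiProducer` could look like as a CDI producer around LangChain4j's `AzureOpenAiChatModel`. The configuration property names and the temperature value are assumptions made for this sketch, not values taken from the workshop code:

```java
package ai.azure.openai.rag.workshop.backend.configuration;

import dev.langchain4j.model.azure.AzureOpenAiChatModel;
import dev.langchain4j.model.chat.ChatLanguageModel;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.inject.Produces;
import org.eclipse.microprofile.config.inject.ConfigProperty;

@ApplicationScoped
public class ChatLanguageModelAzureOpenAiProducer {

  // Hypothetical configuration keys; the workshop may use different names.
  @ConfigProperty(name = "azure.openai.endpoint")
  String endpoint;

  @ConfigProperty(name = "azure.openai.api-key")
  String apiKey;

  @ConfigProperty(name = "azure.openai.deployment-name")
  String deploymentName;

  @Produces
  @ApplicationScoped
  public ChatLanguageModel chatLanguageModel() {
    // Because this is a CDI producer, swapping Azure OpenAI for another
    // LangChain4j ChatLanguageModel implementation only touches this class.
    return AzureOpenAiChatModel.builder()
      .endpoint(endpoint)
      .apiKey(apiKey)
      .deploymentName(deploymentName)
      .temperature(0.3)
      .build();
  }
}
```

The `EmbeddingModelProducer` and `EmbeddingStoreProducer` mentioned above follow the same pattern, each exposing a single `@Produces` method for the LangChain4j type it configures.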

@@ -167,7 +167,7 @@ And let's finish with configuring the `EmbeddingStoreProducer`, using the Qdrant
### Creating the Chat API

-Now that our data has been ingested, and that our services are configured in Quarkus, it's time to interact with our vector database and an LLM using LangChain4J.
+Now that our data has been ingested, and that our services are configured in Quarkus, it's time to interact with our vector database and an LLM using LangChain4j.

![ChatResource and dependencies](./assets/class-diagram-rest.png)
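
To make the retrieve-then-generate flow concrete, here is a hedged sketch of a `ChatResource` that wires the injected LangChain4j beans together. The package name, endpoint shape, prompt, and result count are assumptions, and the `findRelevant`/`generate` calls follow older LangChain4j method names, so this will not match the workshop code exactly:

```java
package ai.azure.openai.rag.workshop.backend.rest; // assumed package

import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.store.embedding.EmbeddingMatch;
import dev.langchain4j.store.embedding.EmbeddingStore;
import jakarta.inject.Inject;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;

import java.util.List;
import java.util.stream.Collectors;

@Path("/chat")
public class ChatResource {

  @Inject
  EmbeddingModel embeddingModel;              // from EmbeddingModelProducer

  @Inject
  EmbeddingStore<TextSegment> embeddingStore; // from EmbeddingStoreProducer

  @Inject
  ChatLanguageModel chatModel;                // from ChatLanguageModelAzureOpenAiProducer

  @POST
  public String chat(String question) {
    // 1. Embed the question with the same model used at ingestion time.
    Embedding questionEmbedding = embeddingModel.embed(question).content();

    // 2. Retrieve the most relevant segments from the Qdrant store.
    List<EmbeddingMatch<TextSegment>> matches =
      embeddingStore.findRelevant(questionEmbedding, 3);

    String context = matches.stream()
      .map(match -> match.embedded().text())
      .collect(Collectors.joining("\n---\n"));

    // 3. Ask the LLM, grounding the answer in the retrieved context.
    String prompt = "Answer the question using only the context below.\n\n"
      + "Context:\n" + context + "\n\nQuestion: " + question;
    return chatModel.generate(prompt);
  }
}
```

Keeping retrieval and generation in one resource like this makes the RAG flow easy to trace: embed the question, look up nearby segments in Qdrant, then ask the model with that context.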
