Hi,

I am using ADK in a very simple scenario with one LlmAgent. I keep `include_contents` set to its default, since I want to keep sending the LLM the memory of old messages in the session. I'm using anthropic.claude-haiku-4-5, but this issue happens with all models. After several messages sent in the same session I get this exception:

**Context Window Error - {"message":"The model returned the following errors: Input is too long for requested model."}**

The issue is that the contents we send in the LlmRequest keep growing until the request we send to the model is too long. I searched for a possible solution to limit this and could not find one. Eventually I added a `before_model_callback` in which I remove old messages from `llm_request.contents`. But this seems to me to be such a basic issue in the current implementation that either there is a built-in solution, or I'm doing something wrong.

Any help here is appreciated.

Replies: 1 comment 1 reply
Response from ADK Answering Agent (experimental, answer may be inaccurate)

TLDR: Your approach of using a `before_model_callback` to trim the context is the recommended pattern.

Hi there! You've run into a common challenge with conversational AI, and your solution is exactly the pattern that ADK is designed to support. ADK provides developers with fine-grained control over the LLM lifecycle through callbacks. The issue you're facing, where the context grows too large, is a perfect use case for the `before_model_callback`. Your implementation of trimming `llm_request.contents` there is a valid way to keep each request within the model's context window.

So, to confirm, you are not doing anything wrong; you've correctly used ADK's callback mechanism to handle context management. For more details on how to leverage callbacks, you can refer to the design patterns and best practices documentation. [1]

[1] https://google.github.io/adk-docs/callbacks/design-patterns-and-best-practices/
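For reference, here is a minimal sketch of such a callback. The callback signature and the `before_model_callback` parameter are from the ADK callback docs; the function name `trim_history`, the keep-count `MAX_CONTENTS`, and the agent wiring (model string, instruction) are illustrative assumptions, not part of the library:

```python
from typing import Optional

from google.adk.agents import LlmAgent
from google.adk.agents.callback_context import CallbackContext
from google.adk.models.llm_request import LlmRequest
from google.adk.models.llm_response import LlmResponse

# Illustrative constant: how many of the most recent contents to keep.
MAX_CONTENTS = 20

def trim_history(
    callback_context: CallbackContext, llm_request: LlmRequest
) -> Optional[LlmResponse]:
    """Drop the oldest history entries before the request goes to the model."""
    if len(llm_request.contents) > MAX_CONTENTS:
        # Keep only the most recent entries. Caveat: if the history contains
        # function-call/function-response pairs, a naive slice can split a
        # pair, so a production version should trim on turn boundaries.
        llm_request.contents = llm_request.contents[-MAX_CONTENTS:]
    # Returning None tells ADK to proceed with the (now modified) request.
    return None

agent = LlmAgent(
    name="assistant",
    model="gemini-2.0-flash",  # placeholder; use your Claude model here
    instruction="You are a helpful assistant.",
    before_model_callback=trim_history,
)
```

A more careful version could count tokens instead of entries, or pin the first content if it carries setup context, but the overall shape (mutate `llm_request.contents`, return `None`) is the pattern the callback documentation describes.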