Hi @vslepakov, thanks for bringing this up. This is currently a gap when using auto function invocation filters with agents other than the chat completion agent. I created an issue to track it: #11054.
Hi folks,
I have a plugin that I am using with the new Azure AI Agent in Semantic Kernel (auto function invocation). One of the functions returns a string that should be displayed exactly as is, without any modification by the LLM (prompt engineering is not an option for various reasons).
I saw that something like this should be possible using a filter.
I tried adding such a filter like so:
ai_agent.kernel.add_filter(FilterTypes.AUTO_FUNCTION_INVOCATION, auto_function_invocation_filter)
and the filter gets invoked. I call
context.terminate = True
after this specific function is called, but the string still gets modified by the LLM. Is there a way to address this behavior?
Thanks!
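For reference, the intended `terminate` semantics can be sketched as a small self-contained simulation. The classes and the `get_verbatim_text` function name below are illustrative stand-ins, not the actual semantic_kernel API; the point is only that when a filter sets `context.terminate = True` after the function runs, the raw tool result should be returned verbatim instead of being passed back through the LLM:

```python
from __future__ import annotations
import asyncio
from dataclasses import dataclass


# Stand-ins for semantic_kernel's filter context types (names are hypothetical).
@dataclass
class FunctionResult:
    value: str


@dataclass
class AutoFunctionInvocationContext:
    function_name: str
    function_result: FunctionResult | None = None
    terminate: bool = False


async def auto_function_invocation_filter(context, next):
    # Let the plugin function actually execute first.
    await next(context)
    # If the function whose output must stay verbatim ran, stop the
    # invocation chain so no further LLM pass rewrites the result.
    if context.function_name == "get_verbatim_text":
        context.terminate = True


async def invoke_with_filter(context, call_function):
    async def innermost(ctx):
        ctx.function_result = FunctionResult(value=call_function())

    await auto_function_invocation_filter(context, innermost)
    if context.terminate:
        # terminate set: return the tool output as-is.
        return context.function_result.value
    # terminate not set: the result would normally go back to the LLM.
    return f"LLM paraphrase of: {context.function_result.value}"


ctx = AutoFunctionInvocationContext(function_name="get_verbatim_text")
print(asyncio.run(invoke_with_filter(ctx, lambda: "EXACT TEXT")))
# prints: EXACT TEXT
```

The issue described in this thread is that, for agents other than the chat completion agent, setting `terminate` does not yet short-circuit the chain this way.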