Describe the bug
When I use `AsyncPipeline` with the Langfuse tracer, the LLM output (i.e. the `replies`) is not logged to Langfuse.
This works; both the input and the output of the LLM are logged:

```python
from haystack import Pipeline
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack_integrations.components.connectors.langfuse import LangfuseConnector

tradition_pipeline = Pipeline()
tradition_pipeline.add_component("tracer", LangfuseConnector("haystack-pipeline-testing"))
tradition_pipeline.add_component("llm", OpenAIChatGenerator())
```
This won't work; only the input of the LLM is logged:

```python
from haystack import AsyncPipeline

async_pipeline = AsyncPipeline()
async_pipeline.add_component("tracer", LangfuseConnector("haystack-pipeline-testing"))
async_pipeline.add_component("llm", OpenAIChatGenerator())
```
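For completeness, a minimal sketch of how the two pipelines might be run to reproduce the behavior. The prompt content is an assumption, not from the original report, and this presumes the Langfuse credentials and `HAYSTACK_CONTENT_TRACING_ENABLED=true` are set in the environment:

```python
import asyncio

from haystack.dataclasses import ChatMessage

messages = [ChatMessage.from_user("Tell me a joke")]  # hypothetical prompt

# Sync pipeline: both the input messages and the replies appear in the Langfuse trace.
tradition_pipeline.run({"llm": {"messages": messages}})

# Async pipeline: only the input messages appear; the replies are missing.
asyncio.run(async_pipeline.run_async({"llm": {"messages": messages}}))
```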
FAQ Check
@sjrl Thank you for the fix. That's awesome.