**Describe the bug**
When I use `AsyncPipeline` with the Langfuse tracer, the LLM output (i.e. the replies) is not logged to Langfuse.
This works; both the input and the output of the LLM are logged:

```python
from haystack import Pipeline
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack_integrations.components.connectors.langfuse import LangfuseConnector

tradition_pipeline = Pipeline()
tradition_pipeline.add_component("tracer", LangfuseConnector("haystack-pipeline-testing"))
tradition_pipeline.add_component("llm", OpenAIChatGenerator())
```
This won't work; only the input of the LLM is logged:

```python
from haystack import AsyncPipeline

async_pipeline = AsyncPipeline()
async_pipeline.add_component("tracer", LangfuseConnector("haystack-pipeline-testing"))
async_pipeline.add_component("llm", OpenAIChatGenerator())
```
**FAQ Check**
- Have you had a look at our new FAQ page?