LLM output not logged to Langfuse tracer when using AsyncPipeline #9139

Closed
@henryclw

Description

Describe the bug
When I use AsyncPipeline with the Langfuse tracer, the LLM output (i.e. the replies) is not logged to Langfuse.

This works; both the input and the output of the LLM are logged:

    from haystack import Pipeline
    from haystack.components.generators.chat import OpenAIChatGenerator
    from haystack_integrations.components.connectors.langfuse import LangfuseConnector
    tradition_pipeline = Pipeline()
    tradition_pipeline.add_component("tracer", LangfuseConnector("haystack-pipeline-testing"))
    tradition_pipeline.add_component("llm", OpenAIChatGenerator())

This won't work; only the input of the LLM is logged:

    from haystack import AsyncPipeline
    async_pipeline = AsyncPipeline()
    async_pipeline.add_component("tracer", LangfuseConnector("haystack-pipeline-testing"))
    async_pipeline.add_component("llm", OpenAIChatGenerator())


Labels

P1 (High priority, add to the next sprint), type:bug (Something isn't working)
