
LLM output not logged to Langfuse tracer when using AsyncPipeline #9139

Closed · 1 task done
henryclw opened this issue Mar 30, 2025 · 1 comment · Fixed by #9163

Assignees: sjrl
Labels: P1 High priority, add to the next sprint · type:bug Something isn't working

@henryclw

Describe the bug
When I use AsyncPipeline with the Langfuse tracer, the LLM output (i.e. the replies) is not logged to Langfuse.

This works; both the input and the output of the LLM are logged:

    from haystack import Pipeline
    from haystack.components.generators.chat import OpenAIChatGenerator
    from haystack_integrations.components.connectors.langfuse import LangfuseConnector

    tradition_pipeline = Pipeline()
    tradition_pipeline.add_component("tracer", LangfuseConnector("haystack-pipeline-testing"))
    tradition_pipeline.add_component("llm", OpenAIChatGenerator())

This doesn't work; only the input of the LLM is logged:

    from haystack import AsyncPipeline

    async_pipeline = AsyncPipeline()
    async_pipeline.add_component("tracer", LangfuseConnector("haystack-pipeline-testing"))
    async_pipeline.add_component("llm", OpenAIChatGenerator())

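To reproduce, here is a minimal sketch of how I run the async variant (assuming LANGFUSE_SECRET_KEY, LANGFUSE_PUBLIC_KEY, and OPENAI_API_KEY are already exported; note that HAYSTACK_CONTENT_TRACING_ENABLED is read when haystack is imported, so it has to be set first or no content reaches Langfuse at all):

    import os

    # Content tracing is checked at import time, so enable it before
    # importing haystack.
    os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

    import asyncio

    from haystack import AsyncPipeline
    from haystack.components.generators.chat import OpenAIChatGenerator
    from haystack.dataclasses import ChatMessage
    from haystack_integrations.components.connectors.langfuse import LangfuseConnector

    async_pipeline = AsyncPipeline()
    async_pipeline.add_component("tracer", LangfuseConnector("haystack-pipeline-testing"))
    async_pipeline.add_component("llm", OpenAIChatGenerator())

    async def main():
        result = await async_pipeline.run_async(
            data={"llm": {"messages": [ChatMessage.from_user("Hello, world!")]}}
        )
        # The replies come back fine locally; only the Langfuse trace
        # is missing the LLM output.
        print(result["llm"]["replies"])

    asyncio.run(main())

Running the equivalent code with Pipeline and pipeline.run() shows both input and output in the Langfuse trace; with AsyncPipeline only the input appears.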
FAQ Check

@julian-risch julian-risch added the P1 High priority, add to the next sprint label Mar 31, 2025
@sjrl sjrl self-assigned this Apr 1, 2025
@sjrl sjrl closed this as completed in #9163 Apr 3, 2025
henryclw (Author) commented Apr 3, 2025

@sjrl Thank you for the fix. That's awesome.

@sjrl sjrl added the type:bug Something isn't working label Apr 9, 2025