Add function_call OpenTelemetry spans for LLM service tool execution#4266
Open
itsderek23 wants to merge 5 commits into
Conversation
Function calls via LLMService._run_function_call() were invisible in traces. This adds a "function_call" span as a child of the turn span with tool.function_name, tool.call_id, tool.arguments, tool.result, and tool.result_status attributes, matching Gemini Live conventions. Note: for async/deferred function calls (cancel_on_interruption=False), the span ends when _run_function_call() returns, not when the final result callback fires. This means span duration may undercount for deferred calls. Accurate async span timing is left for a future change.
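To make the span shape concrete, here is a minimal illustrative sketch of the attributes described above. The `Span` class and `start_function_call_span` helper below are stand-ins invented for illustration, not pipecat or OpenTelemetry APIs:

```python
# Illustrative stub only: shows the "function_call" span as a child of the
# turn span, carrying the five tool.* attributes described in the PR.
import json
import time
from dataclasses import dataclass, field
from typing import Any, Dict, Optional


@dataclass
class Span:
    name: str
    parent: Optional["Span"] = None
    attributes: Dict[str, Any] = field(default_factory=dict)
    start: float = field(default_factory=time.monotonic)
    end: Optional[float] = None

    def set_attribute(self, key: str, value: Any) -> None:
        self.attributes[key] = value

    def finish(self) -> None:
        self.end = time.monotonic()


def start_function_call_span(turn_span: Span, name: str, call_id: str, args: dict) -> Span:
    # Child span of the turn span; arguments serialized as JSON.
    span = Span("function_call", parent=turn_span)
    span.set_attribute("tool.function_name", name)
    span.set_attribute("tool.call_id", call_id)
    span.set_attribute("tool.arguments", json.dumps(args))
    return span


turn = Span("turn")
fc = start_function_call_span(turn, "get_weather", "call_1", {"city": "Paris"})
# Result attributes are set later, when the tool result arrives.
fc.set_attribute("tool.result", json.dumps({"temp_c": 21}))
fc.set_attribute("tool.result_status", "completed")
fc.finish()
```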
Tests for LLMService._run_function_call() tracing don't belong in test_turn_trace_observer.py. Move them to test_function_call_tracing.py to match the existing convention of one test file per tested component.
Move the duplicated _InMemorySpanExporter class from both test files into pipecat.tests.utils, following the existing convention for shared test utilities.
Move span lifecycle management from inline code in llm_service.py to a traced_function_call decorator, matching the pattern of traced_llm, traced_tts, and traced_stt. Only the tool.result capture remains inline in the callback since the decorator cannot access the result.
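The decorator pattern described above can be sketched as follows. This is a hypothetical minimal version, not the real `traced_function_call` or the OpenTelemetry API: it opens a span around the wrapped coroutine, records the `tool.*` attributes available from the call parameters, and ends the span when the coroutine returns, leaving `tool.result` to be set elsewhere:

```python
# Hypothetical sketch of a traced_function_call decorator in the style of
# traced_llm / traced_tts / traced_stt. SPANS stands in for a span exporter.
import asyncio
import functools
import json
import time

SPANS = []


def traced_function_call(func):
    @functools.wraps(func)
    async def wrapper(self, function_name, call_id, arguments, *args, **kwargs):
        span = {
            "name": "function_call",
            "tool.function_name": function_name,
            "tool.call_id": call_id,
            "tool.arguments": json.dumps(arguments),
            "start": time.monotonic(),
        }
        try:
            return await func(self, function_name, call_id, arguments, *args, **kwargs)
        finally:
            # The decorator ends the span on return; tool.result cannot be
            # captured here, which is why it stays inline in the callback.
            span["end"] = time.monotonic()
            SPANS.append(span)

    return wrapper


class FakeLLMService:
    @traced_function_call
    async def _run_function_call(self, function_name, call_id, arguments):
        await asyncio.sleep(0)  # pretend to execute the tool
        return {"ok": True}


result = asyncio.run(
    FakeLLMService()._run_function_call("get_weather", "c1", {"city": "Paris"})
)
```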
#3885 added function call duration metrics to LatencyBreakdown. This PR makes the same data visible via OTEL function_call spans. These spans are children of the turn span, with the following attributes: tool.function_name, tool.call_id, tool.arguments, tool.result, and tool.result_status.
Implementation notes
I've tried to match existing instrumentation patterns, using a new @traced_function_call decorator in service_decorators.py. One deviation: tool.result and tool.result_status are set inline in the result callback within llm_service.py (4 lines) because I don't believe the decorator can access the result.
Deferred work
For async/deferred (pun intended) function calls, the span ends when _run_function_call() returns, not when the final result callback fires, so span duration may undercount for deferred calls. Accurate async span timing is left for a future change.
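The undercounting caveat can be demonstrated with a toy example. This is not pipecat code; it simply shows that when the result callback fires after the function returns, a span closed at return time measures almost no duration:

```python
# Toy illustration of the deferred-timing caveat: the "span" ends when the
# function returns, but the real result arrives ~50 ms later via a callback.
import asyncio
import time

result_time = None


def on_result(t: float) -> None:
    global result_time
    result_time = t


async def run_function_call(callback):
    start = time.monotonic()

    async def deferred():
        await asyncio.sleep(0.05)  # the tool finishes later
        callback(time.monotonic())

    task = asyncio.create_task(deferred())
    span_end = time.monotonic()  # span ends here, at return time
    await task  # keep the toy example simple: wait before exiting
    return start, span_end


start, span_end = asyncio.run(run_function_call(on_result))
# (span_end - start) is near zero; (result_time - start) is ~50 ms.
```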
Verification
This example simulates 200ms function calls. I've verified the values in the spans match.
Full trace output