Hi,
I have the following issue:
Expected behavior:
When I consume data from a Kafka topic in batch mode, I expect a new consumer "receive" span to be attached to the existing producer span for each poll.
Observed behavior:
The trace context is lost, and a new "send" span is created for each poll instead.
Workaround:
Adding the TracingConsumerInterceptor explicitly under `interceptor.classes` for the batch consumer. See the following example application.yml:
```yaml
spring:
  application:
    name: 'my-app'
  cloud:
    function:
      definition: processBatchData;processData
    stream:
      function:
        batch-mode: true
      bindings:
        processBatchData-in-0:
          destination: my.batch.data
          group: ${spring.application.name}-consumer
          consumer:
            batch-mode: true
        processData-in-0:
          destination: my.data
          group: ${spring.application.name}-consumer
      kafka:
        bindings:
          processBatchData-in-0:
            consumer:
              configuration:
                key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
                value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
                max.poll.records: 50
                fetch.max.wait.ms: 1000
                specific.avro.reader: true
                interceptor.classes: io.opentracing.contrib.kafka.TracingConsumerInterceptor
          processData-in-0:
            consumer:
              configuration:
                key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
                value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
                specific.avro.reader: true
```
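For reference, the `processBatchData` and `processData` function definitions above would typically be backed by beans along these lines (a minimal sketch; the payload type `MyAvroRecord` and the logging are assumptions, not part of my actual application):

```java
import java.util.List;
import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class KafkaConsumers {

    // Batch consumer: because batch-mode is enabled for processBatchData-in-0,
    // the binder delivers the whole poll as a List in one invocation.
    @Bean
    public Consumer<List<MyAvroRecord>> processBatchData() {
        return records -> records.forEach(record ->
                System.out.println("batch record: " + record));
    }

    // Single-record consumer for processData-in-0.
    @Bean
    public Consumer<MyAvroRecord> processData() {
        return record -> System.out.println("record: " + record);
    }
}
```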
After adding the property, the `processBatchData-in-0` span shows up as a receive span in Jaeger as part of the expected trace, but the extra send span is still created.
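For comparison, the `interceptor.classes` entry from the workaround corresponds to the standard Kafka client property. A minimal sketch of the same setting on a raw `KafkaConsumer` (broker address, group id, and the use of `StringDeserializer` instead of the Avro deserializer are assumptions for illustration):

```java
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import io.opentracing.contrib.kafka.TracingConsumerInterceptor;

public class InterceptorConfigExample {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-app-consumer");         // placeholder
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // This is what the interceptor.classes entry in the YAML above resolves to:
        props.put(ConsumerConfig.INTERCEPTOR_CLASSES_CONFIG, TracingConsumerInterceptor.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // consumer.subscribe(...) and the poll loop are omitted here.
        }
    }
}
```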
Used Dependencies:
- org.springframework.boot:2.5.4
- org.springframework.kafka:spring-kafka:2.7.6
- org.springframework.cloud:spring-cloud-stream:2020.0.3
- org.springframework.cloud:spring-cloud-stream-binder-kafka:2020.0.3
- io.opentracing.contrib:opentracing-spring-jaeger-cloud-starter:3.3.1