Force-pushed 203a29d to b079e98
```
go 1.23.3
toolchain go1.23.4
```
@RonFed fyi this happened switching to the vitess library:
```console
$ make go-mod-tidy
go: finding module for package vitess.io/vitess/go/vt/sqlparser
go: downloading vitess.io/vitess v0.21.1
go: toolchain upgrade needed to resolve vitess.io/vitess/go/vt/sqlparser
go: vitess.io/vitess@v0.21.1 requires go >= 1.23.3; switching to go1.23.4
go: downloading go1.23.4 (darwin/arm64)
go: finding module for package vitess.io/vitess/go/vt/sqlparser
go: found vitess.io/vitess/go/vt/sqlparser in vitess.io/vitess v0.21.1
```
I think we can remove the `toolchain` line and run `go mod tidy` again, and it won't require the toolchain part, but I'm not sure.
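To try this suggestion, the `toolchain` directive can be dropped and `go mod tidy` re-run; `go mod tidy` should only re-add it when the locally installed toolchain is older than the highest `go` requirement among dependencies. A sketch of the relevant `go.mod` lines (comments are mine, not from the repo):

```
go 1.23.3

// This directive can be removed with `go mod edit -toolchain=none`.
// With a locally installed go >= 1.23.4 (vitess's requirement),
// `go mod tidy` should not need to re-add it.
toolchain go1.23.4
```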
RonFed left a comment:
Left some small comments.
I think we should do a load test with DB spans to make sure the CPU overhead of this processor is not too large.
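Before a full load test, a quick microbenchmark of the parsing step alone would bound the per-span cost. A minimal sketch, assuming a hypothetical `parseSQLOperation` stand-in for the processor's actual statement-parsing function (which isn't shown in this thread):

```go
package main

import (
	"fmt"
	"strings"
	"testing"
)

// parseSQLOperation is a hypothetical stand-in for the processor's parsing
// step: it returns the leading SQL verb, upper-cased.
func parseSQLOperation(stmt string) string {
	fields := strings.Fields(stmt)
	if len(fields) == 0 {
		return ""
	}
	return strings.ToUpper(fields[0])
}

func main() {
	// testing.Benchmark lets us measure without a full `go test` run.
	res := testing.Benchmark(func(b *testing.B) {
		for i := 0; i < b.N; i++ {
			parseSQLOperation("SELECT id, name FROM users WHERE id = ?")
		}
	})
	fmt.Println(res) // prints iterations and ns/op; numbers vary by machine
}
```

Multiplying the measured ns/op by the expected DB-span rate gives a rough ceiling on the processor's CPU overhead.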
```go
// If we have a new span name, use that (but only if the current span name is
// our default "DB" auto-instrumentation name).
if spanName != "" && span.Name() == "DB" {
```
The `DB` name is specific to the Go auto-instrumentation, which creates some coupling. However, I don't have a suggestion for a better solution here, since overriding the span name might not be desirable in a lot of cases.
Exactly my thought. I'm assuming the other languages are able to follow the span-name semconv more easily? It would be nice if our default name from auto-instrumentation were something more indicative.
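The semconv naming rule being discussed (`{operation} {target}` if both are known, otherwise just `{operation}`) can be sketched as a small pure function. `semconvSpanName` is a hypothetical name, not the processor's actual API:

```go
package main

import "fmt"

// semconvSpanName follows the database span-name guidance:
// "{operation} {target}" when both are known, otherwise just the operation.
// An empty operation yields "", leaving the existing span name untouched.
func semconvSpanName(operation, target string) string {
	if operation == "" {
		return ""
	}
	if target == "" {
		return operation
	}
	return fmt.Sprintf("%s %s", operation, target)
}

func main() {
	fmt.Println(semconvSpanName("SELECT", "users")) // SELECT users
	fmt.Println(semconvSpanName("SELECT", ""))      // SELECT
}
```

Per the snippet above, the result would then only be applied when the current span name is the default `DB`.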
Force-pushed fc672de to 5241393
Mike, can you check how this affects the CPU on the gateway, if one exists? We can do it together, of course.
Seems like maybe there is a goroutine leak in the vitess library, based on this failure that only showed up once I switched to it. Or I'm not closing something properly with that library. I'll try to resolve that, then do the performance checks that @tamirdavid1 mentioned.
Here to help with whatever is needed.
@tamirdavid1 @RonFed the memory leak looked like it was from glog, so I decided to look at #2038 to see if it could have just been a bug in the otel version pulled in by updating to vitess. But the more I think about it, I don't think we should use vitess. It's been problematic to get working for several reasons:
I reverted the change to vitess, and I'm working on getting some profiling data now. @tamirdavid1, would you be able to get the profiling data to compare this branch now?
This PR is stale because it has been open 60 days with no activity. Remove the stale label or comment, or this will be closed in 30 days.
This PR was closed because it has been stale for 30 days with no activity.
Adds https://github.com/odigos-io/enterprise-go-instrumentation/pull/896 to the `odigossqldboperationprocessor`. This gives us coverage for all languages and shifts the parsing out of the agent.

- Sets `db.collection.name` when available
- Sets the span name to `{operation} {target}` if both are available, or just `{operation}` if not (https://opentelemetry.io/docs/specs/semconv/database/database-spans/#name)
- Only applies when the current span name is `DB` (so we are only modifying span names we created)