telemetry configuration causes crash at startup #12439

Closed
codeboten opened this issue Feb 20, 2025 · 2 comments · Fixed by #12438
Comments

@codeboten
Contributor

As noted by @skandragon in #12332, there is a bug in v0.119.0 and v0.120.0 that causes the following panic at startup when more than one header is configured:

panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x8 pc=0xda8c5a]
goroutine 1 [running]:
go.opentelemetry.io/contrib/config/v0%2e3%2e0.toStringMap(...)
	go.opentelemetry.io/contrib/[email protected]/v0.3.0/config.go:157
go.opentelemetry.io/contrib/config/v0%2e3%2e0.otlpGRPCMetricExporter({0x75e6e88, 0xb3cce60}, 0xc0005aa9a0)
	go.opentelemetry.io/contrib/[email protected]/v0.3.0/metric.go:233 +0x61a
go.opentelemetry.io/contrib/config/v0%2e3%2e0.periodicExporter({0x75e6e88?, 0xb3cce60?}, {0x0?, 0xc0005aa9a0?, {0x0?, 0x0?}}, {0xc001421390, 0x1, 0x1})
	go.opentelemetry.io/contrib/[email protected]/v0.3.0/metric.go:127 +0x1e6
go.opentelemetry.io/contrib/config/v0%2e3%2e0.metricReader({0x75e6e88, 0xb3cce60}, {0xc000e515c0, {0x0, 0x0, 0x0}, 0x0})
	go.opentelemetry.io/contrib/[email protected]/v0.3.0/metric.go:88 +0x23a
go.opentelemetry.io/contrib/config/v0%2e3%2e0.meterProvider({{0x75e6e88, 0xb3cce60}, {0x0, 0x0, 0x0, 0x0, 0xc0014252e0, 0xc0015098c0, 0x0, 0xc001509920, ...}}, ...)
	go.opentelemetry.io/contrib/[email protected]/v0.3.0/metric.go:50 +0x1bf
go.opentelemetry.io/contrib/config/v0%2e3%2e0.NewSDK({0xc00128f028?, 0x0?, 0x0?})
	go.opentelemetry.io/contrib/[email protected]/v0.3.0/config.go:90 +0x1fe
go.opentelemetry.io/collector/service.New({_, _}, {{{0x6c1c2ff, 0x19}, {0x0, 0x0}, {0x0, 0x0}}, 0xc000ed7eb0, 0xc0009df9e0, ...}, ...)
	go.opentelemetry.io/collector/[email protected]/service.go:140 +0x95c
go.opentelemetry.io/collector/otelcol.(*Collector).setupConfigurationComponents(0xc001086c60, {0x75e6e88, 0xb3cce60})
	go.opentelemetry.io/collector/[email protected]/collector.go:186 +0x588
go.opentelemetry.io/collector/otelcol.(*Collector).Run(0xc001086c60, {0x75e6e88, 0xb3cce60})
	go.opentelemetry.io/collector/[email protected]/collector.go:285 +0x55
go.opentelemetry.io/collector/otelcol.NewCommand.func1(0xc00080db08, {0x685a762?, 0x4?, 0x685a6da?})
	go.opentelemetry.io/collector/[email protected]/command.go:36 +0x94
github.com/spf13/cobra.(*Command).execute(0xc00080db08, {0xc000072050, 0x1, 0x1})
	github.com/spf13/[email protected]/command.go:985 +0xaaa
github.com/spf13/cobra.(*Command).ExecuteC(0xc00080db08)
	github.com/spf13/[email protected]/command.go:1117 +0x3ff
github.com/spf13/cobra.(*Command).Execute(0x6dfcdc8?)
	github.com/spf13/[email protected]/command.go:1041 +0x13
main.runInteractive({0x6dfcdc8, {{0x6c1c2ff, 0x19}, {0x0, 0x0}, {0x0, 0x0}}, 0x0, {{{0x0, 0x0, ...}, ...}}, ...})
	go.opentelemetry.io/collector/cmd/builder/main.go:55 +0x45
main.run(...)
	go.opentelemetry.io/collector/cmd/builder/main_others.go:10
main.main()
	go.opentelemetry.io/collector/cmd/builder/main.go:48 +0x365

The case here used "headers" as a map[string]string rather than as a list of {name, value} objects.

Originally posted by @skandragon in #12332
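
For illustration, here is a minimal Go sketch of the failure mode described above: a conversion that copies header names out of a map but leaves each value pointer nil, and a helper (in the spirit of the toStringMap frame at the top of the stack trace) that dereferences the value without a nil check. The types and function names are made up for this example and are not the actual go.opentelemetry.io/contrib/config code.

// Illustrative sketch only; mirrors the shape of the problem, not the library code.
package main

import "fmt"

// headerPair stands in for a v0.3.0-style header entry, where the value is a pointer.
type headerPair struct {
	Name  string
	Value *string
}

// buggyConvert mimics a v0.2.0 -> v0.3.0 migration that copies the header names
// but never populates the value pointers, leaving them nil.
func buggyConvert(headers map[string]string) []headerPair {
	out := make([]headerPair, 0, len(headers))
	for name := range headers {
		out = append(out, headerPair{Name: name}) // Value stays nil
	}
	return out
}

// toStringMap dereferences each Value without a nil check, like the helper
// named in the first frame of the stack trace.
func toStringMap(pairs []headerPair) map[string]string {
	m := make(map[string]string, len(pairs))
	for _, p := range pairs {
		m[p.Name] = *p.Value // panics with a nil pointer dereference when Value is nil
	}
	return m
}

func main() {
	headers := map[string]string{"key1": "value1", "key2": "value2"}
	// Panics: runtime error: invalid memory address or nil pointer dereference
	fmt.Println(toStringMap(buggyConvert(headers)))
}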

@codeboten
Contributor Author

It took me some time to reproduce the issue; the following config reproduces the problem:

service:
  telemetry:
    logs:
      processors:
      - batch:
          exporter:
            otlp:
              endpoint: collector:4318
              headers:
                key1: value1
                key2: value2
                key3: value3
              protocol: http/protobuf

@codeboten
Contributor Author

A workaround is to use the following config:

service:
  telemetry:
    logs:
      processors:
      - batch:
          exporter:
            otlp:
              endpoint: collector:4318
              headers:
                - name: key1
                  value: value1
                - name: key2
                  value: value2
                - name: key3
                  value: value3
              protocol: http/protobuf

github-merge-queue bot pushed a commit that referenced this issue Feb 20, 2025
Converting headers from config schema v0.2.0 to v0.3.0 was causing a nil
dereference by incorrectly setting the name/value pair to a nil pointer.
Added tests both for loading the config in otelcol and in the migration
code unit tests.

Fixes #12439

---------

Signed-off-by: Alex Boten <[email protected]>
Co-authored-by: Bogdan Drutu <[email protected]>
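
For context, a minimal sketch of what the corrected conversion amounts to, reusing the made-up headerPair shape from the earlier snippet: each map entry gets its own allocated value pointer, so nothing is left nil for a toStringMap-style helper to dereference. This is only an illustration of the idea; the actual change lives in #12438.

// Illustrative sketch only, assuming the same hypothetical headerPair type as above.
package main

import "fmt"

type headerPair struct {
	Name  string
	Value *string
}

// convertHeaders migrates map-style (v0.2.0) headers into list-style (v0.3.0)
// pairs, allocating a value pointer for every entry so none is left nil.
func convertHeaders(headers map[string]string) []headerPair {
	out := make([]headerPair, 0, len(headers))
	for name, value := range headers {
		v := value // take a copy so each pair points at its own string
		out = append(out, headerPair{Name: name, Value: &v})
	}
	return out
}

func main() {
	for _, p := range convertHeaders(map[string]string{"key1": "value1"}) {
		fmt.Println(p.Name, *p.Value) // safe: Value is always non-nil
	}
}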