
Commit 1167c5f

docs: apply suggestions from @chalin
Parent: dceebea

File tree

1 file changed: +13 −13 lines
  • content/en/blog/2025/observing-lambdas

content/en/blog/2025/observing-lambdas/index.md

@@ -2,7 +2,7 @@
 title: Observing Lambdas using the OpenTelemetry Collector Extension Layer
 author: '[Dominik Süß](https://github.com/theSuess) (Grafana)'
 linkTitle: Observing Lambdas
-date: 2025-01-24
+date: 2025-02-03
 cSpell:ignore: Dominik
 ---

@@ -48,11 +48,11 @@ emits a logline, or the execution context is about to be shut down.
 ### This is where the magic happens

 Up until now, this just seems like extra work for nothing. You'll still have to
-wait for the Collector to ship the data, right? This is where the special
+wait for the Collector to export the data, right? This is where the special
 `decouple` processor comes in. It separates the receiving and exporting
 components while interfacing with the Lambda lifecycle. This allows for the
 Lambda to return, even if not all data has been sent. At the next invocation (or
-on shutdown) the Collector continues shipping the data while your function does
+on shutdown) the Collector continues exporting the data while your function does
 its thing.

 {{< figure src="diagram-execution-timing.svg" caption="Diagram showcasing how execution timing differs with and without a Collector">}}
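
To make the pipeline wiring concrete, here is a minimal configuration sketch
(not part of this commit) that places the `decouple` processor between the
`telemetryapi` receiver and an OTLP/HTTP exporter; the exporter endpoint is a
placeholder:

```yaml
receivers:
  # Receives the function's logs and traces from the Lambda Telemetry API.
  telemetryapi:

processors:
  # Buffers data so the function can return before everything is exported.
  decouple:

exporters:
  otlphttp/external:
    endpoint: https://example-collector:4318 # placeholder endpoint

service:
  pipelines:
    logs:
      receivers: [telemetryapi]
      processors: [decouple]
      exporters: [otlphttp/external]
```
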
@@ -72,14 +72,10 @@ Lambdas.
 The simplest way to get started is with an embedded configuration. For this, add
 a file called `collector.yaml` to your function. This is a regular Collector
 configuration file. To take advantage of the Lambda specific extensions, they
-need to be configured. As an example, the following configuration receives
-traces and logs from the Telemetry API and sends them to another endpoint:
+need to be configured. As an example, the configuration shown next receives
+traces and logs from the Telemetry API and sends them to another endpoint.

 ```yaml
-# The `decouple` processor is configured by default if omitted.
-# It is explicitly added in this example to illustrate the entire pipeline
-# More information can be found at
-# https://github.com/open-telemetry/opentelemetry-lambda/tree/main/collector#auto-configuration

 receivers:
   telemetryapi:
@@ -101,6 +97,11 @@ service:
       exporters: [otlphttp/external]
 ```

+The `decouple` processor is configured by default if omitted. It is explicitly
+added in this example to illustrate the entire pipeline. For more information,
+see
+[Auto-configuration](https://github.com/open-telemetry/opentelemetry-lambda/tree/main/collector#auto-configuration).
+
 Afterward, set the `OPENTELEMETRY_COLLECTOR_CONFIG_URI` environment variable to
 `/var/task/collector.yaml`. Once the function is redeployed, you’ll see your
 function logs appear! You can see this in action in the video below.
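
As a hypothetical illustration of the deployment step above, the environment
variable could be set alongside the extension layer in an AWS SAM template; the
logical name, runtime, and layer ARN below are placeholders, not values from
this commit:

```yaml
Resources:
  ExampleFunction: # placeholder logical name
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs20.x
      Layers:
        # Placeholder ARN for the OpenTelemetry Collector extension layer.
        - arn:aws:lambda:us-east-1:123456789012:layer:otel-collector:1
      Environment:
        Variables:
          # Points the embedded Collector at the config file shipped with the code.
          OPENTELEMETRY_COLLECTOR_CONFIG_URI: /var/task/collector.yaml
```
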
@@ -111,12 +112,11 @@ function logs appear! You can see this in action in the video below.
 </video>
 </p>

-Every log line your Lambda produces will be shipped to the `external-collector`
+Every log line your Lambda produces will be sent to the `external-collector`
 endpoint specified. You don't need to modify the code at all! From there,
-telemetry data flows to your backend as usual. Since the shipping of telemetry
+telemetry data flows to your backend as usual. Since the transmission of telemetry
 data might be frozen when the lambda is not active, logs can arrive delayed.
 They'll either arrive during the next execution or during the shutdown interval.

-If you want further insight into your applications, be sure to also check out
-the
+If you want further insight into your applications, also see the
 [language specific auto instrumentation layers](https://github.com/open-telemetry/opentelemetry-lambda/?tab=readme-ov-file#extension-layer-language-support).
