
Commit dff48b9

Document CLI Release v0.9.0 (#56)

* chore: update deps
* feat: add AWS components and warehouse category
* update registry foundations, add links to AWS components, add WIT note
* update manifest documentation, add slug/language fields, rename wit-world-version to wit-version
* registry guide: add instructions to add components to a project
* remove PR template (it's now centralized)
* remove issue templates (now centralized)
* update registry FAQs
* registry: update CLI output, add new section for publish/unpublish

1 parent 3138fff

12 files changed: +1181 −247 lines

.github/ISSUE_TEMPLATE/bug_report.md (−32 lines): this file was deleted.

.github/ISSUE_TEMPLATE/feature_request.md (−20 lines): this file was deleted.

.github/pull_request_template.md (−13 lines): this file was deleted.
@@ -0,0 +1,48 @@
---
title: Amazon Data Firehose
description: Collect and forward analytics events to your streaming pipelines on Amazon Data Firehose.
---

import EdgeeSdk from '/snippets/edgee-sdk.mdx';

<EdgeeSdk />

Find it on GitHub: [<Icon icon="github" iconType="solid" /> /edgee-cloud/amazon-data-firehose-component](https://github.com/edgee-cloud/amazon-data-firehose-component)

Amazon Data Firehose allows you to reliably ingest, transform, and stream data into data lakes,
warehouses, and analytics services on Amazon Web Services.

## Event Mapping

Here's how Edgee events map to Firehose records:

| Edgee event | Firehose record   | Description    |
|-------------|-------------------|----------------|
| Page        | `full-event.json` | Full JSON dump |
| Track       | `full-event.json` | Full JSON dump |
| User        | `full-event.json` | Full JSON dump |

## Getting Started

**To integrate Amazon Data Firehose into your Edgee project:**

1. Open the Edgee console and navigate to your project's Data Collection service.
2. Select "Add Component" and choose "Amazon Data Firehose" from the list of available components.
3. Enter your AWS credentials, region, and stream name, then click Save.
4. Once the component has been configured, you are ready to send analytics events to Firehose.

## Component Name

When configuring the component in your **Edgee Data Layer** or within SDK calls, use `edgee/amazon-data-firehose` as the component name:

```json
{
  "components": {
    "edgee/amazon-data-firehose": true
  }
}
```

For more details on Amazon Data Firehose implementation, refer to the
[official Firehose PutRecord documentation](https://docs.aws.amazon.com/firehose/latest/APIReference/API_PutRecord.html).
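Since each event is delivered to Firehose as a full JSON dump, a downstream consumer usually needs to split a delivered batch back into individual records. As a minimal sketch (the `split_batch` helper and the newline delimiter are illustrative assumptions, not part of the component), newline-delimited JSON can be parsed like this:

```python
import json

def split_batch(payload: bytes) -> list[dict]:
    """Parse a newline-delimited batch of full-event JSON records."""
    return [json.loads(line) for line in payload.splitlines() if line.strip()]

# Example: two full-event dumps concatenated with newline delimiters.
batch = b'{"event": "page", "path": "/home"}\n{"event": "track", "name": "signup"}\n'
events = split_batch(batch)
```

Firehose concatenates records inside a delivery batch, so adding an explicit delimiter (or enabling newline appending on the delivery stream) makes the output easy to parse downstream.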
@@ -0,0 +1,49 @@
---
title: Amazon Kinesis
description: Collect and forward analytics events to your streaming pipelines on Amazon Kinesis.
---

import EdgeeSdk from '/snippets/edgee-sdk.mdx';

<EdgeeSdk />

Find it on GitHub: [<Icon icon="github" iconType="solid" /> /edgee-cloud/amazon-kinesis-component](https://github.com/edgee-cloud/amazon-kinesis-component)

Amazon Kinesis Data Streams allows you to collect and process large streams of data records in real time.
Kinesis is designed for rapid and continuous data intake and aggregation, and lets you ingest data into
other services such as Amazon Redshift or implement real-time processing with AWS Lambda.

## Event Mapping

Here's how Edgee events map to Kinesis records:

| Edgee event | Kinesis record    | Description    |
|-------------|-------------------|----------------|
| Page        | `full-event.json` | Full JSON dump |
| Track       | `full-event.json` | Full JSON dump |
| User        | `full-event.json` | Full JSON dump |

## Getting Started

**To integrate Amazon Kinesis into your Edgee project:**

1. Open the Edgee console and navigate to your project's Data Collection service.
2. Select "Add Component" and choose "Amazon Kinesis" from the list of available components.
3. Enter your AWS credentials, region, stream name or ARN, and an optional partition key, then click Save.
4. Once the component has been configured, you are ready to send analytics events to Kinesis.

## Component Name

When configuring the component in your **Edgee Data Layer** or within SDK calls, use `edgee/amazon-kinesis` as the component name:

```json
{
  "components": {
    "edgee/amazon-kinesis": true
  }
}
```

For more details on Amazon Kinesis implementation, refer to the
[official Kinesis PutRecord documentation](https://docs.aws.amazon.com/kinesis/latest/APIReference/API_PutRecord.html).
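The optional partition key controls which shard a record lands on: Kinesis takes the MD5 hash of the partition key and routes the record to the shard whose hash key range contains it, so records sharing a key keep their relative order. The `shard_index` helper below is an illustrative sketch of that routing (it assumes hash key ranges split evenly across shards), not part of the component:

```python
import hashlib

def shard_index(partition_key: str, shard_count: int) -> int:
    """Mimic Kinesis routing: MD5 of the partition key maps into the
    0..2**128-1 hash key space, split evenly across shards."""
    hash_key = int.from_bytes(hashlib.md5(partition_key.encode()).digest(), "big")
    return hash_key * shard_count // (1 << 128)

# The same key always resolves to the same shard.
idx = shard_index("user-42", 4)
```

Picking a high-cardinality partition key (for example, a user or session identifier) spreads events across shards while preserving per-key ordering.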
@@ -0,0 +1,49 @@
---
title: Amazon S3
description: Collect and forward analytics events to your data lake on Amazon S3.
---

import EdgeeSdk from '/snippets/edgee-sdk.mdx';

<EdgeeSdk />

Find it on GitHub: [<Icon icon="github" iconType="solid" /> /edgee-cloud/amazon-s3-component](https://github.com/edgee-cloud/amazon-s3-component)

Amazon Simple Storage Service (Amazon S3) is an object storage service offered by Amazon Web Services.
It supports storing any amount of data for virtually any use case, such as data lakes, cloud-native applications,
and mobile apps.

## Event Mapping

Here's how Edgee events map to S3 objects:

| Edgee event | S3 object                            | Description    |
|-------------|--------------------------------------|----------------|
| Page        | `{bucket}/{prefix}{random-key}.json` | Full JSON dump |
| Track       | `{bucket}/{prefix}{random-key}.json` | Full JSON dump |
| User        | `{bucket}/{prefix}{random-key}.json` | Full JSON dump |

## Getting Started

**To integrate Amazon S3 into your Edgee project:**

1. Open the Edgee console and navigate to your project's Data Collection service.
2. Select "Add Component" and choose "Amazon S3" from the list of available components.
3. Enter your AWS credentials, region, bucket name, and an optional prefix, then click Save.
4. Once the component has been configured, you are ready to send analytics events to S3.

## Component Name

When configuring the component in your **Edgee Data Layer** or within SDK calls, use `edgee/amazon-s3` as the component name:

```json
{
  "components": {
    "edgee/amazon-s3": true
  }
}
```

For more details on Amazon S3 implementation, refer to the
[official S3 PutObject documentation](https://docs.aws.amazon.com/AmazonS3/latest/API/API_PutObject.html).
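Per the event mapping table, each event becomes a uniquely named JSON object under the configured prefix. As a rough sketch of the `{prefix}{random-key}.json` naming pattern (using a UUID for the random key is an assumption for illustration; the component may generate keys differently, and the prefix value here is hypothetical):

```python
import uuid

def build_object_key(prefix: str) -> str:
    """Build an S3 object key following the {prefix}{random-key}.json
    pattern from the event mapping table."""
    return f"{prefix}{uuid.uuid4().hex}.json"

# Hypothetical prefix; every event gets its own object under it.
key = build_object_key("edgee-events/")
```

A trailing slash in the prefix groups all event objects under one "folder", which keeps listing and lifecycle rules on the bucket simple.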

docs.json (+9 lines)

@@ -145,6 +145,15 @@
         "components/data-collection/snapchat-capi",
         "components/data-collection/pinterest-capi"
       ]
+    },
+    {
+      "group": "Warehouse",
+      "icon": "database",
+      "pages": [
+        "components/data-collection/amazon-s3",
+        "components/data-collection/amazon-kinesis",
+        "components/data-collection/amazon-data-firehose"
+      ]
     }
   ]
 },
