14 changes: 7 additions & 7 deletions masterdoc/03.md
@@ -171,9 +171,9 @@ And, this will take you to a webpage that will confirm **Data Simulation** has s
#### Task 1.2: Explore a few Synapse pipelines that ingest raw data from analytical data sources to the Bronze layer of the Data Lake. <a name="analytical-sources"></a>


-Your next challenge is to ingest historical data from a spectrum of data sources.
+In your next challenge, you will ingest historical data from various data sources.

-In this task you will ingest campaigns data from Snowflake and customer churn data from Teradata into the data lake.
+In this task, you will ingest campaign data from Snowflake and customer churn data from Teradata into the data lake.

1. Return to the Synapse Studio web session (tab).

@@ -185,7 +185,7 @@ In this task you will ingest campaigns data from Snowflake and customer churn da

5. Expand the **Landing Analytical Store Data** folder.

-6. Select the **Campaigns Data From Snowflake** pipeline.
+6. Select the **Campaigns Data from Snowflake** pipeline.

>**Note:** If required, collapse the panes on the left using the << icon at the top right of each pane.

@@ -203,7 +203,7 @@ In this task you will ingest campaigns data from Snowflake and customer churn da

![Source Dataset](https://github.com/CloudLabsAI-Azure/Ignite-lab/blob/main/media/image1209.png?raw=true)

-10. In the pipeline designer, select **Copy data** activity.
+10. In the pipeline designer, select the **Copy data** activity.

11. In the pane below, select the **Sink** tab.

@@ -224,9 +224,9 @@ In this task you will ingest campaigns data from Snowflake and customer churn da
#### Task 1.3: Explore a few Synapse pipelines that ingest raw data from operational data sources to the Bronze layer of the Data Lake. <a name="operational-sources"></a>


-In this task, you will explore the design of a Synapse pipeline that is designed to ingest raw data coming from various operational sources into the data lake.
+In this task, you will explore the design of a Synapse pipeline that ingests raw data from various operational sources into the data lake.

-1. In the **Integrate** pane, expand the **Landing Operational Store Data** folder, select the **Store Transactions Data from SQL DB** pipeline.
+1. In the **Integrate** pane, expand the **Landing Operational Store Data** folder, and select the **Store Transactions Data from SQL DB** pipeline.

![Landing Operational Store Data](https://github.com/CloudLabsAI-Azure/Ignite-lab/blob/main/media/image1309.png?raw=true)

@@ -249,7 +249,7 @@ In this task, you will explore the design of a Synapse pipeline that is designed
![Sales Data](https://github.com/CloudLabsAI-Azure/Ignite-lab/blob/main/media/image1302.png?raw=true)


-Congratulations! As a data engineer you have successfully ingested streaming near real-time as well as historical data into the data lake for Wide World importers.
+Congratulations! As a data engineer, you have successfully ingested near real-time streaming and historical data into the data lake for Wide World Importers.

----

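The **Copy data** activity pattern explored in these tasks can be sketched as a Synapse pipeline JSON fragment. This is a minimal illustration of the general shape of such a pipeline, not the lab's actual definition; the dataset names (`SnowflakeCampaignsDataset`, `BronzeCampaignsParquet`) and the query text are assumptions for the example.

```json
{
  "name": "Campaigns Data from Snowflake",
  "properties": {
    "activities": [
      {
        "name": "Copy campaigns data to Bronze",
        "type": "Copy",
        "typeProperties": {
          "source": {
            "type": "SnowflakeSource",
            "query": "SELECT * FROM CAMPAIGNS"
          },
          "sink": {
            "type": "ParquetSink"
          }
        },
        "inputs": [
          { "referenceName": "SnowflakeCampaignsDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "BronzeCampaignsParquet", "type": "DatasetReference" }
        ]
      }
    ]
  }
}
```

Under this sketch, the source dataset points at the Snowflake linked service and the sink dataset points at a Parquet folder in the Bronze container of the data lake, mirroring the **Source** and **Sink** tabs inspected in the pipeline designer.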