Getting Started
Prerequisites
For this workshop, you will need access to the following:
- AWS account
  - Create / modify S3 buckets
  - Create / modify IAM privileges
- Snowflake account
  - Create / modify schemas, tables, stages, and pipes
Overview
In this workshop, we will take signals coming from multiple edge nodes and load them into Snowflake. We will use dynamic S3 location routing and ingest the data into Snowflake using an external table and Snowpipe. You will need access to your own Snowflake account to complete this workshop.
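To make "dynamic S3 location routing" concrete, here is a minimal Python sketch of the idea: each incoming signal is routed to an S3 object key derived from its own metadata. The field names (`node_id`, `ts`) and the key layout are assumptions for illustration, not the workshop's actual configuration.

```python
from datetime import datetime, timezone

def s3_object_key(event: dict, prefix: str = "edge-signals") -> str:
    """Build a partitioned S3 key from an event's metadata.

    Routing each event under its node ID and arrival date means
    downstream consumers (such as a Snowflake external table) can
    prune files by path instead of scanning the whole bucket.
    """
    ts = datetime.fromtimestamp(event["ts"], tz=timezone.utc)
    return f"{prefix}/node={event['node_id']}/date={ts:%Y-%m-%d}/events.ndjson"

# Example: an event from node "edge-01" lands under its own node/date prefix
key = s3_object_key({"node_id": "edge-01", "ts": 1678665600})
```

In the workshop itself, the Pipeline's S3 Destination performs this routing for you via its dynamic-path configuration; the function above only illustrates the mapping.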
To accomplish this, we will:
- Create a new Pipeline
- Configure an endpoint to receive the data (i.e., a Source)
- Send data to an S3 Destination, utilizing dynamic pathing
- Create an external table in Snowflake that reads from S3 directly
- Create a Snowpipe in Snowflake that automatically loads data into an internal table
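The Snowflake side of the last two steps can be sketched in SQL. All object names, the bucket path, and the credentials below are placeholders; the workshop's later sections define the real ones (and a storage integration is the recommended alternative to inline credentials).

```sql
-- Stage pointing at the S3 prefix the Pipeline writes to (placeholder values)
CREATE STAGE edge_signals_stage
  URL = 's3://my-edge-bucket/edge-signals/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...');

-- External table that queries the JSON files in place on S3
CREATE EXTERNAL TABLE edge_signals_ext (
  raw VARIANT AS (VALUE)
)
LOCATION = @edge_signals_stage
FILE_FORMAT = (TYPE = JSON)
AUTO_REFRESH = TRUE;

-- Internal table plus a Snowpipe that continuously copies new files into it
CREATE TABLE edge_signals (raw VARIANT);

CREATE PIPE edge_signals_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO edge_signals
  FROM @edge_signals_stage
  FILE_FORMAT = (TYPE = JSON);
```

The external table gives you immediate, query-in-place access to the raw files, while the pipe materializes the same data into an internal table for faster repeated queries.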
Last modified March 13, 2023: Merge pull request #58 from logdna/dependabot/npm_and_yarn/autoprefixer-10.4.14 (11f56fc)