Create the Pipeline
Make sure you have a Mezmo account.
Step 1: Create a new Pipeline
Once you have Pipeline enabled, go ahead and create a new Pipeline. You will be prompted to name your Pipeline; call it what you will, but we will go with Snowflake-Integration. After creation, you will be taken to the following blank canvas.
Step 2: Add the Source
This part is easy. Go to the pipeline you created previously and click Add Source. From there, just select HTTP, give it a Title like Edge Devices, set the Decoding Method to JSON, and click Save.

Now we need to create an access key corresponding to the new HTTP source. Click on the HTTP source to bring up the following panel. Go ahead and click the Create new key button in the Access Key Management section. Here you can give the new access key a name of Edge Device Key and click the Create button. A new key will be generated for you to use and is displayed on the source details, along with the HTTP address to send the data to. Be sure to copy this Access Key somewhere safe for later reference, as you will not be able to view it again once the source node is updated. Click Update to save your changes.

You now have an endpoint defined that can receive any data. If you run into trouble here, please check out our comprehensive Mezmo Platform workshop to learn how to use the system in depth.
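If you want to sanity-check the new endpoint, you can post a JSON event to it from a short script. The sketch below uses Python's standard library; the ingestion URL and access key are placeholders (use the HTTP address and key shown on the source details panel), and the header used to pass the access key is an assumption, so confirm it against what the panel shows.

```python
import json
import urllib.request

# Placeholder values: substitute the HTTP address and the Access Key shown on
# the source details panel (these are not real values).
INGEST_URL = "https://<your-http-source-address>"
ACCESS_KEY = "<your Edge Device Key>"

# A sample JSON event; the "event" field is what the S3 destination will route on later.
event = {"event": "transaction", "device_id": "sim-001", "value": 42}

request = urllib.request.Request(
    INGEST_URL,
    data=json.dumps(event).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Header name is an assumption; check the source details panel for how
        # the access key should be supplied.
        "Authorization": ACCESS_KEY,
    },
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(response.status)
```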
Step 3: Add the S3 Destination
Next we will add S3 as a destination. You will need some information from your AWS account here. Specifically, you will need the following items, which you created during step 2 of the previous section:

- AWS Access Key ID
- AWS Secret Access Key
- AWS Bucket name
- AWS Region

With your access information in hand, configure the destination:

- Select AWS S3 as the destination type
- Give it a Title of Snowflake Bucket
- Enter your Access Key ID and Secret Access Key
- Enter the Bucket name (we will go with mezmo-use1-snowflake-demo)
- For the prefix, use device-sim/event_date=%F/event_name={{ .event }}/. This prefix allows for dynamic location routing and will store data under a path built from the date and the event name coming from the event field, for example device-sim/event_date=2022-11-09/event_name=transaction/
- Choose text for the Encoding, with a compression of gzip
- Choose the Region where you created your bucket (in this example we use us-east-1)
- Click the Save button

Your destination should look similar to the image below.

Note: Make sure messages going to this S3 destination contain the dynamic fields used in the path. Any events that do not have those fields will not go to S3.
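To make the dynamic routing concrete, here is a small illustration (a sketch, not Mezmo's actual template engine) of how an event carrying an event field maps to an object path under that prefix, using the %F date token and the {{ .event }} field reference from the example above.

```python
from datetime import datetime, timezone

def resolve_prefix(event: dict) -> str:
    """Illustrative only: shows the object path an event would be routed under."""
    # "%Y-%m-%d" matches the %F date token in the prefix (YYYY-MM-DD).
    event_date = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    # {{ .event }} pulls the event field from the message; events without this
    # field will not be written to S3 (see the note above).
    event_name = event["event"]
    return f"device-sim/event_date={event_date}/event_name={event_name}/"

print(resolve_prefix({"event": "transaction", "device_id": "sim-001"}))
# e.g. device-sim/event_date=2022-11-09/event_name=transaction/
```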
Step 4: Connect and Deploy the Pipeline
Now let's connect the Edge Devices Source to the Snowflake Bucket Destination. For this example, we are not going to use any processors. Once the nodes are connected, simply Deploy your Pipeline.
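Once deployed, events posted to the HTTP source should begin landing in the bucket under the dynamic prefix. One way to spot-check this is to list objects under the device-sim/ prefix; the sketch below assumes the boto3 library is installed, AWS credentials are available in your environment, and uses the example bucket name and region from Step 3.

```python
import boto3

# Assumes AWS credentials are available via the environment or ~/.aws/credentials.
# Bucket and region are the example values used in Step 3.
s3 = boto3.client("s3", region_name="us-east-1")

response = s3.list_objects_v2(
    Bucket="mezmo-use1-snowflake-demo",
    Prefix="device-sim/",
)

for obj in response.get("Contents", []):
    # Object keys should follow device-sim/event_date=.../event_name=.../
    print(obj["Key"], obj["Size"])
```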