Overview
Snowflake is a data warehouse built for the cloud (read more about Snowflake). Etlworks includes several pre-built Flows optimized for Snowflake.
Flows optimized for Snowflake
Flow type | When to use |
ETL data into Snowflake | When you need to extract data from any source, transform it, and load it into Snowflake. |
Bulk load files into Snowflake | When you need to bulk-load files that already exist in an external Snowflake stage (S3, Azure Blob, Google Cloud Storage) or in server storage, without applying any transformations. The flow automatically generates the COPY INTO command and MERGEs the data into the destination. |
Stream CDC events into Snowflake | When you need to stream updates from a database that supports Change Data Capture (CDC) into Snowflake in real time. |
Stream messages from a queue into Snowflake | When you need to stream messages from a message queue that supports streaming into Snowflake in real time. |
COPY files into Snowflake | When you need to bulk-load data from file-based or cloud storage, an API, or a NoSQL database into Snowflake without applying any transformations. This flow requires a user-defined COPY INTO command (see the example after this table). Unlike Bulk load files into Snowflake, this flow does not support automatic MERGE. |
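For illustration, a user-defined COPY INTO command of the kind the COPY files into Snowflake flow expects might look like the following; this is a minimal sketch, and the stage, table, and file-format settings are hypothetical:

```sql
-- Hypothetical COPY INTO command: the stage (@my_stage), target table
-- (orders), and file layout are illustrative only.
COPY INTO orders
  FROM @my_stage/orders/
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
  ON_ERROR = 'ABORT_STATEMENT';
```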
Videos
ETL, CDC, and bulk load data into Snowflake
Watch how to create flows to ETL, CDC, and bulk load data into Snowflake.
How to ETL data into Snowflake
A typical Snowflake-optimized flow does the following:
1. Extracts data from the source.
2. Creates files in the Snowflake stage.
3. Executes the COPY INTO command to load the staged files.
4. MERGEs the data into the destination table (a sketch of this step follows).
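As a sketch of the final step, the MERGE generated by the flow might resemble the following; the staging table, destination table, key, and columns are all hypothetical:

```sql
-- Hypothetical MERGE from a staging table (loaded via COPY INTO) into
-- the destination table; all object names are illustrative only.
MERGE INTO customers AS dst
USING customers_staging AS src
  ON dst.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET
  name  = src.name,
  email = src.email
WHEN NOT MATCHED THEN INSERT (customer_id, name, email)
  VALUES (src.customer_id, src.name, src.email);
```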
Related resources
Resource | Description |
ELT with Snowflake | Etlworks supports executing complex ELT scripts directly in Snowflake, which greatly improves the performance and reliability of data ingestion (see the first sketch after this table). |
Reverse ETL with Snowflake | You can use Snowflake as a source and load data from Snowflake into any supported destination. |
Data type mapping for Snowflake | It is important to understand how Etlworks maps the various JDBC data types to Snowflake data types. |
Load multiple tables by a wildcard name | You can ETL data from multiple database objects (tables and views) into Snowflake by a wildcard name, without creating individual source-to-destination transformations. |
Set up incremental change replication using a high watermark (HWM) | Using HWM replication, you can load only new and updated records into Snowflake (see the second sketch after this table). |
Automatic creation of the Snowflake stage | Etlworks can automatically create an internal or external Snowflake stage. |
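For instance, an ELT script executed directly in Snowflake can transform already-loaded data with plain SQL; a minimal sketch, in which the tables and columns are hypothetical:

```sql
-- Hypothetical ELT step that runs inside Snowflake after the raw data
-- has been loaded; all object names are illustrative only.
INSERT INTO sales_daily (sale_date, total_amount)
SELECT CAST(sold_at AS DATE), SUM(amount)
FROM raw_sales
GROUP BY CAST(sold_at AS DATE);
```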
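A high-watermark (HWM) extraction typically filters the source on a monotonically increasing column, keeping only rows changed since the previous run; a minimal sketch, assuming a hypothetical updated_at column and a saved watermark value:

```sql
-- Hypothetical HWM query against the source: :last_hwm is a placeholder
-- for the watermark saved after the previous run; the table and column
-- names are illustrative only.
SELECT *
FROM source_table
WHERE updated_at > :last_hwm
ORDER BY updated_at;
```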
Related case studies
Streaming data in real time from 1500+ MySQL databases to Snowflake using CDC
Intertek Alchemy, a global leader in workforce training solutions, faced a monumental challenge: seamlessly streaming real-time Change Data Capture (CDC) events from over 1,500 MySQL databases into Snowflake. Despite exploring multiple data integration platforms, no solution on the market could meet their scalability and real-time performance requirements, until they discovered Etlworks.
Loading data from 600+ SQL Servers behind a firewall to Snowflake
A leading retail chain with over 600 locations faced the challenge of integrating data from individual SQL Server databases hosted behind firewalls, with no direct access from a central location or the cloud. Using Etlworks, they deployed remote data integration agents to process millions of records daily from hundreds of tables and load the data into Snowflake, all managed centrally from the cloud.
Real-time CDC data integration from MongoDB and MySQL to Snowflake
Spireon, a leader in connected vehicle intelligence, leverages Etlworks to enable seamless real-time data integration. By utilizing Etlworks’ Change Data Capture (CDC) capabilities, Spireon efficiently streams data from MongoDB and MySQL into Snowflake, ensuring up-to-the-minute insights for their data-driven operations.