The ELT (Extract, Load, Transform) Resource Launchpad

Dan LeBlanc
September 1, 2022

Cloud data warehousing has enabled innovation and modernization in data pipeline builds. ELT is the new standard for data pipelines, allowing large data loads to be handled quickly and efficiently.

This page is Daasity’s ever-growing resource launchpad for ELT information and content:

  • What is ELT?
  • How to approach ELT/data stack builds
  • Information on Daasity’s data models 

What is ELT?

ELT stands for "Extract, Load, Transform": these are the three major steps required to sync data from a data source (e.g., Shopify Plus or Klaviyo) into a repository, which is commonly a cloud data warehouse (e.g., Snowflake or BigQuery).

  • Extract: Copying data from the data source, often via API
  • Load: Replicating the data and storing it in a database
  • Transform: Reformatting and/or normalizing data for analysis and (commonly) visualization
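The three steps above can be sketched in a few lines of Python. This is a minimal, illustrative pipeline only, not a production build: the extract step is stubbed with sample records (a real one would call the source's API), and SQLite stands in for a cloud warehouse. All table and field names are hypothetical.

```python
import sqlite3

def extract():
    # Extract: copy raw data from the source (stubbed; normally an API call)
    return [
        {"order_id": "1001", "total": "49.99", "currency": "USD"},
        {"order_id": "1002", "total": "19.50", "currency": "USD"},
    ]

def load(conn, rows):
    # Load: replicate the raw records into the warehouse as-is (strings and all)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS raw_orders (order_id TEXT, total TEXT, currency TEXT)"
    )
    conn.executemany(
        "INSERT INTO raw_orders VALUES (:order_id, :total, :currency)", rows
    )

def transform(conn):
    # Transform: reformat/normalize inside the warehouse, ready for analysis
    conn.execute("DROP TABLE IF EXISTS orders")
    conn.execute(
        "CREATE TABLE orders AS "
        "SELECT order_id, CAST(total AS REAL) AS total_amount, currency "
        "FROM raw_orders"
    )

conn = sqlite3.connect(":memory:")
load(conn, extract())
transform(conn)
total = conn.execute("SELECT SUM(total_amount) FROM orders").fetchone()[0]
print(round(total, 2))  # 69.49
```

Note that, unlike classic ETL, the transform happens inside the warehouse after loading, which is what lets ELT handle large loads efficiently.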

Building an ELT Pipeline and Choosing a Data Warehouse

If you’re looking for resources on building an ELT process for your brand, we’ve got you covered.

Building ELT + Analytics using data tools (e.g., Fivetran, dbt)

If you’re looking to Stitch together (bad pun) a data pipeline using the data tools out there, you’ll need to connect the following:

  • An Extract/Load provider (such as Fivetran, Stitch)
  • A Data Warehouse provider (such as Snowflake or BigQuery)
  • A Transformation tool (such as dbt)
  • A visualization tool (such as Looker or Tableau)
  • A data orchestration tool (such as Airflow)
  • A reverse ETL tool (such as Hightouch)

We’ve built a piece on building your own data stack that covers this process, as well as the expected timelines and costs (both initial and ongoing) associated with the project.

Choosing a Cloud Data Warehouse

As an expansion on one element of the DIY analytics stack, we cover choosing a cloud data warehouse in greater detail. In this article, we compare Snowflake, BigQuery, and Redshift. 

Building ELT from Scratch

If you’re interested in or looking to learn more about what it takes to build an ELT pipeline from the ground up, we wrote an article about what would go into this build, as well as the complexities and challenges of the process.

We use Shopify’s API as an example of a data extraction source, and talk about the Transformation side of the process.
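One of the practical complexities of extracting from an API like Shopify's is pagination: large result sets come back one page at a time, with a cursor for the next page carried in a Link response header. As a hedged sketch of what a from-scratch build has to handle, here is a small helper that pulls the `page_info` cursor out of such a header; the example header and URL are hypothetical, and no network calls are made.

```python
import re

def next_page_info(link_header):
    """Extract the page_info cursor for rel="next" from a Link header,
    or return None when there are no further pages."""
    if not link_header:
        return None
    for part in link_header.split(","):
        # Each part looks like: <url>; rel="next" (or rel="previous")
        match = re.search(r'<([^>]+)>;\s*rel="next"', part)
        if match:
            url = match.group(1)
            cursor = re.search(r"[?&]page_info=([^&]+)", url)
            return cursor.group(1) if cursor else None
    return None

# Hypothetical header returned alongside one page of orders:
header = (
    '<https://example.myshopify.com/admin/api/orders.json'
    '?limit=250&page_info=abc123>; rel="next"'
)
print(next_page_info(header))  # abc123
```

An extraction loop would call this after each request and stop when it returns None. Retry logic, rate limiting, and incremental syncs are further pieces a from-scratch build must layer on top.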

Data Transformation Resources

Daasity’s Data Models

Unified Order Schema (UOS)

Daasity's Unified Order Schema is a core data model within the Daasity transformation module that accelerates the development of analytical capability by normalizing all commerce data: eCommerce, Marketplace, Retail, and Wholesale. We break down the "why" of each component of the data model and link to the corresponding KB doc, which covers the "what" with a column-by-column breakdown of the data model.
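To make the idea of a unified order schema concrete, the sketch below maps two channel-specific order payloads onto one shared set of fields. This is purely illustrative: the payload shapes and output field names are hypothetical, not Daasity's actual UOS columns.

```python
# Illustrative only: map channel-specific order records onto one shared
# set of fields, so downstream analytics can query a single table.
def to_unified_order(record, channel):
    if channel == "ecommerce":      # e.g., a Shopify-style payload (hypothetical)
        return {
            "channel": channel,
            "order_id": str(record["id"]),
            "ordered_at": record["created_at"],
            "order_total": float(record["total_price"]),
        }
    if channel == "marketplace":    # e.g., an Amazon-style payload (hypothetical)
        return {
            "channel": channel,
            "order_id": record["AmazonOrderId"],
            "ordered_at": record["PurchaseDate"],
            "order_total": float(record["OrderTotal"]["Amount"]),
        }
    raise ValueError(f"unknown channel: {channel}")

shopify_order = {"id": 1001, "created_at": "2022-09-01", "total_price": "49.99"}
amazon_order = {"AmazonOrderId": "111-222", "PurchaseDate": "2022-09-01",
                "OrderTotal": {"Amount": "19.50"}}

unified = [to_unified_order(shopify_order, "ecommerce"),
           to_unified_order(amazon_order, "marketplace")]
print(sorted(unified[0]) == sorted(unified[1]))  # True: same columns either way
```

Because every channel lands in the same columns, a single query can answer cross-channel questions like total revenue by day, which is the point of normalizing commerce data.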

Unified Notifications Schema (UNS)

Daasity's Unified Notification Schema is another core data model that makes it easy to centralize and analyze data from multiple communication platforms, such as: 

  • Email via an Email Service Provider (ESP)
  • SMS via SMS platforms
  • Push notifications or in-app notifications via relevant providers
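In the same spirit, a unified notifications schema collapses platform-specific events from email, SMS, and push providers into one shape. The sketch below is illustrative only; the per-platform field names and the unified columns are hypothetical, not Daasity's actual UNS.

```python
# Illustrative sketch: collapse platform-specific notification events into
# one shape ("channel", "recipient", "event", "occurred_at").
def to_unified_notification(record, channel):
    # Hypothetical per-platform field names for recipient, event, and time
    mapping = {
        "email": ("email_address", "event_type", "timestamp"),
        "sms":   ("phone_number", "status", "sent_at"),
        "push":  ("device_token", "action", "time"),
    }
    recipient_key, event_key, time_key = mapping[channel]
    return {
        "channel": channel,
        "recipient": record[recipient_key],
        "event": record[event_key],
        "occurred_at": record[time_key],
    }

email_event = {"email_address": "a@example.com", "event_type": "open",
               "timestamp": "2022-09-01T10:00"}
sms_event = {"phone_number": "+15551234567", "status": "delivered",
             "sent_at": "2022-09-01T10:05"}

rows = [to_unified_notification(email_event, "email"),
        to_unified_notification(sms_event, "sms")]
print(sorted(r["channel"] for r in rows))  # ['email', 'sms']
```

Once every platform's events share one schema, questions like opens per recipient across email and SMS become a single query instead of one per provider.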
