In conversations with customers and colleagues at my previous job, I first encountered the term "ETL"—which data people semi-affectionately call "Extremely Tough to Load."

What is ETL?

ETL is an acronym for "extract, transform, load": the three steps in copying data from a source location to a target location.

  • Extract: First, the data is extracted from its original location (e.g. a database, a spreadsheet, a SaaS platform such as Salesforce, etc.)
  • Transform: Second, the data is cleaned, organized, and converted into the format of the target database.
  • Load: Third, the transformed data is loaded into the target database, where it can be used for analytics and reporting.
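To make the three steps concrete, here is a minimal sketch in plain Python. The data, table name, and SQLite target are all illustrative assumptions, not tied to any particular ETL tool:

```python
import sqlite3

# Hypothetical source records; a real extract step would read from a
# database, file, or SaaS API instead of an in-memory list.
SOURCE_ROWS = [
    {"customer": "  Acme Corp ", "amount": "1200.50"},
    {"customer": "Globex", "amount": "740.00"},
]

def extract():
    # Extract: pull raw records from the source location.
    return SOURCE_ROWS

def transform(records):
    # Transform: trim stray whitespace and convert amounts to numbers,
    # matching the schema of the target table.
    return [(r["customer"].strip(), float(r["amount"])) for r in records]

def load(records, conn):
    # Load: write the cleaned records into the target database.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT customer, amount FROM sales").fetchall())
# → [('Acme Corp', 1200.5), ('Globex', 740.0)]
```

In production, the extract step would connect to a real source and the load step would target a warehouse; the shape of the pipeline stays the same.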

Related Reading: What is ETL?

Nearly all data analytics projects treat ETL as a business-critical process—and for good reason. By unifying disparate sources of data and making employees more productive, ETL is a core tool of data integration.

To get a feel for ETL, I started with the basics: chatting with the data engineers around me to hear about their challenges, and researching existing ETL tools in the marketplace.

My logic was this: if I could find an easier way for my clients to do ETL, I’d be able to implement my business intelligence software for them faster. Unfortunately, I encountered the same harsh realities firsthand that people had been telling me about. (Some lessons you just have to learn by yourself.)

The reality is that ETL is complicated, with many obstacles that stand between you and a successful ETL implementation.

3 Common ETL Challenges

In my experience, the 3 challenges below are among the most pressing for businesses that use ETL:

  • Lack of expertise. Without a pre-built ETL solution, engineers have to write custom code for everything from establishing data connections and building pipelines to scheduling and maintenance. This requires expertise that many organizations frankly don't have.
  • Steep learning curve. For companies that are able to invest in ETL software tools, there's no guarantee that those tools will be user-friendly, especially for non-technical business users.
  • Uncertainty. The volume and variety of enterprise data will increase over time, often in unpredictable ways. Your choice of ETL solution needs to be resilient, flexible, and scalable.

While I was learning about ETL, I discovered something that piqued my curiosity: simplified ETL, without writing a line of code or deploying anything? It almost sounded too good to be true.

Remaining cautiously optimistic, I resolved to check it out for myself.

ETL Without a CS Degree

What I found changed my mind about ETL forever. The platform had completely simplified the ETL process so that someone without any programming background (ahem, me) could easily create ETL pipelines in a matter of minutes.

Here's how I got started:

Step 1: Log in

The interface is cloud-based. To get started, you simply sign up, begin your free trial, and go.

There's no software to install or deploy because it's a SaaS product, available through a convenient web interface. How great is that?


Step 2: Establish the source and destination connections

The platform has more than 100 pre-built connections to choose from, including Amazon S3, Amazon Redshift, Google BigQuery, SFTP, Google AdWords, Salesforce, PostgreSQL, and MongoDB, to name just a few.


Step 3: Insert your credentials

Add the necessary credentials for your data source and get connected within seconds.

Step 4: Build your ETL pipeline

The platform calls ETL pipelines "packages," and it's incredibly easy to get started building them. The development team clearly invested a lot of time in creating a product that's user-friendly for data engineers and BI users alike.


Step 5: Execute your ETL pipeline

Next, create a cluster (the infrastructure that will execute the job). As the end user, I can decide how many resources I want to allocate just by dragging a slider. This is my favorite feature: it's so easy to scale up or down, depending on the data volume and/or the complexity of the transformation.


Step 6 (Optional): Create a schedule for your ETL process

With the platform's support for incremental data loads, why not automate the ETL process and make life even simpler? (So I did.)
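The idea behind an incremental load can be sketched in a few lines of Python: keep a "watermark" recording the last row already copied, and on each scheduled run extract only what's newer. This is a generic illustration with made-up table and column names, not the platform's actual mechanism:

```python
import sqlite3

# Source table with an auto-incrementing id; the watermark records the
# highest id already copied, so each run extracts only new rows.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
source.executemany("INSERT INTO events (payload) VALUES (?)", [("a",), ("b",), ("c",)])

target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")

watermark = 0  # in practice this would be persisted between scheduled runs

def incremental_load():
    global watermark
    new_rows = source.execute(
        "SELECT id, payload FROM events WHERE id > ?", (watermark,)
    ).fetchall()
    target.executemany("INSERT INTO events VALUES (?, ?)", new_rows)
    if new_rows:
        watermark = new_rows[-1][0]
    return len(new_rows)

print(incremental_load())  # → 3  (first run copies everything)
print(incremental_load())  # → 0  (nothing new, so nothing is re-copied)
```

Because a run that finds no new rows does no work, the pipeline can safely be scheduled to run as often as needed.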


There were a few other steps in between (for example, defining the business logic within each transformation component). It typically takes under an hour to set up a package and execute it. That's certainly more efficient than spending countless hours coding, or reading through pages and pages of documentation on existing ETL platforms.

It was love at first sight, so much so that I joined the team.

If you have a specific use case that you want to consult on, book a time slot for a demo. We also have express demos that will get you up and running in minutes. Check us out anytime.

Originally Published: September 27th, 2015