Data Pipeline

A workflow that moves data from one or more sources to one or more targets, often transforming it along the way. Typical pipeline patterns include extract-transform-load (ETL), extract-load-transform (ELT), and streaming.
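As a rough illustration of the batch ETL pattern, the sketch below extracts rows from a CSV file, transforms them in memory, and loads the result into a SQLite table. The file name, column names, and table name are assumptions made for the example only.

```python
# Minimal ETL sketch (illustrative only). Assumes a hypothetical CSV file
# "orders.csv" with columns order_id, amount, currency.
import csv
import sqlite3

def extract(path):
    """Extract: read raw rows from the source CSV."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: normalize types and drop malformed records."""
    cleaned = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # skip rows without a valid amount
        cleaned.append((row["order_id"], amount, row.get("currency", "USD").upper()))
    return cleaned

def load(records, db_path="warehouse.db"):
    """Load: write transformed records to the target table."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL, currency TEXT)"
    )
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("orders.csv")))
```

In an ELT variant, the raw rows would be loaded into the target first and the transformation performed there; in a streaming pipeline, records would flow through the same stages continuously rather than in a scheduled batch.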

Added Perspectives
Data pipelines are the means by which we move data through today’s complex analytics ecosystems.
- Dave Wells in The Complexities of Modern Data Pipelines, July 4, 2018 (Blog)
Data pipeline automation uses technology to gain efficiency and improve effectiveness in data pipeline development and operations. It goes beyond simply automating the development process to encompass all aspects of pipeline engineering and operations, including design, development, testing, deployment, orchestration, and change management.
(Report)
The purpose of a data pipeline is to move data from an origin to a destination. There are many different kinds of data pipelines: integrating data into a data warehouse, ingesting data into a data lake, flowing real-time data to a machine learning application, and many more. The variation in data pipelines depends on several factors that influence the shape of the solution.
- Dave Wells in Data Pipeline Design Patterns, May 6, 2020 (Blog)
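Relating to the automation and orchestration aspects noted in the perspectives above, the sketch below shows one simple way to declare pipeline steps and their dependencies and run them in dependency order. The step names and print statements are hypothetical placeholders; a production pipeline would typically delegate this to a dedicated orchestrator.

```python
# Hypothetical sketch of pipeline orchestration: steps are declared with
# their upstream dependencies, then executed in dependency order
# (a tiny stand-in for a full orchestration tool).
from graphlib import TopologicalSorter

def run_pipeline(steps, dependencies):
    """Run each step after all of its upstream dependencies have completed."""
    for name in TopologicalSorter(dependencies).static_order():
        print(f"running step: {name}")
        steps[name]()

steps = {
    "extract": lambda: print("  pulled rows from source"),
    "validate": lambda: print("  checked schema and row counts"),
    "transform": lambda: print("  cleaned and enriched rows"),
    "load": lambda: print("  wrote rows to the target"),
}

# Each key lists the steps that must finish before it can run.
dependencies = {
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
}

if __name__ == "__main__":
    run_pipeline(steps, dependencies)
```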