Orchestration (Data Pipelines)

The process of designing, executing, and managing data workflows (pipelines) that have multiple dependencies and may span various tools, applications, and environments.
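
To make the definition concrete, here is a minimal sketch of a three-step pipeline with explicit dependencies, written for Apache Airflow, one widely used orchestration tool. The DAG id, task names, and the extract/transform/load functions are illustrative placeholders, not part of any particular product.

```python
# A minimal dependent-task pipeline in Apache Airflow (2.x style).
# The orchestrator, not the tasks, owns scheduling and ordering.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw records from a source system")  # placeholder step

def transform():
    print("clean and reshape the extracted records")  # placeholder step

def load():
    print("write the results to a target store")  # placeholder step

with DAG(
    dag_id="example_orchestrated_pipeline",  # hypothetical name
    start_date=datetime(2019, 7, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies: transform waits for extract; load waits for transform.
    t_extract >> t_transform >> t_load
```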

Added Perspectives
Just as an artist uses a paint palette to access and blend colors, an organization uses an orchestration tool to access and blend multiple tools, systems, and applications into a coherent workflow. These products facilitate a best-of-breed approach, since they link together whatever tools an organization already has in place.
(Blog)
Model execution in the operational environment is supported by model orchestration technology that automates configuration, coordination, and management of the computing environment in which models are executed.
- Dave Wells in "DataOps: More Than DevOps for Data Pipelines," July 4, 2019
(Blog)
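
As a rough illustration of that idea, the sketch below separates a model's execution environment (its configuration) from the model run itself, so an orchestrator can configure, coordinate, and manage runs without touching model code. All names here (ModelRunConfig, run_model, the file path and URL) are hypothetical, not drawn from any particular product.

```python
# Sketch: the orchestrator owns the configuration of the environment a
# model runs in, so the model code itself does not.
import os
from dataclasses import dataclass, field

@dataclass
class ModelRunConfig:
    """Declarative description of the execution environment for one model run."""
    model_path: str
    env_vars: dict = field(default_factory=dict)  # e.g. credentials, endpoints
    num_workers: int = 1

def run_model(config: ModelRunConfig) -> None:
    # 1. Configure: set up the environment the model expects.
    os.environ.update(config.env_vars)
    # 2. Coordinate: decide where and with how much parallelism to run.
    print(f"running {config.model_path} with {config.num_workers} worker(s)")
    # 3. Manage: a real system would monitor, retry, and clean up here.

run_model(ModelRunConfig(
    model_path="models/churn.pkl",                       # hypothetical model
    env_vars={"FEATURE_STORE_URL": "http://example"},    # hypothetical endpoint
    num_workers=2,
))
```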
Data orchestration coordinates participation by all stakeholders throughout the end-to-end data workflow. From ingestion and curation to preparation and consumption, workflows are orchestrated through execution of services that encapsulate reusable functions for data capture, storage, harmonization, governance, access, and application. Built on a foundation of AI/ML and automation, orchestration technology eases the pain of configuration, operationalization, and execution of data pipelines and analytics value chains. From a single control platform, orchestration provides the ability to activate any service, using any technology, across any network. Automated coordination of multi-service workflows across all execution environments—on-premises, cloud, multi-cloud, and hybrid—minimizes the manual effort of data operations and supports highly adaptable data management systems that respond readily to change.
(Report)
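
A toy sketch of that pattern: each workflow stage is a small reusable service, and a generic runner (the "single control platform" in miniature) executes the stages in dependency order. The stage names mirror the quote above; the code is illustrative standard-library Python, not any vendor's API.

```python
# Reusable services coordinated from a single control point: a generic
# runner executes each service once its upstream dependencies complete.
from graphlib import TopologicalSorter  # Python 3.9+

def ingest():  print("capture: land raw data")
def curate():  print("curation: validate and tag data")
def prepare(): print("preparation: harmonize and shape data")
def consume(): print("consumption: serve data to analytics")

# Each service maps to the set of services it depends on.
workflow = {
    "ingest": set(),
    "curate": {"ingest"},
    "prepare": {"curate"},
    "consume": {"prepare"},
}
services = {"ingest": ingest, "curate": curate,
            "prepare": prepare, "consume": consume}

# Run every service in an order that respects the dependency graph.
for name in TopologicalSorter(workflow).static_order():
    services[name]()
```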