Event Driven Orchestration & Scheduling Platform for Mission Critical Applications
Java · Updated Dec 26, 2025
Alluxio, data orchestration for analytics and machine learning in the cloud
cloud-native distributed storage
An open source, standard data file format for graph data storage and retrieval.
A full data warehouse infrastructure with ETL pipelines running inside Docker on Apache Airflow for data orchestration, AWS Redshift as the cloud data warehouse, and Metabase for data visualizations such as analytical dashboards.
Best practices for data workflows, integrations with the Modern Data Stack (MDS), Infrastructure as Code (IaC), Cloud Provider Services
Build and ship production ML pipelines faster: a pipeline library with an optional self-hosted visual layer for modular, reproducible workflows, local testing, and experiment tracking.
Data-aware orchestration with Dagster, dbt, and Airbyte
Data Engineering - Metropolitan Transportation Authority (MTA) Subway Data Analysis
This repo contains a dataset, exercises, and sample code for an end-to-end SAP BTP data-to-value bootcamp covering SAP HANA Cloud, SAP Data Warehouse Cloud, SAP Data Intelligence Cloud, and SAP Analytics Cloud.
A new Airflow Provider for Fivetran, maintained by Astronomer and Fivetran
ChronoGrapher is a job scheduler and workflow orchestration platform focused on polyglot support, developer experience, and performance
Get started with Dagster ASAP
CI/CD repository template to automate deployments of your production flows
An operator for managing the Alluxio system on a Kubernetes cluster
Asset-first data orchestration for Elixir/BEAM. Dagster-inspired with OTP fault tolerance, LiveView dashboard, lineage tracking, checkpoint gates, and distributed execution via Oban.
A simple pipeline infrastructure with an ETL pipeline contained in a Docker environment, using Apache Airflow for orchestration and Postgres for data warehousing
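A pipeline like this reduces to three task callables that an Airflow DAG runs in sequence. A minimal pure-Python sketch of that extract-transform-load flow (all names and the sample records are hypothetical; the real repo wires such steps into Airflow operators and loads into Postgres rather than SQLite):

```python
import sqlite3

# Hypothetical source records; in a real pipeline these would come from files or an API.
RAW_ORDERS = [
    {"id": 1, "amount": "19.99", "country": "us"},
    {"id": 2, "amount": "5.00", "country": "de"},
]

def extract():
    """Extract: pull raw records from the source system."""
    return list(RAW_ORDERS)

def transform(rows):
    """Transform: cast types and normalize fields."""
    return [(r["id"], float(r["amount"]), r["country"].upper()) for r in rows]

def load(rows, conn):
    """Load: write cleaned rows into the warehouse (SQLite stands in for Postgres)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract()), conn)
```

In Airflow each function would become its own task (e.g. a `PythonOperator`), so a failed step can be retried without rerunning the whole pipeline.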
Bring Infrastructure as Code best practices to your data workflows with Kestra and Terraform
Develop a real-time data ingestion pipeline using Kafka and Spark: collect minute-level stock data from Yahoo Finance, ingest it into Kafka, and process it with Spark Streaming, storing the results in Cassandra. The workflow is orchestrated with Airflow deployed on Docker.
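The Spark Streaming step in a pipeline like this is essentially a keyed window aggregation: raw quotes are grouped into one-minute buckets per symbol and reduced to OHLC bars before landing in Cassandra. A pure-Python sketch of that aggregation logic (the record shape and names are assumptions for illustration, not the repo's actual schema):

```python
from collections import defaultdict

def minute_bucket(ts):
    """Truncate a Unix timestamp (seconds) to the start of its minute."""
    return ts - ts % 60

def aggregate_bars(ticks):
    """Reduce (symbol, timestamp, price) ticks into per-minute OHLC bars,
    the same shape a Spark Streaming window aggregation would produce."""
    grouped = defaultdict(list)
    for symbol, ts, price in sorted(ticks, key=lambda t: t[1]):
        grouped[(symbol, minute_bucket(ts))].append(price)
    return {
        key: {"open": p[0], "high": max(p), "low": min(p), "close": p[-1]}
        for key, p in grouped.items()
    }

# Three ticks in the first minute, one in the next.
ticks = [
    ("AAPL", 60, 101.0), ("AAPL", 75, 103.0), ("AAPL", 119, 102.0),
    ("AAPL", 120, 104.0),
]
bars = aggregate_bars(ticks)
```

Keying bars by `(symbol, minute)` mirrors a natural Cassandra partition/clustering layout, so each bar maps to a single row upsert.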
Introduction to using and scaling Dagster