An end-to-end GoodReads Data Pipeline for Building Data Lake, Data Warehouse and Analytics Platform.
Updated Mar 9, 2020 · Python
This repository is no longer maintained.
This project demonstrates how to build and automate an ETL pipeline written in Python and schedule it with the open-source Apache Airflow orchestration tool on an AWS EC2 instance.
An Airflow DAG transformation framework.
Build a data warehouse from scratch, including full load, daily incremental load, schema design, and SCD Types 1 and 2.
DAG factory.
Analysing live tweets from Twitter by building a big data pipeline and scheduling it with Airflow (also using Kafka for tweet ingestion, Cassandra for storing parsed tweets, and Spark for analysis).
Apache Airflow demo project that sets up 3 DAGs to explain how to pass parameters from a DAG to a triggered DAG.
An ELT data pipeline set up to track the activities of an e-commerce website based on orders, reviews, deliveries, and shipment dates. This project uses technologies such as Airflow, AWS RDS (Postgres), and Python.
Datitos - TP2 on steroids.
Data Engineering Projects on data modelling, data warehousing, data lake development, orchestration and analysis
Creation of a near-real-time data processing pipeline for Pinterest posts.
A dashboard showing sentiment scores for tweets.
Small project to play around with Apache Airflow and ETL.
Orchestrate a data pipeline using Airflow.
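Every project above revolves around an Airflow DAG: a directed acyclic graph of tasks whose edges encode dependencies, which the scheduler resolves into an execution order. As a library-free illustration of that underlying idea (a hypothetical task graph using only the standard library, not Airflow's own API), the dependency resolution can be sketched as:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL task graph: each key maps to the set of tasks it depends on.
# This mirrors what an Airflow DAG expresses with `extract >> transform >> load`.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# Resolve a valid execution order; a scheduler runs tasks in an order like this,
# only starting a task once all of its dependencies have finished.
order = list(TopologicalSorter(dag).static_order())
print(order)  # → ['extract', 'transform', 'load', 'report']
```

Airflow adds scheduling, retries, and distribution on top of this core graph model, but the dependency-resolution step is the same in spirit.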