A dockerized Extract, Load, Transform (ELT) pipeline with PostgreSQL, Airflow, DBT, Redash, and Superset.
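To illustrate the ELT pattern this pipeline follows (land raw data first, then transform it with SQL inside the warehouse), here is a minimal, self-contained sketch using an in-memory SQLite database in place of PostgreSQL. The table and column names are hypothetical, not taken from this repo.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Extract + Load: land the raw records as-is
# (in this project, Airflow's load_dag plays this role).
conn.execute("CREATE TABLE raw_trips (duration_s INTEGER, distance_m REAL)")
conn.executemany(
    "INSERT INTO raw_trips VALUES (?, ?)",
    [(600, 1800.0), (300, 750.0), (1200, 4000.0)],
)

# Transform: derive a cleaned model with SQL inside the warehouse
# (the kind of work dbt_dag delegates to DBT).
conn.execute(
    """
    CREATE TABLE trips AS
    SELECT duration_s / 60.0   AS duration_min,
           distance_m / 1000.0 AS distance_km
    FROM raw_trips
    """
)

rows = conn.execute("SELECT duration_min, distance_km FROM trips").fetchall()
print(rows)
```

The key design point is that the transform step runs as SQL in the warehouse itself rather than in application code, which is what makes DBT a natural fit on top of PostgreSQL.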
Tech Stack used in this project

Make sure you have Docker and Docker Compose installed on your local machine.

- Docker
- Docker Compose
Clone the repo

```sh
git clone https://github.com/skevin-dev/Data-Warehouse-Migration
```
Navigate to the folder

```sh
cd Data-Warehouse-Migration
```
Build the Airflow image

```sh
docker build . --tag apache_dbt/airflow:2.3.3
```
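The repo's Dockerfile is not shown here, but as a rough, hypothetical sketch, an image tagged `apache_dbt/airflow:2.3.3` would typically extend the official Airflow image and add DBT on top (the package and paths below are assumptions, not taken from the repo):

```dockerfile
# Hypothetical sketch -- the actual Dockerfile in the repo may differ.
FROM apache/airflow:2.3.3

# Install DBT so Airflow tasks can invoke dbt commands against PostgreSQL.
RUN pip install --no-cache-dir dbt-postgres

# Bake the DAG definitions into the image (path is an assumption).
COPY dags/ /opt/airflow/dags/
```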
Run the containers

```sh
docker-compose up
```
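For orientation, a `docker-compose.yml` wiring these pieces together might look roughly like the sketch below. This is an assumption for illustration only; the service names, images, and settings in the repo's actual compose file may differ.

```yaml
# Hypothetical sketch -- not the repo's actual docker-compose.yml.
version: "3"
services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
  airflow-webserver:
    image: apache_dbt/airflow:2.3.3   # the image built in the previous step
    ports:
      - "8089:8080"                   # exposes the Airflow UI on localhost:8089
    depends_on:
      - postgres
```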
Open the Airflow web UI

Navigate to `http://localhost:8089/` in the browser, then activate and trigger `load_dag`, followed by `dbt_dag`.
Access the Redash dashboard

Navigate to `http://localhost:5000/` in the browser.
Access the Superset dashboard

```sh
cd superset
docker-compose -f docker-compose-non-dev.yml pull
docker-compose -f docker-compose-non-dev.yml up
```

Navigate to `http://localhost:8088/` in the browser.

License

Distributed under the MIT License. See `LICENSE` for more information.
Contact

Kevin Shyaka - shyakakevin1@gmail.com