Data-Warehouse-Migration

Table of Contents
  1. About The Project
  2. Getting Started
  3. License
  4. Contact
  5. Acknowledgements

About The Project

A dockerized Extract, Load, Transform (ELT) pipeline built with PostgreSQL, Airflow, DBT, and Redash, used to migrate a data warehouse from one tech stack to another.
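For readers new to the pattern: ELT loads raw data into the warehouse first and then transforms it in place with SQL (which is what the DBT models do here). A toy, stdlib-only sketch of that ordering, with SQLite standing in for the PostgreSQL warehouse and illustrative table names that are not taken from this repo:

```python
import sqlite3

# Extract: pull raw rows from a source (an in-memory list standing
# in for the source database).
raw_orders = [(1, "2023-01-01", 120.0), (2, "2023-01-02", 80.0)]

# Load: write the raw rows into the warehouse unchanged (SQLite
# stands in for the PostgreSQL warehouse).
wh = sqlite3.connect(":memory:")
wh.execute("CREATE TABLE raw_orders (id INTEGER, day TEXT, amount REAL)")
wh.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw_orders)

# Transform: build a derived table inside the warehouse with SQL,
# which is essentially what a DBT model does at scale.
wh.execute("""
    CREATE TABLE daily_revenue AS
    SELECT day, SUM(amount) AS revenue
    FROM raw_orders
    GROUP BY day
""")
print(wh.execute("SELECT * FROM daily_revenue ORDER BY day").fetchall())
# → [('2023-01-01', 120.0), ('2023-01-02', 80.0)]
```

In the real pipeline, Airflow schedules the load and DBT owns the transform step.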

Built With

Tech Stack used in this project

  • Docker
  • Postgres
  • MySQL
  • Airflow
  • DBT
  • Redash
  • Superset

Getting Started

Prerequisites

Make sure you have Docker installed on your local machine.

  • Docker
  • Docker Compose

Installation

  1. Clone the repo

    git clone https://github.com/skevin-dev/Data-Warehouse-Migration
  2. Navigate to the folder

    cd Data-Warehouse-Migration
  3. Build the Airflow image

    docker build . --tag apache_dbt/airflow:2.3.3
  4. Run

     docker-compose up
  5. Open the Airflow web UI

    Navigate to `http://localhost:8089/` on the browser
    Activate and trigger `load_dag`
    Activate and trigger `dbt_dag`
  6. Access the Redash dashboard

    Navigate to `http://localhost:5000/` on the browser
  7. Access the Superset dashboard

    cd superset
    docker-compose -f docker-compose-non-dev.yml pull
    docker-compose -f docker-compose-non-dev.yml up
    Navigate to `http://localhost:8088/` on the browser
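The services above can take a few minutes to come up after `docker-compose up`. A small stdlib helper (hypothetical, not part of this repo) can poll the published ports until they accept connections, assuming the default ports from the steps above (8089 for Airflow, 5000 for Redash, 8088 for Superset):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 120.0) -> bool:
    """Poll until a TCP port accepts connections, or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=2):
                return True
        except OSError:
            time.sleep(1)  # service not up yet; retry
    return False

if __name__ == "__main__":
    # Ports used in the steps above; adjust if you changed the compose files.
    for name, port in [("Airflow", 8089), ("Redash", 5000), ("Superset", 8088)]:
        status = "up" if wait_for_port("localhost", port, timeout=300) else "timed out"
        print(f"{name} on port {port}: {status}")
```

This only checks that the port is listening, not that the web app is fully initialized, but it is usually enough to know when to open the browser.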

License

Distributed under the MIT License. See LICENSE for more information.

Contact

Kevin Shyaka - shyakakevin1@gmail.com

Acknowledgements
