This project focuses on designing and implementing a modern data warehouse using SQL Server. It consolidates sales data from multiple sources into a clean, analytics-ready data model that supports business reporting and decision-making.
It covers both data engineering (building the warehouse 🛠️) and data analytics (deriving insights using SQL 📊).
Design and develop a modern SQL Server–based data warehouse that integrates sales data from multiple sources, enabling reliable analytical reporting and informed business decisions.

- 💾 Data Sources: Import sales data from two source systems (ERP and CRM), provided as CSV files.
- 🧹 Data Quality: Cleanse data and resolve quality issues before loading into the warehouse.
- 🔗 Data Integration: Merge data from all source systems into a unified, analytics-friendly data model optimized for queries.
- ⏱️ Scope: Focus on the most recent snapshot of the data; historical tracking and slowly changing dimensions are not required.
- 📚 Documentation: Provide clear, structured documentation of the data model for both business users and analytics teams.
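As a rough sketch of what the cleansing and standardization step can look like in T-SQL (table and column names here are hypothetical, not the project's actual schema):

```sql
-- Illustrative silver-layer cleansing (hypothetical table/column names):
-- trims whitespace and standardizes coded values before loading.
INSERT INTO silver.crm_cust_info (cst_id, cst_firstname, cst_lastname, cst_gndr)
SELECT
    cst_id,
    TRIM(cst_firstname),                -- remove stray whitespace
    TRIM(cst_lastname),
    CASE UPPER(TRIM(cst_gndr))          -- map coded values to readable labels
        WHEN 'F' THEN 'Female'
        WHEN 'M' THEN 'Male'
        ELSE 'n/a'                      -- default for missing or unknown data
    END
FROM bronze.crm_cust_info;
```

The same pattern (SELECT from a bronze table, transform inline, INSERT into silver) applies to each source table.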
Develop SQL-based analytical queries to generate meaningful business insights from the data warehouse.

- 👥 Customer Behavior: Analyze purchasing patterns and customer activity.
- 🛍️ Product Performance: Evaluate product sales, revenue contribution, and performance trends.
- 📈 Sales Trends: Identify sales patterns over time to support strategic planning.

These analytics provide stakeholders with key metrics for data-driven decisions.
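A sales-trends query against the warehouse might look like the following sketch (the `gold.fact_sales` table and its columns are assumed names; `DATETRUNC` requires SQL Server 2022 or later — on older versions, group by `YEAR`/`MONTH` instead):

```sql
-- Illustrative monthly sales trend (assumed table and column names).
SELECT
    DATETRUNC(month, order_date) AS order_month,       -- month bucket (SQL Server 2022+)
    SUM(sales_amount)            AS total_revenue,     -- revenue per month
    COUNT(DISTINCT customer_key) AS active_customers   -- distinct buyers per month
FROM gold.fact_sales
GROUP BY DATETRUNC(month, order_date)
ORDER BY order_month;
```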
The data architecture for this project follows the Medallion Architecture, with Bronze, Silver, and Gold layers:

- Bronze Layer: Stores raw data as-is from the source systems. Data is ingested from CSV files into a SQL Server database.
- Silver Layer: Applies data cleansing, standardization, and normalization to prepare data for analysis.
- Gold Layer: Houses business-ready data modeled into a star schema for reporting and analytics.
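In a gold layer like this, the star schema is often exposed as views over silver tables. The sketch below shows the general shape, assuming hypothetical silver/gold object and column names rather than the project's actual ones:

```sql
-- Illustrative gold-layer fact view joining transactional data to dimensions
-- (all object and column names are hypothetical).
CREATE VIEW gold.fact_sales AS
SELECT
    sd.sls_ord_num  AS order_number,
    pr.product_key,                     -- surrogate key from the product dimension
    cu.customer_key,                    -- surrogate key from the customer dimension
    sd.sls_order_dt AS order_date,
    sd.sls_sales    AS sales_amount,
    sd.sls_quantity AS quantity
FROM silver.crm_sales_details sd
LEFT JOIN gold.dim_products  pr ON sd.sls_prd_key = pr.product_number
LEFT JOIN gold.dim_customers cu ON sd.sls_cust_id = cu.customer_id;
```

Using views keeps the gold layer lightweight: reports always read the latest cleansed silver data without a separate load step.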
- SQL Server
- Azure Data Studio (SQL client 💻)
- CSV-based source data 📄
- Set up SQL Server in your local environment.
- Load source CSV files into staging tables.
- Execute ETL scripts to build the data warehouse 🏗️.
- Run analytical SQL queries to explore insights 📊.
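The staging-load step can be done with `BULK INSERT`, as in this sketch (the table name and file path are placeholders — adjust both to your environment):

```sql
-- Illustrative bronze-layer load of a source CSV (placeholder names and path).
TRUNCATE TABLE bronze.crm_sales_details;          -- full refresh: latest snapshot only

BULK INSERT bronze.crm_sales_details
FROM 'C:\datasets\source_crm\sales_details.csv'   -- adjust to your local path
WITH (
    FIRSTROW = 2,            -- skip the CSV header row
    FIELDTERMINATOR = ',',   -- comma-separated values
    TABLOCK                  -- table lock for a faster bulk load
);
```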
data-warehouse-project/
│
├── datasets/                        # Raw datasets used for the project (ERP and CRM data)
│
├── docs/                            # Project documentation and architecture details
│   ├── etl.drawio                   # Draw.io file showing the ETL techniques and methods
│   ├── data_architecture.drawio     # Draw.io file showing the project's architecture
│   ├── data_catalog.md              # Catalog of datasets, including field descriptions and metadata
│   ├── data_flow.drawio             # Draw.io file for the data flow diagram
│   ├── data_models.drawio           # Draw.io file for data models (star schema)
│   ├── naming-conventions.md        # Consistent naming guidelines for tables, columns, and files
│
├── scripts/                         # SQL scripts for ETL and transformations
│   ├── bronze/                      # Scripts for extracting and loading raw data
│   ├── silver/                      # Scripts for cleaning and transforming data
│   ├── gold/                        # Scripts for creating analytical models
│
├── tests/                           # Test scripts and quality files
│
├── README.md                        # Project overview and instructions
├── LICENSE                          # License information for the repository
├── .gitignore                       # Files and directories to be ignored by Git
└── requirements.txt                 # Dependencies and requirements for the project
This project is licensed under the MIT License 📝.