Akmalwizdom/siprems

SIPREMS: Smart POS System Interface

Project Overview

SIPREMS (Sistem Prediksi Stok Musiman, Indonesian for "Seasonal Stock Prediction System") is an advanced, integrated solution designed to modernize retail management. It combines a robust Point of Sale (POS) interface with state-of-the-art Machine Learning forecasting capabilities to empower businesses with actionable insights.

Beyond standard transaction processing, SIPREMS leverages historical data to predict future sales trends, enabling smarter inventory management and strategic decision-making. The system is built on a modern microservices architecture, ensuring scalability, maintainability, and high performance.

Key Features

  • Intelligent Dashboard: A comprehensive command center visualizing real-time sales data, revenue metrics, and key performance indicators (KPIs) through interactive charts and graphs.
  • Predictive Analytics: Integrated Machine Learning models (Prophet) that analyze historical sales data to forecast future demand, helping businesses optimize stock levels and reduce waste.
  • Smart Point of Sale (POS): A streamlined, user-friendly interface for processing transactions efficiently, designed to minimize training time for staff.
  • Report Generation: Automated generation of detailed PDF and Excel reports for sales, inventory, and financial auditing.
  • AI-Powered Insights: Utilization of Google Gemini AI to provide qualitative analysis and business recommendations based on quantitative data.
  • Secure Authentication: Role-based access control and secure user authentication managed via Supabase.
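
To make the Predictive Analytics feature concrete, here is a minimal, self-contained sketch of seasonal demand forecasting. It uses a simple seasonal-naive average rather than Prophet (which the project actually uses), and the sales figures are illustrative only:

```python
from collections import defaultdict

def seasonal_naive_forecast(history, season_length=7):
    """Forecast one season ahead by averaging each position across past seasons.

    `history` is a list of daily sales counts; with season_length=7 this
    captures a weekly pattern, similar in spirit to what Prophet models.
    """
    buckets = defaultdict(list)
    for day, qty in enumerate(history):
        buckets[day % season_length].append(qty)
    return [sum(v) / len(v) for _, v in sorted(buckets.items())]

# Two weeks of hypothetical daily sales with a weekend spike:
sales = [10, 12, 11, 13, 15, 30, 28,
         11, 13, 12, 14, 16, 32, 30]
print(seasonal_naive_forecast(sales))
# → [10.5, 12.5, 11.5, 13.5, 15.5, 31.0, 29.0]
```

The weekend spike survives into the forecast (positions 5 and 6), which is exactly the seasonal signal a retailer would use to pre-stock inventory.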

System Architecture

The project adopts a containerized microservices architecture to ensure separation of concerns and independent scalability:

  • Frontend Service: A responsive Single Page Application (SPA) built with React and Vite, utilizing Material UI for a polished design system.
  • Backend Service: A robust RESTful API built with Node.js, Express, and TypeScript, handling business logic, data integration, and secure communication with the database.
  • Machine Learning Service: A dedicated Python-based service running Prophet models for time-series forecasting, communicating with the backend via internal APIs.
  • Database: PostgreSQL (via Supabase) serves as the primary data store, ensuring data integrity and reliability.
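
The backend-to-ML hop can be pictured as a small JSON contract over the internal network. The endpoint shape and field names below are assumptions for illustration, not taken from the project's actual API:

```python
import json

# Hypothetical request the Node backend might POST to the ML service
# (field names are illustrative; "ds"/"y" follow Prophet's conventions):
request_body = {
    "product_id": "SKU-001",
    "history": [{"ds": "2024-01-01", "y": 120}, {"ds": "2024-01-02", "y": 135}],
    "horizon_days": 7,
}

# The ML service would respond with one predicted row per future day:
response_body = {
    "product_id": "SKU-001",
    "forecast": [{"ds": "2024-01-03", "yhat": 128.4}],
}

# Both sides exchange these structures as serialized JSON:
payload = json.dumps(request_body)
assert json.loads(payload)["horizon_days"] == 7
```

Keeping the contract this small is what lets the ML service scale (or be redeployed) independently of the backend.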

Technology Stack

Frontend

  • Core: React, TypeScript, Vite
  • Styling: Tailwind CSS, Material UI (MUI), Styled Components
  • State Management & Data Fetching: TanStack Query (React Query)
  • Data Visualization: Recharts, MUI X Charts
  • Utilities: PDF generation (jspdf), Excel export (xlsx), Lucide React

Backend

  • Runtime: Node.js
  • Framework: Express.js
  • Language: TypeScript
  • Database & Auth: Supabase (PostgreSQL)
  • AI Integration: Google Gemini API, Firebase

Machine Learning

  • Language: Python
  • Libraries: Prophet, Pandas, Scikit-learn
  • API: FastAPI or Flask (implied for service communication)

DevOps & Infrastructure

  • Containerization: Docker, Docker Compose
  • Configuration: Environment variables (.env)

Getting Started

Follow these instructions to set up the project locally for development and testing purposes.

Prerequisites

Ensure you have the following installed on your system:

  • Docker Desktop (recommended for the easiest setup)
  • Node.js (v18 or higher)
  • npm or yarn
  • Python 3 (only needed to run the ML service without Docker)
  • Git

Installation

  1. Clone the Repository

    git clone https://github.com/yourusername/siprems-cd.git
    cd siprems-cd
  2. Environment Configuration

     Create a .env file in the backend-ts directory and in the root directory as needed. Refer to .env.example if available.

    Required Environment Variables (Example):

    DATABASE_URL=postgresql://user:password@host:5432/db
    SUPABASE_URL=https://your-project.supabase.co
    SUPABASE_ANON_KEY=your-anon-key
    GEMINI_API_KEY=your-gemini-api-key
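
Services typically fail fast at startup when a required variable is missing. A minimal check along these lines (variable names taken from the example above) is sketched in Python for brevity; the TypeScript backend would do the equivalent against process.env:

```python
import os

REQUIRED_VARS = ["DATABASE_URL", "SUPABASE_URL", "SUPABASE_ANON_KEY", "GEMINI_API_KEY"]

def check_env(env=os.environ):
    """Return the names of required variables that are missing or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

missing = check_env({"DATABASE_URL": "postgresql://user:pw@host:5432/db"})
print(missing)
# → ['SUPABASE_URL', 'SUPABASE_ANON_KEY', 'GEMINI_API_KEY']
```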

Running with Docker (Recommended)

The most efficient way to run the entire stack is using Docker Compose.

  1. Build and Start Services

    docker-compose up --build
  2. Access the Application

     Once the containers are up, open the frontend in your browser at the port published in docker-compose.yml.

Running Locally (Manual)

If you prefer to run services individually without Docker:

1. Backend Setup

cd backend-ts
npm install
npm run dev

2. Frontend Setup

# Return to root
cd ..
npm install
npm run dev

3. ML Service Setup

cd ml-service
python -m venv .venv             # creating a virtual environment is recommended
source .venv/bin/activate        # on Windows: .venv\Scripts\activate
pip install -r requirements.txt
python main.py                   # or appropriate entry point

Project Structure

siprems-cd/
├── backend-ts/       # TypeScript Backend Source Code
├── landing-page/     # Product Landing Page
├── ml-service/       # Python Machine Learning Service
├── src/              # Main React Frontend Source Code
├── public/           # Static Assets
├── docker-compose.yml # Docker Orchestration Config
└── package.json      # Project Dependencies

Contributing

Contributions are welcome. Please ensure that you follow the established code style and commit conventions.

  1. Fork the repository.
  2. Create your feature branch (git checkout -b feature/AmazingFeature).
  3. Commit your changes (git commit -m 'Add some AmazingFeature').
  4. Push to the branch (git push origin feature/AmazingFeature).
  5. Open a Pull Request.

License

This project is licensed under the MIT License - see the LICENSE file for details.


Note: This project is part of an academic assignment for the Technopreneurship course. All data used is for demonstration purposes.
