A simple FaaS API application that converts business process models between BPMN and PNML notation, in both directions.
Please refer to this repo's wiki for more information. To use the API, refer to its documentation.
```shell
# Install dependencies
pip install -r requirements/dev.txt

# Set required environment variables
export FORCE_STD_XML=true
export FLASK_CONFIG=development

# Run development server
flask run
```

```shell
export FORCE_STD_XML=true

pytest tests/
# or with coverage
pytest tests/ --cov=app
```

E2E tests require a running server instance and are skipped by default. To run E2E tests:
- Start the application server:

  ```shell
  export FORCE_STD_XML=true
  export FLASK_CONFIG=development
  flask run
  ```

- In a separate terminal, set the E2E environment variable and run the tests:

  ```shell
  # Base URL for health/checkTokens
  # For the transform endpoint: http://localhost:5000/transform
  export E2E_URL=http://localhost:5000

  # Run only E2E tests
  pytest tests/ -m e2e

  # Run all tests, including E2E
  pytest tests/
  ```

To exclude E2E tests (the default behavior when the environment variables are not set):

```shell
pytest tests/ -m "not e2e"
```

For production, serve the app with Gunicorn:

```shell
gunicorn wsgi:app
```

After cloning this repository, it's essential to set up git hooks to ensure project standards.
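The skip-by-default behavior described above is typically implemented in a `conftest.py`. The following is a hypothetical sketch keyed off `E2E_URL` — the project's actual conftest may differ:

```python
# conftest.py (hypothetical sketch, not the project's actual file)
import os

import pytest


def pytest_collection_modifyitems(config, items):
    """Skip tests marked 'e2e' unless E2E_URL points at a live server."""
    if os.environ.get("E2E_URL"):
        return  # E2E explicitly enabled; run everything
    skip_e2e = pytest.mark.skip(reason="E2E_URL not set; skipping e2e tests")
    for item in items:
        if "e2e" in item.keywords:
            item.add_marker(skip_e2e)
```

With this hook in place, `pytest tests/` silently skips the `e2e`-marked tests until `E2E_URL` is exported.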
```
model-transformer/
├── config.py                 # Root-level configuration
├── wsgi.py                   # WSGI entry point
├── app/
│   ├── __init__.py           # Application factory (create_app)
│   ├── logging_config.py     # JSON logging configuration
│   ├── api/
│   │   ├── __init__.py       # API blueprint definition
│   │   └── routes.py         # Consolidated API routes
│   ├── model_transformer/
│   │   ├── __init__.py
│   │   └── metrics.py        # Prometheus metrics
│   ├── health/               # Health check logic
│   ├── transform/            # Model transformation logic
│   ├── checkTokens/          # Token validation logic
│   └── ...
├── tests/
│   ├── checkTokens/
│   ├── health/
│   └── transform/
├── requirements/
│   ├── base.txt
│   ├── dev.txt
│   ├── docker.txt
│   ├── prod.txt
│   └── test.txt
└── docs/
    └── ...
```
The Flask app is created using the factory pattern via `create_app()`:

```python
from app import create_app

# Create app with defaults
app = create_app()

# Create app with a specific config
app = create_app('development')
app = create_app('testing')
app = create_app('production')
```

The factory handles:

- Configuration loading from `config.py`
- JSON logging setup via `logging_config.py`
- CORS configuration
- Blueprint registration
- Error handlers
- Request/response middleware
Centralized configuration management with environment-based loading:
```python
class Config:
    """Base configuration"""
    ENV_NAME = "default"
    DEBUG = False
    TESTING = False
    LOG_LEVEL = "INFO"

class DevelopmentConfig(Config):
    DEBUG = True

class TestingConfig(Config):
    TESTING = True

class ProductionConfig(Config):
    pass

def get_config(config_name=None):
    """Get config class by name; auto-detects from environment variables."""
```

Environment Variables:

```shell
FLASK_CONFIG=development|testing|production   # Explicit config selection
APP_ENV=development|testing|production        # Alternative config selection
LOG_LEVEL=DEBUG|INFO|WARNING|ERROR            # Logging level (default: INFO)
FORCE_STD_XML=true                            # Required for the transform module
```
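A minimal sketch of how `get_config()` might resolve these variables. The precedence order used here (explicit argument, then `FLASK_CONFIG`, then `APP_ENV`, then the default) is an assumption, not taken from the project's source:

```python
import os


class Config:
    """Base configuration"""
    ENV_NAME = "default"
    DEBUG = False
    TESTING = False

class DevelopmentConfig(Config):
    DEBUG = True

class TestingConfig(Config):
    TESTING = True

class ProductionConfig(Config):
    pass

_CONFIGS = {
    "development": DevelopmentConfig,
    "testing": TestingConfig,
    "production": ProductionConfig,
    "default": Config,
}

def get_config(config_name=None):
    # Assumed precedence: explicit argument > FLASK_CONFIG > APP_ENV > default.
    name = (config_name
            or os.environ.get("FLASK_CONFIG")
            or os.environ.get("APP_ENV")
            or "default")
    return _CONFIGS[name]
```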
All routes are consolidated under a single API blueprint:

```python
# app/api/__init__.py
bp = Blueprint('api', __name__)

# app/api/routes.py
@bp.route('/health', methods=['GET'])
def health():
    """Health check endpoint"""

@bp.route('/transform', methods=['POST'])
def transform():
    """Model transformation endpoint"""

@bp.route('/metrics', methods=['GET'])
def metrics():
    """Prometheus metrics endpoint"""
```

`wsgi.py` - Main entry point for WSGI servers and CLI:
```python
app = create_app()

@app.cli.command("test")
@click.option('--cov', is_flag=True, help="Show test coverage report.")
def test_command(cov):
    """Run all tests in the 'tests/' directory."""
```

`app/__init__.py` - Contains the `create_app()` factory function.
Health check endpoint for monitoring application status.

Response:

```json
{
  "status": "healthy",
  "timestamp": "2026-02-07T12:00:00Z"
}
```

Transform models between BPMN and PNML formats.
Request Body:

```json
{
  "data": "...",
  "format": "bpmn|pnml"
}
```

Response: The transformed model in the requested format.
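As a client-side illustration, a sketch using only the standard library — the request shape follows the documentation above, but the helper names and the localhost URL are assumptions:

```python
import json
import urllib.request


def build_transform_request(data: str, fmt: str) -> bytes:
    """Serialize a transform request body as documented above."""
    if fmt not in ("bpmn", "pnml"):
        raise ValueError("format must be 'bpmn' or 'pnml'")
    return json.dumps({"data": data, "format": fmt}).encode("utf-8")


def transform(base_url: str, data: str, fmt: str) -> str:
    """POST to /transform and return the converted model (requires a running server)."""
    req = urllib.request.Request(
        f"{base_url}/transform",
        data=build_transform_request(data, fmt),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# transform("http://localhost:5000", "<definitions .../>", "bpmn")  # needs a live server
```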
Prometheus metrics endpoint for monitoring.

Metrics:

- `http_requests_total` - Total HTTP requests by method, endpoint, and status
- `http_request_duration_seconds` - HTTP request duration histogram
- `transform_duration_seconds` - Model transformation duration
Development:

```shell
export FLASK_CONFIG=development
export FORCE_STD_XML=true
export LOG_LEVEL=DEBUG

flask run --reload
```

Testing:

```shell
export FLASK_CONFIG=testing
export FORCE_STD_XML=true

# Run tests
pytest tests/

# Run tests with coverage
pytest tests/ --cov=app --cov-report=html
```

Production:

```shell
export FLASK_CONFIG=production
export FORCE_STD_XML=true
export LOG_LEVEL=INFO

gunicorn wsgi:app \
    --workers 4 \
    --threads 2 \
    --bind 0.0.0.0:8080
```

Docker:

```dockerfile
FROM python:3.13

WORKDIR /app

COPY requirements/docker.txt .
RUN pip install -r docker.txt

COPY . .

ENV FORCE_STD_XML=true
ENV FLASK_CONFIG=production

CMD ["gunicorn", "wsgi:app", "--bind", "0.0.0.0:8080"]
```

All tests are located in the `tests/` directory, organized by module:
```
tests/
├── checkTokens/
│   ├── e2e/
│   └── unit/
├── health/
│   ├── e2e/
│   └── unit/
└── transform/
    ├── e2e/
    ├── unit/
    ├── assets/          # Test fixtures
    └── testgeneration/  # Test case generation
```
```shell
# Collect all tests
pytest tests/ --collect-only

# Run all tests
pytest tests/

# Run specific module
pytest tests/transform/ -v

# Run with coverage
pytest tests/ --cov=app --cov-report=term-missing

# Run via CLI command
python wsgi.py test          # Run all tests
python wsgi.py test --cov    # Run with coverage
```

The project uses pytest with coverage reporting:

```shell
pytest tests/ --cov=app --cov-report=html
# Open htmlcov/index.html in a browser
```

The Flask application has been refactored to follow the well-structured pattern used in the reference projects (t2p-2.0 and t2p-llm-api-connector):
- Application Factory Pattern
  - Single `create_app()` function in `app/__init__.py`
  - Centralized Flask configuration and initialization
  - Enables easy testing with different configurations

- Root-Level Configuration
  - `config.py` contains all configuration classes
  - Environment-based configuration resolution
  - Supports multiple deployment environments

- Unified API Blueprint
  - All routes in the `app/api/` package
  - Single `routes.py` file with all endpoints
  - Cleaner, more maintainable structure

- Test Consolidation
  - All tests moved to the `tests/` directory at the project root
  - Tests organized by module (checkTokens, health, transform)
  - Proper Python package structure with `__init__.py` files

- Entry Point Simplification
  - `wsgi.py` is the main entry point (removed the redundant `app/app.py`)
  - Cleaner WSGI configuration
  - CLI test commands available
| Aspect | Before | After |
|---|---|---|
| Configuration | Scattered across modules | Centralized in `config.py` |
| Factory Logic | In `app/model_transformer/__init__.py` | In `app/__init__.py` |
| Routes | Multiple files in `app/model_transformer/routes/` | Single blueprint in `app/api/routes.py` |
| Tests | Distributed in each module | Consolidated in `tests/` directory |
| Entry Point | `wsgi.py` → `app/app.py` → factory | Direct `wsgi.py` → factory |
The refactored structure has been verified:
- ✅ Application loads successfully
- ✅ All 3 API routes registered
- ✅ 22 unit and integration tests pass
- ✅ Prometheus metrics working
- ✅ Backward compatibility maintained
- Consistency: Matches proven patterns from reference projects
- Maintainability: Centralized configuration and clear module organization
- Testability: Factory pattern enables comprehensive testing
- Scalability: Blueprint structure easy to extend
- Code Quality: Better separation of concerns
The application uses JSON-formatted structured logging via `python-json-logger`.
Logging is configured in app/logging_config.py with:
- JSON formatter for structured logging
- Request context filters (request_id, method, path)
- Metrics filter to exclude the noisy `/metrics` endpoint
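The project itself uses `python-json-logger`; as a rough stdlib-only sketch of the same idea, with field names mirroring the documented log output (the real configuration lives in `app/logging_config.py`):

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Minimal JSON log formatter (sketch; the project uses python-json-logger)."""

    # Optional request-context fields a logging filter might attach.
    CONTEXT_FIELDS = ("request_id", "http_method", "http_path")

    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "asctime": self.formatTime(record),
            "levelname": record.levelname,
            "name": record.name,
            "message": record.getMessage(),
        }
        for field in self.CONTEXT_FIELDS:
            if hasattr(record, field):
                payload[field] = getattr(record, field)
        return json.dumps(payload)
```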
Example log output:

```json
{
  "asctime": "2026-02-07 12:00:00",
  "levelname": "INFO",
  "name": "app.api.routes",
  "message": "Transform request received",
  "request_id": "abc123def456",
  "http_method": "POST",
  "http_path": "/transform"
}
```

```shell
LOG_LEVEL=DEBUG|INFO|WARNING|ERROR   # Default: INFO
```

Prometheus metrics are available at the `/metrics` endpoint:
- `http_requests_total{method, endpoint, status}` - Total HTTP requests
- `http_request_duration_seconds{method, endpoint}` - Request latency histogram
- `transform_duration_seconds` - Model transformation duration
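The endpoint serves the Prometheus text exposition format. A tiny sketch of parsing one sample line from it (the helper name is mine; it ignores escaping and commas inside label values):

```python
def parse_sample(line: str):
    """Parse a simple Prometheus text-format sample into (name, labels, value).

    Handles lines like:
        http_requests_total{method="GET",endpoint="/health",status="200"} 5.0
    Quoted commas and escapes inside label values are not handled.
    """
    name_part, raw_value = line.rsplit(" ", 1)
    labels = {}
    if "{" in name_part:
        name, raw_labels = name_part.split("{", 1)
        for pair in raw_labels.rstrip("}").split(","):
            key, value = pair.split("=", 1)
            labels[key] = value.strip('"')
    else:
        name = name_part
    return name, labels, float(raw_value)
```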
```shell
curl http://localhost:5000/metrics
```

- Flask 3.0+ - Web framework
- flask-cors - CORS support
- prometheus-client - Metrics collection
- python-json-logger - Structured JSON logging
- pydantic - Data validation
- lxml - XML processing
- requests - HTTP client
- pytest - Testing framework
- pytest-cov - Coverage reporting
- python-dotenv - Environment variables
See the `requirements/` directory for the complete dependency lists:

- `base.txt` - Core dependencies
- `dev.txt` - Development dependencies
- `test.txt` - Testing dependencies
- `prod.txt` - Production dependencies
- `docker.txt` - Docker image dependencies
Install dependencies:

```shell
pip install -r requirements/dev.txt
```

Required environment variable not set:

```shell
export FORCE_STD_XML=true
```

Ensure all test directories have `__init__.py` files:

```shell
find tests -type d -exec touch {}/__init__.py \;
```

CORS is configured in `app/__init__.py` to allow all origins. If issues persist, check the CORS settings in `config.py`.
- Create a branch for your feature
- Write tests first (TDD approach)
- Implement changes in `app/` modules
- Update documentation if needed
- Submit pull request with clear description
```shell
# Set up environment
export FORCE_STD_XML=true
export FLASK_CONFIG=development

# Run all tests
pytest tests/ -v

# Run specific test
pytest tests/transform/unit/test_transform.py::TestBPMNToPetriNet -v

# Run with coverage
pytest tests/ --cov=app --cov-report=term-missing
```

Please see CONTRIBUTING.md for guidelines on:
- Code style
- Commit messages
- Pull request process
- Issue reporting
See LICENSE for details.
Last Updated: February 7, 2026