# High Dimensional Model Representation (HDMR) Optimization

Research framework for hyperparameter optimization.
## Project Structure

```
hdmr-opt/
│
├── src/                             # Core HDMR library
│   ├── main.py                      # HDMR optimizer engine
│   ├── basis_functions.py           # Orthogonal basis functions
│   ├── functions.py                 # Benchmark test functions
│   ├── functions_forecast.py        # Forecasting models (XGBoost, LSTM, etc.)
│   ├── function_ranges.json         # Function domains
│   ├── optimum_points.json          # Known global minima
│   └── data/transactions.csv        # Example time series data
│
├── experiments/                     # Main research scripts
│   ├── compare_optimizers.py        # Compare HDMR vs Optuna vs Random Search
│   ├── sensitivity_analysis.py      # Hyperparameter importance analysis
│   ├── benchmark_forecasting.py     # Deep learning benchmarks (LSTM, GRU, N-BEATS)
│   └── forecast_example.py          # Single model optimization
│
├── analysis/                        # Visualization & reporting tools
│   ├── analyze_results.py           # Basic result visualization
│   ├── analyze_results_v2.py        # Advanced analysis (Pareto fronts, etc.)
│   ├── create_final_visualization.py  # Publication-ready plots
│   └── create_summary_report.py     # Text-based summary reports
│
├── automation/                      # Batch processing scripts
│   ├── run_all_experiments.py       # Full experimental pipeline
│   └── run_hdmr_clean.sh            # Shell-based batch runner
│
├── legacy/                          # Older benchmark scripts
│   ├── benchmark_2d.sh              # 2D function benchmarks
│   ├── forecast_pipeline.py         # Legacy forecasting pipeline
│   └── high_dim_test.py             # 10D benchmark runner
│
├── docker/                          # Containerization files
│   ├── Dockerfile                   # GPU-accelerated container
│   └── docker-compose.yml           # Multi-experiment orchestration
│
├── app.py                           # Streamlit web interface
├── app_utils.py                     # UI helper functions
├── requirements.txt                 # Python dependencies
└── README.md                        # This file
```
## Installation

```bash
# Clone repository
git clone https://github.com/app2scale/hdmr-opt.git
cd hdmr-opt

# Create virtual environment
python -m venv hdmr-env
source hdmr-env/bin/activate

# Install dependencies
pip install -r requirements.txt
```

## Quick Start

Compare HDMR against Optuna and Random Search:

```bash
python experiments/compare_optimizers.py \
    --model xgboost \
    --trials 20 \
    --seeds 3
```

Analyze hyperparameter importance:

```bash
python experiments/sensitivity_analysis.py \
    --model xgboost \
    --samples 200 \
    --seeds 3
```

Optimize a single forecasting model:

```bash
python experiments/forecast_example.py \
    --algorithm xgboost \
    --metric mape \
    --samples 1000
```

Run the deep learning benchmarks:

```bash
python experiments/benchmark_forecasting.py \
    --models lstm gru nbeats \
    --seeds 3
```

## Docker

```bash
# Build image
docker build -t hdmr-opt -f docker/Dockerfile .

# Run single experiment
docker run --gpus all \
    -v $(pwd)/results:/workspace/results \
    hdmr-opt \
    python3 experiments/compare_optimizers.py --model xgboost
```

To orchestrate multiple experiments:

```bash
cd docker
docker-compose up
```

## Experiment Scripts

| Script | Purpose | Usage |
|---|---|---|
| `experiments/compare_optimizers.py` | Compare HDMR vs baselines | `--model xgboost --trials 20` |
| `experiments/sensitivity_analysis.py` | Hyperparameter importance | `--model xgboost --samples 200` |
| `experiments/benchmark_forecasting.py` | Deep learning benchmarks | `--models lstm gru nbeats` |
| `experiments/forecast_example.py` | Single optimization | `--algorithm xgboost --metric mape` |
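Conceptually, the sensitivity analysis ranks hyperparameters by how much of the objective's variance each one explains on its own (a first-order Sobol index, in the spirit of the Sobol reference below). A minimal, self-contained NumPy sketch of that idea follows; the function name, sample sizes, and brute-force conditional-mean approach are illustrative only, not the repo's implementation:

```python
import numpy as np

def first_order_indices(f, n_dims, n_samples=4096, seed=0):
    """Estimate first-order Sobol sensitivity indices of f on [0, 1]^d
    by brute-force conditional-mean averaging (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    total_var = np.var(f(rng.random((n_samples, n_dims))))
    grid = np.linspace(0.0, 1.0, 64)
    indices = np.empty(n_dims)
    for i in range(n_dims):
        # Freeze dimension i at each grid value, average f over the
        # remaining dimensions, then measure how much those conditional
        # means vary: that variance is dimension i's own contribution.
        cond_means = []
        for g in grid:
            pts = rng.random((256, n_dims))
            pts[:, i] = g
            cond_means.append(f(pts).mean())
        indices[i] = np.var(cond_means) / total_var
    return indices

# Toy additive objective: the second input carries 4x the variance of
# the first, so its index should come out roughly four times larger.
f = lambda x: x[:, 0] + 2.0 * x[:, 1]
S = first_order_indices(f, n_dims=2)
```

For this toy function the exact indices are 0.2 and 0.8; the estimator above recovers them to within a few percent.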
## Analysis Scripts

| Script | Purpose |
|---|---|
| `analysis/analyze_results.py` | Basic visualization |
| `analysis/analyze_results_v2.py` | Advanced analysis (Pareto, trade-offs) |
| `analysis/create_final_visualization.py` | Publication-ready figures |
| `analysis/create_summary_report.py` | Text summaries |
## Automation Scripts

| Script | Purpose |
|---|---|
| `automation/run_all_experiments.py` | Run the full pipeline |
| `automation/run_hdmr_clean.sh` | Shell-based batch runs |
## Python API

```python
from src.main import HDMROptimizer, HDMRConfig
from src.functions_forecast import XGBoostForecaster, prepare_train_test

# Prepare data
data = prepare_train_test('src/data/transactions.csv', '2020-01-01')

# Configure HDMR
config = HDMRConfig(n=5, a=[0.01, 1], b=[0.3, 10], N=1000)

# Optimize
optimizer = HDMROptimizer(objective_function, config)
result = optimizer.solve(x0=[0.1, 5])
```

## Web Interface

```bash
streamlit run app.py
```

Access at: http://localhost:8501
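The Python API example passes an `objective_function` it does not define; its contract is simply "parameter vector in, scalar loss out". Below is a hedged, self-contained sketch of that shape. The `mape` helper matches the metric behind `--metric mape`, but the objective body uses a toy quadratic stand-in so the example runs without the repo; a real objective would train a forecaster (e.g. `XGBoostForecaster`) with the given parameters and return its validation MAPE:

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

def objective_function(params):
    """Illustrative objective: parameter vector in, scalar loss out.
    The quadratic below is a toy stand-in for a validation loss with a
    hypothetical optimum at learning_rate=0.1, max_depth=6."""
    learning_rate, max_depth = params
    return (learning_rate - 0.1) ** 2 + 0.01 * (max_depth - 6) ** 2
```

Any callable with this signature can be handed to the optimizer; the optimizer itself never needs to know what the parameters mean.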
## Contributing

Contributions are welcome! Please:

- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request

## License

MIT License - see the LICENSE file.

## References

- Sobol, I. M., et al. (2003) - High Dimensional Model Representation
- Chen, T. & Guestrin, C. (2016) - XGBoost: Scalable Tree Boosting
- Akiba, T., et al. (2019) - Optuna: Hyperparameter Optimization Framework
- Oreshkin, B. N., et al. (2020) - N-BEATS: Neural Basis Expansion
## Changelog

**Repository Reorganization:**
- Structured directory layout (`experiments/`, `analysis/`, `automation/`)
- Fixed import paths for all scripts
- Updated Docker configuration
- Improved documentation
**Features:**
- Optimizer comparison framework
- Sensitivity analysis tool
- Deep learning benchmarks
- Docker containerization
Developed by APP2SCALE Team
**Core Engine Improvements:**
- Robust `x0` parsing (supports broadcasting and pattern repeat)
- Always returns an `OptimizeResult` (never `None`)
- Fixed surrogate evaluation (correct 1D optimization per dimension)
- Numerical stability hardening (NaN/Inf guards, soft bounds)
- Safe visualization (never crashes the optimization)
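The broadcasting and pattern-repeat behaviour for `x0` mentioned above could look like the following sketch; the function name and exact rules here are assumptions for illustration, not the engine's actual code:

```python
import numpy as np

def parse_x0(x0, n):
    """Expand a starting point to n dimensions (illustrative sketch):
    a scalar broadcasts to all n dimensions, and a shorter vector whose
    length divides n is tiled as a repeating pattern."""
    arr = np.atleast_1d(np.asarray(x0, dtype=float))
    if arr.size == n:
        return arr
    if n % arr.size == 0:
        return np.tile(arr, n // arr.size)  # pattern repeat
    raise ValueError(
        f"x0 of length {arr.size} cannot fill {n} dimensions")
```

With this rule, `parse_x0(0.5, 3)` broadcasts the scalar and `parse_x0([1, 2], 4)` repeats the pattern, while an incompatible length raises a clear error instead of silently truncating.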
**Forecasting Module:**
- Strict MM/DD/YYYY date parsing with auto-detection fallback
- Better error messages for date-parsing failures
- Safer defaults (no mutable default arguments)
- `BaseForecaster` class with backward compatibility
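The strict-then-fallback date parsing described above can be sketched with the standard library; the fallback formats listed here are assumptions, not the module's actual list:

```python
from datetime import datetime

STRICT_FORMAT = "%m/%d/%Y"
FALLBACK_FORMATS = ("%Y-%m-%d", "%d.%m.%Y")  # assumed fallbacks

def parse_date(text):
    """Try strict MM/DD/YYYY first, then fall back to a few common
    formats; raise a readable error rather than guessing silently."""
    for fmt in (STRICT_FORMAT, *FALLBACK_FORMATS):
        try:
            return datetime.strptime(text, fmt)
        except ValueError:
            continue
    raise ValueError(f"Unrecognised date format: {text!r}")
```

Trying the strict format first keeps ambiguous inputs like `01/02/2020` deterministic (always month/day), while still accepting unambiguous ISO-style dates.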
**Automation:**
- Added `benchmark_2d.sh` for 2D function testing
- Added `forecast_pipeline.py` for forecasting optimization
- Added `high_dim_test.py` for 10D function testing
- All scripts use `python -m` for import safety
**Documentation:**
- Complete README overhaul
- Added usage examples for all scripts
- Documented production deployment best practices
**Features:**
- Added a `--numberOfRuns` parameter for statistical analysis
- Improved basis-functions module with a factory pattern
- Enhanced numerical stability
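A factory pattern for a basis-functions module typically maps a name to a registered basis family. The following minimal sketch uses shifted Legendre polynomials, a common choice of orthogonal HDMR basis; the registry and function names are illustrative, not the module's API:

```python
import numpy as np

_BASIS_REGISTRY = {}

def register_basis(name):
    """Decorator that records a basis family under a lookup name."""
    def decorator(fn):
        _BASIS_REGISTRY[name] = fn
        return fn
    return decorator

@register_basis("legendre")
def shifted_legendre(k, x):
    # k-th shifted Legendre polynomial on [0, 1].
    return np.polynomial.legendre.Legendre.basis(k, domain=[0.0, 1.0])(x)

def make_basis(name):
    """Factory: look up a registered basis family by name."""
    try:
        return _BASIS_REGISTRY[name]
    except KeyError:
        raise ValueError(f"Unknown basis {name!r}; "
                         f"available: {sorted(_BASIS_REGISTRY)}")
```

New basis families then plug in with a one-line decorator, and callers select them by string name (e.g. from a config file) without touching the optimizer code.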
**Infrastructure:**
- Better error handling and logging
- Improved test coverage
**Initial Release:**
- Core HDMR + BFGS implementation
- Streamlit web interface
- Command-line interface
- Basic benchmark functions
- Added `--x0` parameter for custom starting points
Developed by APP2SCALE Team