Releases: ShimantoRahman/empulse
0.10.4
- Changed to a Cython implementation for the loss functions and impurity measures of CSLogitClassifier, CSBoostClassifier, CSTreeClassifier, and CSForestClassifier. This improves the training time and memory efficiency of these models significantly. Training time speedups observed were up to 300x for CSTreeClassifier and CSForestClassifier, 30x for CSLogitClassifier, and 1.5x for CSBoostClassifier, depending on the dataset size and parameters.
- Changed the arguments of CSTreeClassifier, CSForestClassifier, and CSBaggingClassifier to be in line with scikit-learn's decision tree and ensemble models.
- CSForestClassifier and CSBaggingClassifier no longer support the stacking combination method. Use StackingClassifier instead for stacking.
- Extracted the construction of the cost matrix out of Metric into a separate CostMatrix class, to allow reusing the cost matrix in custom metrics.
- ProfLogitClassifier no longer uses the EMPC metric by default. Users now need to explicitly pass a loss to the model.
- CSLogitClassifier no longer accepts any callable as loss function. Users now need to pass a Metric instance for a custom loss function.
- savings_score and expected_savings_score now accept two more baseline options, 'one' and 'zero', to always predict the positive and negative class, respectively (see the sketch after this list).
- Metrics with the Savings strategy now also accept baseline options, like savings_score and expected_savings_score do.
- Models which use a Metric instance with the Cost or Savings strategy as their loss function are now pickleable. The MaxProfit strategy will be made pickleable in a future release.
- Models which use a Metric instance as their loss function can now request arguments needed by the metric to be passed during the fit method through Metadata Routing.
- Fix CSLogitClassifier not properly calculating the gradient penalty.
- Fix default values not being applied properly when using aliases in CostMatrix.
- Fix Metric throwing errors when certain terms cancelled out.
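A minimal sketch of the new baseline options (the cost keyword arguments fp_cost and fn_cost are assumed names used for illustration only, not confirmed by these notes):

```python
# Minimal sketch of the new 'one' / 'zero' baseline options of savings_score.
# The cost arguments (fp_cost, fn_cost) are assumed names for illustration only.
import numpy as np
from empulse.metrics import savings_score

y_true = np.array([0, 1, 1, 0, 1])
y_pred = np.array([0, 1, 0, 0, 1])

# Savings relative to a baseline that always predicts the negative class ...
savings_vs_all_negative = savings_score(y_true, y_pred, baseline='zero', fp_cost=1, fn_cost=5)
# ... or relative to a baseline that always predicts the positive class.
savings_vs_all_positive = savings_score(y_true, y_pred, baseline='one', fp_cost=1, fn_cost=5)
```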
0.10.3
0.9.0
- Added optimal_threshold and optimal_rate methods to calculate the optimal threshold(s) and optimal predicted positive rate for a given metric. This is useful for determining the best decision threshold and predicted positive rate for a cost-sensitive or value-driven model.
- CSTreeClassifier, CSForestClassifier, and CSBaggingClassifier can now take a Metric instance as their criterion to optimize.
- CSThresholdClassifier can now take a Metric instance to choose the optimal decision threshold.
- RobustCSClassifier can now take estimators with a Metric instance as the loss function or criterion. RobustCSClassifier will treat any cost marked as outlier sensitive. This can be done by using the mark_outlier_sensitive method.
- Allow savings metrics to be used in CSBoostClassifier and CSLogitClassifier as the objective function. Internally, the expected cost loss is used to train the model, since the expected savings score is just a transformation of the expected cost loss.
- The kind argument of Metric has been replaced by strategy: the Metric class now takes a MetricStrategy instance. This change allows for more flexibility in defining the metric strategy (a sketch follows after this list). The currently available strategies are the MaxProfit, Cost, and Savings strategies.
- Fix error when importing Empulse without any optional dependencies installed.
- Fix CSLogitClassifier not properly using the gradient when using a custom loss function from Metric.
- Fix models throwing errors when differently shaped costs are passed to the fit or predict method.
- Fix sympy distribution parameters not being properly translated to scipy distribution parameters when using the MaxProfit strategy (formerly kind='max profit') with the quasi Monte Carlo integration method.
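A minimal sketch of the kind-to-strategy change (the Cost strategy class name, the builder methods add_fp_cost and add_fn_cost, and the final build step are assumptions for illustration; only the strategy argument and the MetricStrategy instance are stated above):

```python
# Minimal sketch of passing a MetricStrategy instance instead of kind='...'.
# The class name Cost and the builder methods are assumptions for illustration.
import sympy
from empulse.metrics import Metric, Cost

a, b = sympy.symbols('a b')  # cost-benefit elements of the problem

cost_metric = (
    Metric(strategy=Cost())   # previously: Metric(kind='cost')
    .add_fp_cost(a)           # cost of a false positive
    .add_fn_cost(b)           # cost of a false negative
    .build()
)
```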
0.8.0
- CSBoostClassifier, CSLogitClassifier, and ProfLogitClassifier can now take a Metric instance as their loss function. Internally, the metric instance is converted to the appropriate loss function for the model. For more information, read the User Guide.
- Type hints are now available for all functions and classes.
- Add support for more than one stochastic variable when building maximum profit metrics with Metric.
- Allow Metric to be used as a context manager. This ensures the metric is always built after defining the cost-benefit elements (see the sketch after this list).
- Fix datasets not properly being packaged together with the package.
- Fix RobustCSClassifier when array-like parameters are passed to the fit method.
- Fix boosting models being biased towards the positive class.
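A minimal sketch of the context-manager usage (the kind='max profit' argument is taken from these notes; the builder methods add_tp_benefit and add_fp_cost are assumptions for illustration):

```python
# Minimal sketch: exiting the with-block ensures the metric is built.
# Builder method names (add_tp_benefit, add_fp_cost) are assumptions.
import sympy
from empulse.metrics import Metric

clv, f = sympy.symbols('clv f')  # e.g. customer lifetime value and contact cost

with Metric(kind='max profit') as metric:
    metric.add_tp_benefit(clv - f)  # benefit of correctly targeting a customer
    metric.add_fp_cost(f)           # cost of contacting a non-responder
# the metric is built automatically when the with-block exits
```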
Full Changelog: 0.7.0...0.8.0
0.7.0
- Add CSTreeClassifier, CSForestClassifier, and CSBaggingClassifier to support cost-sensitive decision tree and ensemble models.
- Add support for scikit-learn 1.5.2 (previously Empulse only supported scikit-learn 1.6.0 and above).
- Removed the emp_score and emp functions from the metrics module. Use the Metric class instead to define custom expected maximum profit measures. For more information, read the User Guide.
- Removed numba as a dependency of Empulse. This reduces the installation time and the size of the package.
- Fix Metric when defining a stochastic variable with fixed values.
- Fix Metric when a stochastic variable has infinite bounds.
- Fix CSThresholdClassifier when the costs of predicting the positive and negative classes are equal.
- Fix documentation linking issues to sklearn.
Full Changelog: 0.6.0...0.7.0
0.6.0
- Add Metric to easily build your own value-driven and cost-sensitive metrics.
- Add support for LightGBM and Catboost models in CSBoostClassifier and B2BoostClassifier.
- make_objective_churn and make_objective_acquisition now take a model argument to calculate the objective for either XGBoost, LightGBM, or Catboost models (see the sketch after this list).
- XGBoost is now an optional dependency together with LightGBM and Catboost. To install the package with XGBoost, LightGBM, and Catboost support, use the following command: pip install empulse[optional]
- Renamed y_pred_baseline and y_proba_baseline to baseline in savings_score and expected_savings_score. It now accepts the following arguments:
  - If 'zero_one', the baseline model is a naive model that predicts all zeros or all ones depending on which is better.
  - If 'prior', the baseline model is a model that predicts the prior probability of the majority or minority class depending on which is better (not available for savings score).
  - If array-like, target probabilities of the baseline model.
- Update the descriptions attached to each dataset to match the information found in the user guide.
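A minimal sketch of the new model argument (the 'lightgbm' string form and the import paths are assumptions; the notes only state that the argument selects XGBoost, LightGBM, or Catboost):

```python
# Minimal sketch: building a churn objective tailored to LightGBM.
# The string value 'lightgbm' is an assumed form of the model argument.
from empulse.metrics import make_objective_churn
from lightgbm import LGBMClassifier

objective = make_objective_churn(model='lightgbm')
clf = LGBMClassifier(objective=objective)  # LightGBM accepts a custom objective callable
```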
Full Changelog: 0.5.2...0.6.0
0.5.2
- Allow savings_score and expected_savings_score to calculate the savings score over the baseline model instead of a naive model, by setting the y_pred_baseline and y_proba_baseline parameters, respectively.
- Reworked the user guide documentation to better explain the usage of value-driven and cost-sensitive models, samplers, and metrics.
- CSLogitClassifier and ProfLogitClassifier by default do not perform soft-thresholding on the regression coefficients. This can be enabled by setting the soft_threshold parameter to True (see the sketch after this list).
- Prevent division-by-zero errors in expected_cost_loss.
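A minimal sketch of re-enabling soft-thresholding (the empulse.models import path is an assumption):

```python
# Minimal sketch: soft-thresholding is off by default and can be turned back on.
from empulse.models import ProfLogitClassifier  # import path assumed

clf = ProfLogitClassifier(soft_threshold=True)
```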
Full Changelog: 0.5.1...0.5.2
0.5.1
- Fix documentation build issue
Full Changelog: 0.5.0...0.5.1
0.5.0
- Added support for Python 3.13
- Added cost-sensitive models
  - CSLogitClassifier
  - CSBoostClassifier
  - RobustCSClassifier
  - CSThresholdClassifier
- Added cost-sensitive metrics
  - cost_loss
  - expected_cost_loss
  - expected_log_cost_loss
  - savings_score
  - expected_savings_score
- Added cost-sensitive sampler
  - CostSensitiveSampler
- Added datasets module
- Renamed metric arguments which expect target scores from y_pred to y_score, and those which expect target probabilities from y_pred to y_proba.
- Allow all cost-sensitive models and samplers to accept cost parameters during initialization (see the sketch after this list).
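A minimal sketch of passing costs at initialization (the parameter names fp_cost and fn_cost and the empulse.models import path are assumptions; the notes only state that cost parameters can now be given at initialization):

```python
# Minimal sketch: costs given at init instead of at fit/predict time.
# Parameter names (fp_cost, fn_cost) are assumed for illustration.
from empulse.models import CSLogitClassifier

clf = CSLogitClassifier(fp_cost=1.0, fn_cost=5.0)
# clf.fit(X, y) no longer needs the costs passed explicitly
```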
Full Changelog: 0.4.6...0.5.0
0.4.6
Full Changelog: 0.4.0...0.4.6