Releases · microsoft/FLAML
v0.6.7
What's Changed
- remove big objects after fit by @sonichi in #176
- remove catboost training dir by @sonichi in #178
- Forecast v2 by @int-chaos in #182
- Fix decide_split_type bug. by @gianpdomiziani in #184
- Cleanml by @qingyun-wu in #185
- warmstart blendsearch by @sonichi in #186
- variable name by @sonichi in #187
- notebook example by @sonichi in #189
- make flaml work without catboost by @sonichi in #197
- package name in setup by @sonichi in #198
- clean up forecast notebook by @sonichi in #202
- consider num_samples in bs thread priority by @sonichi in #207
- accommodate nni usage pattern by @sonichi in #209
- random search by @sonichi in #213
- add consistency test by @qingyun-wu in #216
- set converge flag when no trial can be sampled by @sonichi in #217
- seed for hpo method by @sonichi in #224
- update config if n_estimators is modified by @sonichi in #225
- warning -> info for low cost partial config by @sonichi in #231
- Consistent California by @cdeil in #245
- Package by @sonichi in #244
Full Changelog: v0.6.0...v0.6.7
v0.6.6
What's Changed
- remove big objects after fit by @sonichi in #176
- remove catboost training dir by @sonichi in #178
- Forecast v2 by @int-chaos in #182
- Fix decide_split_type bug. by @gianpdomiziani in #184
- Cleanml by @qingyun-wu in #185
- warmstart blendsearch by @sonichi in #186
- variable name by @sonichi in #187
- notebook example by @sonichi in #189
- make flaml work without catboost by @sonichi in #197
- package name in setup by @sonichi in #198
- clean up forecast notebook by @sonichi in #202
- consider num_samples in bs thread priority by @sonichi in #207
- accommodate nni usage pattern by @sonichi in #209
- random search by @sonichi in #213
- add consistency test by @qingyun-wu in #216
- set converge flag when no trial can be sampled by @sonichi in #217
- seed for hpo method by @sonichi in #224
- update config if n_estimators is modified by @sonichi in #225
- warning -> info for low cost partial config by @sonichi in #231
Full Changelog: v0.6.0...v0.6.6
v0.6.0
In this release, we added support for the time series forecasting task and NLP model fine-tuning. We have also made a large number of feature and performance improvements.
- data split by 'time' for time-ordered data, and by 'group' for grouped data.
- support parallel trials and random search in the `AutoML.fit()` API.
- support warm-start in `AutoML.fit()` by using previously found start points.
- support constraints on training/prediction time per model.
- new optimization metrics: ROC AUC for multi-class classification, and MAPE for time series forecasting.
- utility functions for getting normalized confusion matrices and multi-class ROC or precision-recall curves.
- automatically retrain models after search by default; options to disable retraining or enforce time limit.
- CFO supports hierarchical search space and uses points_to_evaluate more effectively.
- a variation of CFO optimized for unordered categorical hyperparameters.
- BlendSearch improved for better performance in parallel setting.
- memory overhead optimization.
- search space improvements for random forest and lightgbm.
- make stacking ensemble work for categorical features.
- python 3.9 support.
- experimental support for automated fine-tuning of transformer models from huggingface.
- experimental support for time series forecasting.
- warnings to suggest increasing the time budget, and a warning when there has been no performance improvement for a long time.
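To illustrate the warm-start feature above — trying previously found start points before falling back to random sampling — here is a minimal, self-contained sketch. The names `objective`, `search`, and `points_to_evaluate` mirror the idea only; this is not FLAML's implementation, and real usage goes through `AutoML.fit()`.

```python
import random

def objective(config):
    # Toy objective: minimize (x - 7)^2; the optimum is at x = 7.
    return (config["x"] - 7) ** 2

def search(num_trials, points_to_evaluate=()):
    """Evaluate warm-start points first, then pad with random search."""
    random.seed(42)
    trials = list(points_to_evaluate)
    while len(trials) < num_trials:
        trials.append({"x": random.uniform(0, 100)})  # random fallback
    best_config, best_loss = None, float("inf")
    for config in trials:
        loss = objective(config)
        if loss < best_loss:
            best_config, best_loss = config, loss
    return best_config, best_loss

# Cold start vs. warm start seeded with a previously found good point.
cold = search(20)
warm = search(20, points_to_evaluate=[{"x": 7.3}])
```

Because the warm-started run begins from a known-good region, its best loss can never be worse than the loss of the supplied start point.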
Minor updates
- make log file name optional.
- notebook for time series forecasting.
- notebook for using AutoML in sklearn pipeline.
- bug fix when training_function returns a value.
- support fixed random seeds to improve reproducibility.
- code coverage improvement.
- exclusive upper bounds for hyperparameter types `randint` and `lograndint`.
- experimental features in BlendSearch.
- documentation improvement.
- bug fixes for multiple logged metrics in cv.
- adjust epsilon when time per trial is very fast.
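The exclusive-upper-bound change for `randint` and `lograndint` above can be sketched in plain Python. The function names here are illustrative, not `flaml.tune`'s API; the point is only the semantics: the upper bound itself is never sampled.

```python
import math
import random

def sample_randint(lower, upper):
    # random.randrange already treats `upper` as exclusive.
    return random.randrange(lower, upper)

def sample_lograndint(lower, upper):
    # Sample uniformly in log space, then floor; clamp so the
    # upper bound stays exclusive after rounding.
    value = int(math.exp(random.uniform(math.log(lower), math.log(upper))))
    return min(value, upper - 1)

random.seed(0)
ints = [sample_randint(1, 10) for _ in range(1000)]
logints = [sample_lograndint(1, 10) for _ in range(1000)]
# 10 itself never appears in either list
```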
Contributors
- @sonichi
- @qingyun-wu
- @int-chaos
- @liususan091219
- @Yard1
- @bnriiitb
- @su2umaru
- @eduardobull
- @sek788432
- @ekzhu
- @anshumandutt
- @yue-msr
- @sadtaf
- @fzanartu
- @dsbyprateekg
- @hanhanwu
- @PardeepRassani
- @gianpdomiziani
- @stepthom
- @anhnht3
- @zzheng93
- @flippercy
- @luizhemelo
- @nabalamu
- @lostmygithubaccount
- @suryajayaraman
v0.5.0
Major update:
- Online AutoML. For example, we support tuning the online machine learning library Vowpal Wabbit.
Minor updates:
- log the best model in MLflow
- utility functions to produce a normalized confusion matrix and ROC or precision-recall curves for each class in multi-class tasks
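The online AutoML idea above — selecting among candidate configurations while learning from a data stream — can be sketched with a simple champion/challenger loop: every candidate sees the same stream, and the one with the lowest cumulative loss is the running champion. This is a conceptual illustration only; FLAML's actual online tuner (built around Vowpal Wabbit) is considerably more sophisticated.

```python
import random

class OnlineLinearModel:
    """1-D linear model trained by online gradient descent."""
    def __init__(self, lr):
        self.lr = lr
        self.w = 0.0
        self.cumulative_loss = 0.0

    def learn_one(self, x, y):
        pred = self.w * x
        err = pred - y
        self.cumulative_loss += err * err  # squared loss before the update
        self.w -= self.lr * err * x        # SGD step

random.seed(0)
candidates = [OnlineLinearModel(lr) for lr in (0.001, 0.05, 1.5)]

# Stream of noisy observations from y = 3x.
for _ in range(500):
    x = random.uniform(-1, 1)
    y = 3 * x + random.gauss(0, 0.1)
    for model in candidates:
        model.learn_one(x, y)

champion = min(candidates, key=lambda m: m.cumulative_loss)
print(f"best lr={champion.lr}, learned w={champion.w:.2f}")
```

A too-small learning rate accumulates loss while it slowly converges, and a too-large one accumulates loss from jitter around the optimum, so the champion ends up being a well-tuned middle setting — which is exactly the decision an online tuner automates.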