Releases · Nixtla/neuralforecast
v3.1.2
Features
- [FEAT] Any horizon explanation @marcopeix (#1393)
Bug fixes
- [FIX] Remove torch fix @elephaint (#1392)
v3.1.1
Hotfix
- [FIX] Backwards compatibility with saved models breaks if "explain" attribute isn't present @elephaint (#1389)
v3.1.0
New features
- [FEAT] (Shapley) explanations for univariate forecast models @elephaint @marcopeix (#1377)
- [FEAT] Any horizon prediction @elephaint @JQGoh (#1368)
- [FEAT]: Training window filtering @marcopeix (#1344)
- [FEAT] xLSTM @elephaint (#1363)
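The xLSTM entry adds a new model class. Below is a minimal sketch of plugging it into the usual `NeuralForecast` workflow; the import path and constructor arguments (`h`, `input_size`, `max_steps`) are assumptions mirroring the library's other models, not taken from the PR.

```python
# Hypothetical sketch: fitting the new xLSTM model through the standard
# NeuralForecast workflow. Import path and constructor arguments are assumed
# to mirror the library's other models (e.g. NHITS, LSTM).
from neuralforecast import NeuralForecast
from neuralforecast.models import xLSTM
from neuralforecast.utils import AirPassengersDF

nf = NeuralForecast(
    models=[xLSTM(h=12, input_size=24, max_steps=100)],
    freq='M',  # monthly sample data
)
nf.fit(df=AirPassengersDF)
forecasts = nf.predict()
```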
Bug fixes
- [FIX]: torch version @marcopeix (#1380)
- [FIX] Fixes for using uv @elephaint (#1364)
- [FIX] Bugs ISQF @elephaint (#1358)
- Update BiTCN doc dropout to match code @LemuelKL (#1376)
- Adds support for static variables in predict_insample method @nasaul (#1349)
- rm unused `y_insample` arg from pytorch losses @deven367 (#1360)
General
- Dev environment installation using uv @JQGoh (#1362)
- Enhancements @deven367 (#1371)
- Codespace setup for Neuralforecast @JQGoh (#1382)
- migrate tests from nbs to pytest @deven367 (#1353)
v3.0.2
Enhancements
- Distributional predictions in `predict_insample()` @janovergoor (#1309) (usage sketched below)
- Optimizations in tsdataset: reduce allocations for large datasets @tylernisonoff (#1335)
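A minimal sketch of the `predict_insample()` workflow referenced above; the call pattern follows the library's documented usage, while the exact output columns (including any distributional outputs) depend on the model and loss chosen.

```python
# Sketch of retrieving in-sample predictions (see the predict_insample()
# enhancement above). The call pattern follows the library's documented
# usage; output columns depend on the model and loss.
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.utils import AirPassengersDF

nf = NeuralForecast(models=[NHITS(h=12, input_size=24, max_steps=100)], freq='M')
nf.fit(df=AirPassengersDF)

# step_size controls the spacing between consecutive in-sample windows.
insample_df = nf.predict_insample(step_size=12)
```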
Fixes
- [FIX]: Add logic to load custom models when using ReduceLROnPlateau @marcopeix (#1340)
- [FIX]: Fixes incorrect cuts in conformal prediction with conformal_error @elephaint (#1331)
v3.0.1
Features
- FEAT: Select basis functions in NBEATS @tblume1992 @marcopeix (#1191)
- FEAT: Add flash-attention @LeonEthan (#1295)
- FEAT: HuberIQLoss @elephaint (#1307)
Bug Fixes
- FIX: Fix iPython version @elephaint (#1282)
- FIX: Recurrent predictions @elephaint (#1285)
- FIX: Fix poor performance with the NegativeBinomial DistributionLoss @JQGoh (#1289)
- FIX: Add exclude_insample_y param to TimeXer for model loading @marcopeix (#1306)
- FIX: Set 2.0.0<=pytorch<=2.6.0 to avoid conflicts with networkx with Python 3.9 @marcopeix (#1318)
- FIX: Create windows once @elephaint (#1325)
- FIX: Add h_train to RNNs & fix issue with input_size @elephaint (#1326)
- FIX: Allow static vars only with NBEATSx and exogenous block @marcopeix (#1319)
v3.0.0
New features
- FEAT: TimeXer @marcopeix (#1267)
- All losses compatible with all types of models (e.g. univariate/multivariate, direct/recurrent) OR appropriate protection added.
- `DistributionLoss` now supports the use of `quantiles` in `predict`, allowing for easy quantile retrieval for all `DistributionLoss`es.
- Mixture losses (GMM, PMM and NBMM) now support learned weights for weighted mixture distribution outputs.
- Mixture losses now support the use of `quantiles` in `predict`, allowing for easy quantile retrieval.
- Improved stability of `ISQF` by adding softplus protection around some parameters instead of using `.abs`.
- Unified API for any quantile or any confidence level during predict for both point and distribution losses.
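A minimal sketch of the unified predict-time API described above, assuming `predict` accepts `level` and `quantiles` keyword arguments for a model trained with `DistributionLoss`.

```python
# Sketch of the unified quantile/level API: train once with a distribution
# loss, then request any confidence level or quantile at predict time.
# Assumes predict() accepts the `level` / `quantiles` keyword arguments.
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.losses.pytorch import DistributionLoss
from neuralforecast.utils import AirPassengersDF

nf = NeuralForecast(
    models=[NHITS(h=12, input_size=24, max_steps=100,
                  loss=DistributionLoss(distribution='Normal'))],
    freq='M',
)
nf.fit(df=AirPassengersDF)

fcst_levels = nf.predict(level=[80, 95])           # prediction intervals
fcst_quantiles = nf.predict(quantiles=[0.1, 0.9])  # arbitrary quantiles
```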
Enhancements
- [DOCS] Docstrings @elephaint (#1279)
- FIX: Minor bug fix in TFT and a nicer error message for fitting with the wrong val_size @marcopeix (#1275)
- [FIX] Adds bfloat16 support @elephaint (#1265)
- Recurrent models can now produce forecasts recursively or directly.
- IQLoss now gives monotonic quantiles
- MASE loss now works
Breaking Changes
- [FIX] Unify API @elephaint (#1023)
- RMoK uses the `revin_affine` parameter instead of `revine_affine`. This was a typo in the previous version.
- All models now inherit the `BaseModel` class. This changes how we implement new models in neuralforecast.
- Recurrent models now require an `input_size` parameter (see the sketch after this list).
- `TCN` and `DRNN` are now window models, not recurrent models.
- We cannot load a recurrent model from a previous version to v3.0.0.
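A minimal sketch of the recurrent-model breaking change referenced in the list above: `input_size` must now be passed explicitly. The remaining arguments are illustrative, not requirements from the release notes.

```python
# Sketch of the v3.0.0 breaking change for recurrent models: input_size is
# now a required, explicit parameter, as for window-based models.
from neuralforecast import NeuralForecast
from neuralforecast.models import LSTM
from neuralforecast.utils import AirPassengersDF

model = LSTM(
    h=12,
    input_size=24,   # explicit context length, required from v3.0.0
    max_steps=100,
)
nf = NeuralForecast(models=[model], freq='M')
nf.fit(df=AirPassengersDF)
forecasts = nf.predict()
```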
Bug Fixes
- [FIX] Multivariate models give error when predicting when n_series > batch_size @elephaint (#1276)
- [FIX]: Insample predictions with series of varying lengths @marcopeix (#1246)
Documentation
- [DOCS] Update documentation @elephaint (#1274)
- [DOCS] Add example of modifying the default configure_optimizers() behavior (use of ReduceLROnPlateau scheduler) @JQGoh (#1015)
v2.0.1
Enhancements
- FEAT: Custom RNN layers for TFT @Yanam24 (#1230)
- FEAT: Add horizon weighting to the distribution losses @mwamsojo (#1233)
Documentation
- DOCS: Add citation note @elephaint (#1244)
- fix: azul @AzulGarza (#1245)
v2.0.0
v1.7.7
v1.7.6
New Features
- [FEAT]: Support providing DataLoader arguments to optimize GPU usage @jasminerienecker (#1186)
- [FEAT]: Set activation function in GRN of TFT @marcopeix (#1175)
- [FEAT]: Conformal Predictions in NeuralForecast @JQGoh (#1171)
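A hedged sketch of the conformal prediction workflow introduced here, assuming the `PredictionIntervals` helper in `neuralforecast.utils` and the `prediction_intervals` argument on `fit`, as in the library's conformal prediction tutorial.

```python
# Sketch of conformal prediction intervals. Assumes the PredictionIntervals
# helper and the `prediction_intervals` argument on fit(); see the library's
# conformal prediction tutorial for the canonical workflow.
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.utils import AirPassengersDF, PredictionIntervals

nf = NeuralForecast(models=[NHITS(h=12, input_size=24, max_steps=100)], freq='M')
nf.fit(df=AirPassengersDF, prediction_intervals=PredictionIntervals())

# Conformalized intervals at the requested confidence levels.
forecasts = nf.predict(level=[80, 90])
```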
Bug Fixes
- [FIX]: Ability to load models saved using versions before 1.7 @tylernisonoff (#1207)
- [FIX]: Conformal prediction issues @elephaint (#1179)
- [FIX]: Feature importance when using only hist_exog in TFT fails @elephaint (#1174)
- [FIX]: Remove unused output layer NBEATSx @elephaint (#1168)
- [FIX]: Fix Tweedie loss @elephaint (#1164)
- [FIX]: MLPMultivariate incorrect static_exog parsing @elephaint (#1170)
- [FIX]: Deprecate activation functions for GRU @marcopeix (#1198)
Documentation
- [DOC]: Tutorial on cross-validation @marcopeix (#1176) (workflow sketched after this list)
- [DOC]: Build docs on release only @elephaint (#1183)
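A minimal sketch of the cross-validation workflow covered by the tutorial above (#1176), using the library's `cross_validation` API.

```python
# Sketch of rolling-window cross-validation (tutorial #1176): evaluate the
# model over several hold-out windows at the end of each series.
from neuralforecast import NeuralForecast
from neuralforecast.models import NHITS
from neuralforecast.utils import AirPassengersDF

nf = NeuralForecast(models=[NHITS(h=12, input_size=24, max_steps=100)], freq='M')

# Three 12-step evaluation windows; the returned frame includes a `cutoff`
# column marking the end of the training data for each fold.
cv_df = nf.cross_validation(df=AirPassengersDF, n_windows=3, step_size=12)
```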