* feat: wrapping around pl.to_onnx to export models to ONNX, still require testing
* feat: cleaned implementation of the to_onnx method
* fix: generation of input name, shape of input_batch for PastCov torch module
* feat: adding example of onnx usage in userguide
* update changelog
* fix: revert some changes
* fix: export to onnx for RNNModel
* feat: added a comment about RNNModel for onnx inference
* update changelog
* fix: address review comments
* update changelog
* update torch user guide
* update to_onnx
---------
Co-authored-by: dennisbader <[email protected]>
CHANGELOG.md
```diff
@@ -11,6 +11,7 @@ but cannot always guarantee backwards compatibility. Changes that may **break co

 **Improved**

+- Added ONNX support for torch-based models with method `TorchForecastingModel.to_onnx()`. Check out [this example](https://unit8co.github.io/darts/userguide/gpu_and_tpu_usage.html#exporting-model-to-onnx-format-for-inference) from the user guide on how to export and load a model for inference. [#2620](https://github.com/unit8co/darts/pull/2620) by [Antoine Madrona](https://github.com/madtoinou)
 - Made method `ForecastingModel.untrained_model()` public. Use this method to get a new (untrained) model instance created with the same parameters. [#2684](https://github.com/unit8co/darts/pull/2684) by [Timon Erhart](https://github.com/turbotimon)
```
It is also possible to export the model weights to the ONNX format to run inference in a lightweight environment. The example below works for any `TorchForecastingModel` except `RNNModel`, and supports optional use of past, future and/or static covariates. Note that all series and covariates must extend far enough into the past (`input_chunk_length`) and future (`output_chunk_length`) relative to the end of the target `series`. It is not possible to forecast a horizon `n > output_chunk_length` without implementing the auto-regression logic.
```python
model = SomeTorchForecastingModel(...)
model.fit(...)

# make sure to have `onnx` and `onnxruntime` installed
onnx_filename = "example_onnx.onnx"
model.to_onnx(onnx_filename, export_params=True)
```
Now, to load the model and predict steps after the end of the series:
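The following is a minimal sketch of that step using `onnxruntime` directly, assuming `model`, `series` and `onnx_filename` are the fitted model, target series and file name from the export example above. The exact input names and array shapes depend on the exported model (and on whether covariates were used), so the single-input, target-only layout shown here is an assumption; inspect `session.get_inputs()` to see what your exported graph actually expects.

```python
import numpy as np
import onnxruntime as ort

# load the exported graph (file name from the export step above)
session = ort.InferenceSession(onnx_filename)

# inspect the expected inputs; names and shapes depend on the exported model
for inp in session.get_inputs():
    print(inp.name, inp.shape)

# illustrative input preparation (assumption: target-only model, no covariates):
# feed the last `input_chunk_length` values of the target series as float32,
# shaped (batch, input_chunk_length, n_components)
past_target = (
    series.values()[-model.input_chunk_length :]
    .reshape(1, model.input_chunk_length, series.n_components)
    .astype(np.float32)
)

# run inference; the first output holds the forecast for the next
# `output_chunk_length` steps after the end of the series
onnx_inputs = {session.get_inputs()[0].name: past_target}
prediction = session.run(None, onnx_inputs)[0]
print(prediction.shape)
```

This yields raw network outputs for a single `output_chunk_length` block; forecasting further ahead would require feeding the predictions back in (the auto-regression logic mentioned above), and any scaling applied before training must be inverted manually.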