This section contains notebooks that demonstrate model deployment and optimization techniques.
| Notebook | Description |
|---|---|
| openvino_quantization.ipynb | Model compression using NNCF for OpenVINO deployment |
If you have not installed all required dependencies, follow the Installation Guide.
This notebook demonstrates how NNCF can be used to compress a model trained with Anomalib. The notebook is divided into the following sections:
- Train an Anomalib model without compression
- Train a model with NNCF compression
- Compare the performance of the two models (FP32 vs INT8)
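To make the FP32 vs INT8 comparison concrete, the sketch below illustrates the affine INT8 quantization scheme that compression tools such as NNCF apply to FP32 weights. This is a minimal conceptual example in plain Python, not NNCF's actual implementation; the symmetric per-tensor scale computation is an assumption chosen for simplicity.

```python
# Conceptual sketch of INT8 quantization: map FP32 values to 8-bit integers
# with q = round(v / scale) + zero_point, then recover approximate FP32 values.
# Not NNCF's actual code; symmetric per-tensor quantization is assumed.

def quantize_int8(values, scale, zero_point=0):
    """Quantize FP32 values to the INT8 range [-128, 127]."""
    q = []
    for v in values:
        q_val = round(v / scale) + zero_point
        q.append(max(-128, min(127, q_val)))  # clamp to the INT8 range
    return q

def dequantize_int8(q_values, scale, zero_point=0):
    """Recover approximate FP32 values: v ~ (q - zero_point) * scale."""
    return [(q - zero_point) * scale for q in q_values]

weights = [0.5, -1.2, 0.03, 2.7]
scale = max(abs(w) for w in weights) / 127  # symmetric per-tensor scale
q = quantize_int8(weights, scale)
restored = dequantize_int8(q, scale)
max_error = max(abs(w - r) for w, r in zip(weights, restored))
print(q)          # INT8 representation of the weights
print(max_error)  # reconstruction error is bounded by roughly scale / 2
```

The takeaway mirrors the notebook's comparison: INT8 storage uses a quarter of the memory of FP32, at the cost of a small, bounded rounding error per weight.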