The DoReFa-Net paper presents three quantization steps (gradient, weight, and activation/output). It seems that only
Replies: 1 comment
Exactly, weight, output, and gradient are all quantized in the paper. Both weight and output quantization are currently supported in NNI's DoReFa-Net. We think the priority of implementing gradient quantization is low, since DoReFa-Net is not very suitable for deployment compared with other training-aware quantization algorithms such as QAT.
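For reference, the weight and activation (output) quantization from the DoReFa-Net paper can be sketched in NumPy as below. This is a standalone illustration of the forward-pass math, not NNI's implementation; the function names are my own, and the straight-through gradient estimator the paper uses for training is omitted.

```python
import numpy as np

def quantize_k(x, k):
    # Uniformly quantize x in [0, 1] to k bits: 2^k evenly spaced levels.
    n = float(2 ** k - 1)
    return np.round(x * n) / n

def quantize_weights(w, k):
    # DoReFa-Net k-bit weight quantization:
    # squash with tanh, affinely map into [0, 1], quantize, rescale to [-1, 1].
    t = np.tanh(w)
    x = t / (2 * np.max(np.abs(t))) + 0.5
    return 2 * quantize_k(x, k) - 1

def quantize_activations(a, k):
    # DoReFa-Net k-bit activation (output) quantization:
    # clip to [0, 1], then uniformly quantize.
    return quantize_k(np.clip(a, 0.0, 1.0), k)

w = np.array([-1.5, -0.2, 0.0, 0.3, 2.0])
wq = quantize_weights(w, k=2)           # values drawn from {-1, -1/3, 1/3, 1}
aq = quantize_activations(np.array([-0.3, 0.4, 1.7]), k=2)
print(wq, aq)
```

Gradient quantization in the paper adds noise and quantizes the backward signal per-sample, which is why it mainly helps training cost rather than deployment.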