Ask a Question
Hello all, when I convert a TensorFlow QAT model into ONNX with tf2onnx, there is an error:
raise ValueError("make_sure failure: " + error_msg % args)
ValueError: make_sure failure: Unable to convert node FakeQuantWithMinMaxArgs with num_bits=10
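For reference, a minimal sketch of the node I am trying to convert (the shape, min/max values, and opset below are placeholders, not my real QAT model):

```python
import tensorflow as tf
import tf2onnx

# Minimal stand-in for one FakeQuant node from the QAT graph;
# the min/max values here are just placeholders.
@tf.function
def fake_quant_10bit(x):
    return tf.quantization.fake_quant_with_min_max_args(
        x, min=-6.0, max=6.0, num_bits=10)  # num_bits=10 triggers the failure

spec = [tf.TensorSpec([None, 16], tf.float32, name="x")]

# Conversion step where the ValueError above is raised
tf2onnx.convert.from_function(
    fake_quant_10bit,
    input_signature=spec,
    opset=13,
    output_path="fake_quant_10bit.onnx",
)
```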
So I wonder: is this bit width not supported yet, or did I make a mistake somewhere?
Thanks!