There is an inconsistency between the outputs produced by MNN and ONNXRuntime for the Deconvolution (ConvTranspose) op when the output shape is specified manually.
ONNX model (single ConvTranspose layer)
MNN model (converted from the ONNX model with the 3.2.5 converter)
Comparison script:
```python
import numpy as np
import MNN                # version 3.2.5
import onnxruntime as rt  # version 1.23.2

if __name__ == '__main__':
    input_data = np.random.rand(2, 3, 8, 8).astype(np.float32)

    # Run the original ONNX model through ONNXRuntime.
    ort_model = rt.InferenceSession('deconv.onnx', providers=rt.get_available_providers())
    ort_output = ort_model.run(['output'], {'input': input_data})[0]

    # Run the converted MNN model on the same input.
    mnn_model = MNN.nn.load_module_from_file('deconv.mnn', ['input'], ['output'])
    mnn_input = MNN.expr.placeholder(input_data.shape, MNN.expr.NCHW)
    mnn_input.write(input_data)
    mnn_output = mnn_model.forward([mnn_input])[0].read()

    # Prints False: the outputs disagree beyond the tolerance.
    print(np.allclose(ort_output, mnn_output, rtol=1e-4, atol=1e-4))

    np.save('input.npy', input_data)
    np.save('output_ort.npy', ort_output)
    np.save('output_mnn.npy', mnn_output)
```
After some investigation, it turned out that the current MNN output matches the output ONNXRuntime produced before these changes:
microsoft/onnxruntime@6246662
microsoft/onnxruntime@f96f222
I also checked that OpenVINO's output matches that of the current ONNXRuntime version.
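A plausible source of such a mismatch (my inference, not confirmed against the commits above) is how each runtime turns an explicit `output_shape` into padding. The ONNX ConvTranspose spec derives the total padding from `output_shape` and splits it between the begin and end of each axis depending on `auto_pad`. A standalone sketch of that spec formula, not code from either runtime:

```python
def convtranspose_pads(input_size, stride, kernel, output_shape,
                       dilation=1, output_padding=0, auto_pad='NOTSET'):
    """Per-axis (pad_begin, pad_end) implied by an explicit ConvTranspose
    output_shape, following the formula in the ONNX operator spec."""
    total = (stride * (input_size - 1) + output_padding
             + ((kernel - 1) * dilation + 1) - output_shape)
    if auto_pad == 'SAME_UPPER':
        begin = total // 2        # any extra padding goes to the end
        end = total - begin
    else:                         # NOTSET / SAME_LOWER
        end = total // 2          # any extra padding goes to the beginning
        begin = total - end
    return begin, end

# Example: input 8, stride 2, kernel 3 gives a default output of 17;
# requesting output_shape 16 implies one extra pad at the beginning.
print(convtranspose_pads(8, 2, 3, 16))  # (1, 0)
```

Two implementations that agree on the default output size can still diverge on a cropped `output_shape` if one of them splits the odd leftover padding to the wrong side, which would produce exactly the kind of element-wise mismatch reported here.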