
[Crash] Use @onnx_op to parse a custom operator for the model, and program crashed in inference #867

@Liuhehe2019

Description


This is my custom operator definition:

```python
@onnx_op(op_type='LocatePostNms',
         domain='ai.onnx.contrib',
         inputs=[PyCustomOpDef.dt_float, PyCustomOpDef.dt_int32, PyCustomOpDef.dt_float,
                 PyCustomOpDef.dt_float, PyCustomOpDef.dt_int32, PyCustomOpDef.dt_float],
         outputs=[PyCustomOpDef.dt_float, PyCustomOpDef.dt_int32],
         attrs={"classNumber": PyCustomOpDef.dt_int32,
                "iou_thresh": PyCustomOpDef.dt_float,
                "scale": PyCustomOpDef.dt_double,
                "side": PyCustomOpDef.dt_int32,
                "topk": PyCustomOpDef.dt_int32,
                "type": PyCustomOpDef.dt_int32})
def locatepostnms_compute(p0, l0, b0, p1, l1, b1, **kwargs):
    classNumber = kwargs['classNumber']
    iou_thresh = kwargs['iou_thresh']
    scale = kwargs['scale']
    side = kwargs['side']
    topk = kwargs['topk']
    type = kwargs['type']
    padded_box, result_cnt = cvt_compute(x)
    return padded_box, result_cnt
```
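One thing worth noting: a Python exception raised inside an `@onnx_op` callback (for instance, `x` is never defined in the `cvt_compute(x)` call above) can surface as a native crash rather than a readable traceback. A way to rule that out is to call the compute function directly with dummy NumPy arrays before ONNX Runtime ever invokes it. A minimal sketch, using a stub body since `cvt_compute` is not shown in the issue:

```python
import numpy as np

# Stand-in for locatepostnms_compute with the same signature; the body is
# only an assumption, since the real cvt_compute logic is not shown here.
def locatepostnms_stub(p0, l0, b0, p1, l1, b1, **kwargs):
    topk = kwargs["topk"]
    padded_box = np.zeros((topk, 4), dtype=np.float32)  # dt_float output
    result_cnt = np.array([0], dtype=np.int32)          # dt_int32 output
    return padded_box, result_cnt

# Call it directly with dummy tensors matching the declared input dtypes
# (float, int32, float, float, int32, float) and all declared attributes,
# so any Python-level error shows up as a normal traceback.
padded_box, result_cnt = locatepostnms_stub(
    np.zeros(4, np.float32), np.zeros(4, np.int32), np.zeros(4, np.float32),
    np.zeros(4, np.float32), np.zeros(4, np.int32), np.zeros(4, np.float32),
    classNumber=80, iou_thresh=0.5, scale=1.0, side=7, topk=100, type=0)
```

If this direct call already raises, the in-session crash is likely the same bug surfacing through the native callback boundary.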

I register the custom operator with:

```python
sess_options = rt.SessionOptions()
sess_options.register_custom_ops_library(onnxruntime_extensions.get_library_path())
```

and registration works, but when it comes to

```python
rt.InferenceSession().run()
```

the program crashes and reports: Process finished with exit code -1073740791 (0xC0000409)

I have no idea what is causing this issue; could you give me some suggestions?

By the way, how can I quantize this model with the custom op using the onnxruntime Python API? `onnxruntime.quantization.quantize.quantize_static` reports this error:

```
onnx.onnx_cpp2py_export.shape_inference.InferenceError: [TypeInferenceError] Cannot infer type and shape for node name 453. No opset import for domain ai.onnx.contrib optype LocatePostNms
```
