Description
Hi all,
I currently have an ONNX model file. However, it contains one operator from mmcv, as follows:
After executing ./tensorrt_rtx --onnx=srx2_fp32.onnx --saveEngine=srx2_RTX11_fp32.trt --computeCapabilities="89", it shows an error like:
Op type MMCVModulatedDeformConv2d is not supported in TensorRT-RTX
If I manually rename "MMCVModulatedDeformConv2d" to "DeformConv", which onnxruntime has supported since opset 19, it shows the following errors:
[11/05/2025-03:26:33] [V] [TRT] Parsing node: DeformConv_1801 [DeformConv]
[11/05/2025-03:26:33] [V] [TRT] Searching for input: x.47
[11/05/2025-03:26:33] [V] [TRT] Searching for input: offset
[11/05/2025-03:26:33] [V] [TRT] Searching for input: mask
[11/05/2025-03:26:33] [V] [TRT] Searching for input: model.generator.deform_align.backward_1.weight
[11/05/2025-03:26:33] [V] [TRT] Searching for input: model.generator.deform_align.backward_1.bias
[11/05/2025-03:26:33] [V] [TRT] DeformConv_1801 [DeformConv] inputs: [x.47 -> (1, 96, 120, 160)[FLOAT]], [offset -> (1, 288, 120, 160)[FLOAT]], [mask -> (1, 144, 120, 160)[FLOAT]], [model.generator.deform_align.backward_1.weight -> (48, 96, 3, 3)[FLOAT]], [model.generator.deform_align.backward_1.bias -> (48)[FLOAT]],
I built the custom operators as a .so file, registered it in the onnxruntime SessionOptions, and passed that to InferenceSession; after that I can run inference on the ONNX model.
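For context, the onnxruntime side of this works roughly as sketched below. The library path (libmmcv_ops.so) is hypothetical; the model name is taken from the command above. This is only an illustration of registering a custom-op library, not the exact setup used:

```python
import os

def make_session(model_path, custom_ops_lib):
    """Create an onnxruntime session with a custom-op shared library registered."""
    import onnxruntime as ort
    so = ort.SessionOptions()
    # register_custom_ops_library loads the .so that implements
    # MMCVModulatedDeformConv2d so the session can resolve the op.
    so.register_custom_ops_library(custom_ops_lib)
    return ort.InferenceSession(model_path, sess_options=so)

# Only attempt session creation if the model file is actually present.
if os.path.exists("srx2_fp32.onnx"):
    sess = make_session("srx2_fp32.onnx", "./libmmcv_ops.so")
```

The question is whether TensorRT-RTX offers an equivalent plugin/custom-op registration path for its engine builder.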
Does TensorRT-RTX support this operator, or is there any method to handle this?
Thanks