Parsing ONNX to TensorRT engine on Manifold 2G
Posted 2022-1-26 by AshWalker (United States)
I'm trying to generate a TensorRT engine from an ONNX model on a Manifold 2-G. An ONNX parsing library is included in recent versions of TensorRT, but the Manifold only provides TensorRT version 4, which can only parse Caffe models. Because TensorRT engines are unique to the machine on which they're generated, I can't just generate the engine on a machine with more recent library versions. So: is there a way to update TensorRT on the Manifold, and, if not, is there another way to convert an ONNX model to a TensorRT engine on the Manifold?
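For anyone checking the same thing on their own unit, here is a rough way to confirm which TensorRT version is actually installed before attempting an upgrade. This is a sketch assuming the Manifold uses JetPack-style Debian packaging with aarch64 headers; the exact paths are guesses and may differ on your device.

```shell
# List any TensorRT-related packages the system knows about
dpkg -l | grep -i tensorrt

# The version macros are also defined in the TensorRT headers;
# the include path below is typical for Jetson boards but may vary
grep -R "NV_TENSORRT_MAJOR" /usr/include/aarch64-linux-gnu/NvInfer*.h 2>/dev/null
```

If the major version reported is 4, the built-in ONNX parser won't be available and a standalone parser (or an upgrade via a newer JetPack, if the Manifold supports one) is needed.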

The most promising route so far seems to be onnx-tensorrt (https://github.com/onnx/onnx-tensorrt/tree/v5.0), which is a standalone version of the TensorRT ONNX parsing library. It should work with TensorRT v4, but I'm having trouble building it on the Manifold, so I'm hoping for an easier solution.
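For reference, building that branch generally follows the steps below, per the repository's README. This is a sketch, not a verified recipe for the Manifold: the `TENSORRT_ROOT` value is an assumption (point it at wherever TensorRT is actually installed on your unit), and cross-compiling the bundled ONNX protobuf code on an aarch64 board can be where builds fail.

```shell
# Clone the v5.0 branch together with its ONNX submodule
git clone --recursive -b v5.0 https://github.com/onnx/onnx-tensorrt.git
cd onnx-tensorrt
mkdir build && cd build

# TENSORRT_ROOT must point at the TensorRT install location;
# /usr is a guess for a JetPack-packaged system
cmake .. -DTENSORRT_ROOT=/usr
make -j"$(nproc)"
sudo make install

# If the build succeeds, the bundled converter can emit an engine directly
onnx2trt model.onnx -o model.trt
```

The `onnx2trt` tool serializes the engine on the machine it runs on, which is exactly what the machine-specific engine requirement calls for.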

Because the original model was made in PyTorch, I considered trying to use PyTorch directly instead of TensorRT, but, as far as I can tell, the Manifold doesn't have a recent enough version of Python for that to work, and I would have to write new code to support it.