This sample demonstrates how to use the Torch-TensorRT runtime library libtorchtrt_runtime.so along with the plugin library libtorchtrt_plugins.so.

In this demo, we convert two models, ConvGelu and Norm, to TensorRT using the Torch-TensorRT Python API and perform inference using torchtrt_runtime_example. In these models, the Gelu and Norm layers are expressed as plugins in the network.
The following command will generate the conv_gelu.jit and norm.jit TorchScript modules, which contain TensorRT engines.
python network.py
The main goal is to use the Torch-TensorRT runtime library libtorchtrt_runtime.so, a lightweight library sufficient to deploy your TorchScript programs containing TRT engines.
- Download releases of LibTorch and Torch-TensorRT from https://pytorch.org and the Torch-TensorRT GitHub repo, and unpack both in the deps directory.
cd examples/torchtrt_runtime_example/deps
# Download the latest Torch-TensorRT release tar file (libtorch_tensorrt.tar.gz) from https://github.com/pytorch/TensorRT/releases
tar -xvzf libtorch_tensorrt.tar.gz
unzip libtorch-cxx11-abi-shared-with-deps-[PYTORCH_VERSION].zip
If TensorRT is not installed on your system or available on your LD_LIBRARY_PATH, then do the following as well:
cd deps
mkdir tensorrt && tar -xvzf <TensorRT TARBALL> --directory tensorrt --strip-components=1
cd ..
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$(pwd)/deps/torch_tensorrt/lib:$(pwd)/deps/libtorch/lib:$(pwd)/deps/tensorrt/lib:/usr/local/cuda/lib
This gives maximum compatibility with system configurations for running this example, but in general you are better off adding -Wl,-rpath $(DEP_DIR)/tensorrt/lib to the linking command of your actual applications.
- Build and run torchtrt_runtime_example
torchtrt_runtime_example is a binary which loads the TorchScript modules conv_gelu.jit or norm.jit and runs the TRT engines on a random input using the Torch-TensorRT runtime components. Check out main.cpp and the Makefile for the necessary code and compilation dependencies.
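As a rough illustration of this workflow (a minimal sketch, not the exact contents of main.cpp; the input shape and module file are assumptions), note that the application itself only uses the LibTorch API. Linking against libtorchtrt_runtime.so is what makes the custom ops that execute the embedded TensorRT engines available at load time:

```cpp
// Minimal sketch of a Torch-TensorRT runtime loader (illustrative, not the exact main.cpp).
// No Torch-TensorRT headers are required; linking libtorchtrt_runtime.so registers the
// ops that execute the TensorRT engines embedded in the TorchScript module.
#include <iostream>
#include <vector>

#include <torch/script.h>

int main(int argc, const char* argv[]) {
  if (argc < 2) {
    std::cerr << "usage: torchtrt_runtime_example <path-to-exported-script-module>\n";
    return -1;
  }

  torch::jit::Module module;
  try {
    // Deserialize the TorchScript module (e.g. norm.jit) produced by network.py
    module = torch::jit::load(argv[1]);
  } catch (const c10::Error& e) {
    std::cerr << "error loading the model\n";
    return -1;
  }
  module.to(torch::kCUDA);
  module.eval();

  // The shape here is an assumption for illustration; use the input shape the
  // module was compiled for.
  std::vector<torch::jit::IValue> inputs;
  inputs.push_back(torch::randn({1, 1, 5, 5}, torch::kCUDA));

  auto output = module.forward(inputs).toTensor();
  std::cout << output << '\n';
  return 0;
}
```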
To build and run the app:
cd examples/torchtrt_runtime_example
make
# If paths are different than the ones below, change as necessary
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$(pwd)/deps/torch_tensorrt/lib:$(pwd)/deps/libtorch/lib:$(pwd)/deps/tensorrt/lib:/usr/local/cuda/lib
./torchtrt_runtime_example $PWD/examples/torchtrt_runtime_example/norm.jit