tensorrt-cpp-api

A tutorial project demonstrating how to use the TensorRT C++ API

  • Explain that the model must have a dynamic batch size when exported from ONNX (see the sketch after this list).
  • Explain that the motivation for this project is the poor quality of the existing TensorRT documentation.
  • Note that users will, unfortunately, need to provide their own model.
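The first note matters because TensorRT can only vary the batch size at runtime if the batch axis was exported as dynamic (-1) in the ONNX graph; a fixed axis locks the engine to that single batch size. Below is a minimal sketch of a sanity check for this, assuming TensorRT 8.x and that the parsed network's first input carries the batch dimension in position 0; the function name is only illustrative.

```cpp
#include <NvInfer.h>
#include <stdexcept>
#include <string>

// Verify that the parsed network's first input has a dynamic (-1) batch dimension.
// If the batch size was fixed at ONNX-export time, the engine cannot change it at runtime.
void requireDynamicBatch(nvinfer1::INetworkDefinition& network)
{
    nvinfer1::ITensor* input = network.getInput(0);
    const nvinfer1::Dims dims = input->getDimensions();
    if (dims.nbDims < 1 || dims.d[0] != -1) {
        throw std::runtime_error(std::string("Input '") + input->getName() +
            "' has a fixed batch dimension; re-export the ONNX model with a dynamic batch axis.");
    }
}
```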


TensorRT C++ API Tutorial

How to use the TensorRT C++ API for high-performance GPU inference.
A Venice Computer Vision Presentation


TensorRT C++ Tutorial

This project demonstrates how to use the TensorRT C++ API for high-performance GPU inference.
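At a high level, building an engine from an ONNX model with the C++ API looks roughly like the sketch below. This is only an orientation, not the project's actual implementation: it assumes TensorRT 8.x with the ONNX parser, a model file named model.onnx whose input tensor "input" has a dynamic batch dimension and 3x224x224 images, and it writes the serialized engine to model.engine; substitute your own names and shapes.

```cpp
#include <NvInfer.h>
#include <NvOnnxParser.h>
#include <fstream>
#include <iostream>

// Minimal logger required by the TensorRT builder.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
};

int main() {
    Logger logger;

    // Create the builder and an explicit-batch network definition.
    auto builder = nvinfer1::createInferBuilder(logger);
    auto network = builder->createNetworkV2(
        1U << static_cast<uint32_t>(nvinfer1::NetworkDefinitionCreationFlag::kEXPLICIT_BATCH));

    // Parse the ONNX model (file name is a placeholder) into the network.
    auto parser = nvonnxparser::createParser(*network, logger);
    if (!parser->parseFromFile("model.onnx", static_cast<int>(nvinfer1::ILogger::Severity::kWARNING))) {
        std::cerr << "Failed to parse the ONNX model" << std::endl;
        return 1;
    }

    // Because the batch dimension is dynamic, an optimization profile is required.
    // The input name and spatial dimensions below are placeholders.
    auto config = builder->createBuilderConfig();
    auto profile = builder->createOptimizationProfile();
    profile->setDimensions("input", nvinfer1::OptProfileSelector::kMIN, nvinfer1::Dims4{1, 3, 224, 224});
    profile->setDimensions("input", nvinfer1::OptProfileSelector::kOPT, nvinfer1::Dims4{4, 3, 224, 224});
    profile->setDimensions("input", nvinfer1::OptProfileSelector::kMAX, nvinfer1::Dims4{8, 3, 224, 224});
    config->addOptimizationProfile(profile);

    // Build the engine and serialize it to disk so it can be reused without rebuilding.
    auto serialized = builder->buildSerializedNetwork(*network, *config);
    if (!serialized) {
        std::cerr << "Engine build failed" << std::endl;
        return 1;
    }
    std::ofstream out("model.engine", std::ios::binary);
    out.write(static_cast<const char*>(serialized->data()), serialized->size());

    delete serialized;
    delete config;
    delete parser;
    delete network;
    delete builder;
    return 0;
}
```

Serializing the engine to disk is worthwhile because building it from ONNX can take minutes, while deserializing a saved engine takes a fraction of a second.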

Getting Started

The following instructions assume you are using Ubuntu 20.04.

Prerequisites

  • sudo apt install build-essential
  • sudo apt install python3.8
  • pip3 install cmake

TODO: Install TensorRT (see NVIDIA's TensorRT installation guide).

Building the library

  • mkdir build && cd build
  • cmake ..
  • make -j$(nproc)
  • make install
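Once the library is built and an engine has been serialized, running inference follows the pattern sketched below. Again, this is an illustration rather than the project's code: it assumes TensorRT 8.x with the binding-index API, an engine file named model.engine, a single float32 input tensor "input" of shape [batch, 3, 224, 224], and a single float32 output tensor "output" with 1000 values per image; adjust all of these to match your own model.

```cpp
#include <NvInfer.h>
#include <NvInferRuntime.h>
#include <cuda_runtime_api.h>
#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>

// Minimal logger required by the TensorRT runtime.
class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
};

int main() {
    // Load the serialized engine produced at build time (file name is a placeholder).
    std::ifstream file("model.engine", std::ios::binary);
    std::vector<char> engineData((std::istreambuf_iterator<char>(file)), std::istreambuf_iterator<char>());

    Logger logger;
    auto runtime = nvinfer1::createInferRuntime(logger);
    auto engine = runtime->deserializeCudaEngine(engineData.data(), engineData.size());
    auto context = engine->createExecutionContext();

    // Pick a runtime batch size inside the [kMIN, kMAX] range of the engine's optimization profile.
    const int batchSize = 4;
    const int inputIndex = engine->getBindingIndex("input");   // placeholder tensor name
    const int outputIndex = engine->getBindingIndex("output"); // placeholder tensor name
    context->setBindingDimensions(inputIndex, nvinfer1::Dims4{batchSize, 3, 224, 224});

    // Allocate device buffers; sizes assume float32 tensors with the placeholder shapes
    // ([batch, 3, 224, 224] input, 1000 scores per image output).
    const size_t inputBytes = static_cast<size_t>(batchSize) * 3 * 224 * 224 * sizeof(float);
    const size_t outputBytes = static_cast<size_t>(batchSize) * 1000 * sizeof(float);
    void* buffers[2];
    cudaMalloc(&buffers[inputIndex], inputBytes);
    cudaMalloc(&buffers[outputIndex], outputBytes);

    // Copy the input to the GPU, run inference asynchronously, and copy the output back.
    std::vector<float> hostInput(static_cast<size_t>(batchSize) * 3 * 224 * 224, 0.f);
    std::vector<float> hostOutput(static_cast<size_t>(batchSize) * 1000);
    cudaStream_t stream;
    cudaStreamCreate(&stream);
    cudaMemcpyAsync(buffers[inputIndex], hostInput.data(), inputBytes, cudaMemcpyHostToDevice, stream);
    context->enqueueV2(buffers, stream, nullptr);
    cudaMemcpyAsync(hostOutput.data(), buffers[outputIndex], outputBytes, cudaMemcpyDeviceToHost, stream);
    cudaStreamSynchronize(stream);

    // Clean up.
    cudaStreamDestroy(stream);
    cudaFree(buffers[inputIndex]);
    cudaFree(buffers[outputIndex]);
    delete context;
    delete engine;
    delete runtime;
    return 0;
}
```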
