# Setup Triton Inference Server with TensorRT-LLM Backend on Kubernetes

## Quick Install

You can install Triton Inference Server by running the `install.sh` script:

```bash
./install.sh
```
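
The contents of `install.sh` are not shown here. As a rough orientation, below is a minimal sketch of what such a script might do on Kubernetes; the `./manifests` directory, the `triton` namespace, and the `triton-server` deployment name are all assumptions for illustration, not the script's actual contents:

```bash
#!/usr/bin/env bash
# Hypothetical sketch of an install script for Triton Inference Server
# with the TensorRT-LLM backend on Kubernetes. The real install.sh may
# differ; ./manifests, the "triton" namespace, and the "triton-server"
# deployment name are assumptions.
set -euo pipefail

# Verify that kubectl is available and can reach a cluster.
command -v kubectl >/dev/null 2>&1 || { echo "kubectl is required" >&2; exit 1; }
kubectl cluster-info >/dev/null

# Create the target namespace if it does not already exist.
kubectl get namespace triton >/dev/null 2>&1 || kubectl create namespace triton

# Apply the Deployment/Service manifests for the Triton server
# (the image would be a Triton container that includes the TensorRT-LLM backend).
kubectl apply --namespace triton -f ./manifests/

# Wait for the Triton pods to become ready before returning.
kubectl rollout status deployment/triton-server --namespace triton --timeout=300s
```

Once a script along these lines completes, the server's HTTP and gRPC endpoints are reachable through whatever Kubernetes Service the applied manifests define.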