## Export onnx or JIT models for deployment

Run `pip install onnx -U`.

### Export GPT

Run `python examples/onnx/exporter.py --gpt`

### Export other models

Run `python examples/onnx/exporter.py --decoder --vocos`

### Reference

- Run LLMs on Sophon TPU
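
After exporting, you may want to sanity-check the resulting ONNX file before deploying it. The sketch below is a minimal, hypothetical example using `onnx` and `onnxruntime`; the output path `gpt.onnx` is an assumption, so substitute whatever path `exporter.py` actually writes in your run.

```python
# Minimal sanity check for an exported ONNX model (sketch, not part of exporter.py).
import onnx
import onnxruntime as ort

model_path = "gpt.onnx"  # assumed output path; adjust to the exporter's actual output

# Structural check: parses the graph and validates operator schemas.
onnx.checker.check_model(onnx.load(model_path))

# Create a CPU inference session and list the expected inputs/outputs.
session = ort.InferenceSession(model_path, providers=["CPUExecutionProvider"])
for inp in session.get_inputs():
    print("input:", inp.name, inp.shape, inp.type)
for out in session.get_outputs():
    print("output:", out.name, out.shape, out.type)
```

If the model loads and the reported input/output shapes match what you expect, the export is likely usable for downstream runtimes.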