nvcr.io/nvidia/tritonserver:24.02-py3: this image doesn't seem to have the ONNX Runtime backend
I have been following this tutorial: https://github.com/triton-inference-server/tutorials/tree/main/Conceptual_Guide/Part_1-model_deployment#setting-up-the-model-repository
When I ran the command below:
docker run -it --shm-size=256m --rm -p8000:8000 -p8001:8001 -p8002:8002 -v $(pwd)/model_repository:/models nvcr.io/nvidia/tritonserver:24.02-py3
and then went inside the container, I ran into an error.
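For context, inside the container I started the server the way the tutorial describes:

tritonserver --model-repository=/models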
On checking further, I found that the onnxruntime backend is missing from the image.
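This is how I checked, assuming the usual layout of these images where each backend gets its own directory under /opt/tritonserver/backends:

ls /opt/tritonserver/backends
# I expected to see an 'onnxruntime' directory listed here, but it wasn't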
If this is a legitimate issue, can you please guide me on how to install the ONNX Runtime backend in this container? Or is there a better approach to fix this?
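For reference, once the backend loads I would expect checks like these to pass (the model name comes from Part 1 of the tutorial; adjust as needed):

curl -v localhost:8000/v2/health/ready
curl -v localhost:8000/v2/models/text_recognition/ready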