Triton Inference Server does not have ONNX backend


The nvcr.io/nvidia/tritonserver:24.02-py3 image doesn't appear to have the ONNX backend.

I have been following this tutorial: https://github.com/triton-inference-server/tutorials/tree/main/Conceptual_Guide/Part_1-model_deployment#setting-up-the-model-repository

When I run the command below:

    docker run -it --shm-size=256m --rm -p8000:8000 -p8001:8001 -p8002:8002 -v $(pwd)/model_repository:/models nvcr.io/nvidia/tritonserver:24.02-py3

and went inside the container, I got an error (the screenshot is not available here). On further checking, I found that the ONNX Runtime backend is missing from the container's backends directory, as shown by the check below.
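
For reference, this is roughly how I checked (a minimal sketch; /opt/tritonserver/backends is the standard backends location in Triton images, so adjust the path if your image differs):

    # Start a shell in the same image
    docker run -it --rm nvcr.io/nvidia/tritonserver:24.02-py3 bash

    # Inside the container: list the installed backends.
    # I expected to see an "onnxruntime" directory here, but it is absent.
    ls /opt/tritonserver/backends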

If this is a legitimate issue, can you please guide me on how to install the ONNX backend in this container? Or is there a better approach to fix this?
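
For reference, my model's config.pbtxt follows the tutorial (a minimal sketch; the model name, backend value, and batch size are taken from that guide and may not match other setups):

    # config.pbtxt from the tutorial's text_detection model.
    # backend: "onnxruntime" is what requires the ONNX Runtime backend
    # that appears to be missing from the image.
    name: "text_detection"
    backend: "onnxruntime"
    max_batch_size: 256
    # input/output tensor sections follow the tutorial and are omitted here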
