org.pytorch.serve.wlm.WorkerInitializationException: Backend stream closed


I see this error whenever I start TorchServe with any model store. The logs look like:

org.pytorch.serve.wlm.WorkerInitializationException: Backend stream closed.
    at org.pytorch.serve.wlm.WorkerLifeCycle.startWorker(WorkerLifeCycle.java:173) ~[model-server.jar:?]
    at org.pytorch.serve.wlm.WorkerThread.connect(WorkerThread.java:339) ~[model-server.jar:?]
    at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:183) ~[model-server.jar:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[?:?]
    at java.lang.Thread.run(Thread.java:1623) [?:?]

I have not been able to find any related information about this error.

I tried running the example from https://github.com/pytorch/serve/blob/master/docs/inference_api.md.
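For context, this is roughly what I run (a sketch following the TorchServe quick-start docs; the `model_store` directory and `densenet161.mar` archive are from their example, and your paths and model archive may differ):

```shell
# Start TorchServe pointing at a directory of .mar model archives
# (assumes model_store/ exists and contains densenet161.mar)
torchserve --start --ncs --model-store model_store --models densenet161.mar

# Check that the inference API is up, per the linked docs
curl http://localhost:8080/ping

# Stop the server again
torchserve --stop
```

The worker dies with the "Backend stream closed" exception shortly after startup, before the ping check succeeds.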
