Debugging a Custom Handler


I am very new to TorchServe, so please excuse my beginner ignorance.

I am trying to test a custom handler whose initialize method receives a Context object.

As I understand it, this Context needs to be provided to the initialize function.
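To test the handler offline, I have been using a stand-in for the Context object. The stub below is hypothetical (it is not imported from torchserve); the field names `system_properties` and `manifest` are my assumption of what the real `ts.context.Context` exposes, and the paths are made up:

```python
# Hypothetical stub of the Context a handler's initialize() receives.
# In real serving, TorchServe builds this object from the .mar contents;
# this stub only exists so the handler can be exercised offline.
class StubContext:
    def __init__(self, model_dir):
        # system_properties usually carries the extracted model directory
        self.system_properties = {"model_dir": model_dir, "gpu_id": None}
        # manifest mirrors the archive metadata (file name is made up here)
        self.manifest = {"model": {"serializedFile": "model.pt"}}

class MyHandler:
    def initialize(self, context):
        # pull the model directory out of the context, as examples do
        self.model_dir = context.system_properties.get("model_dir")
        self.initialized = True

handler = MyHandler()
handler.initialize(StubContext("/tmp/model_store/my_tc"))
```

With a stub like this I can at least call initialize directly in a unit test, without starting the server.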

Now, I am able to create a .mar file using torch-model-archiver, and also deploy it to a URL using torchserve --start.

However, when I submit requests to it like:

curl http://localhost:8080/predictions/my_tc/1.0 -T input_file.txt

It just hangs. No output.

  • The ping API returns Healthy most of the time; I assume that after some timeout it starts reporting Unhealthy.
  • I do see these errors in the logs, though I am not sure where they are coming from:
 W-9009-my_tc_1.0-stdout MODEL_LOG - Backend worker process died.
my_tc_1.0-stdout MODEL_LOG - Traceback (most recent call last):
my_tc_1.0-stdout MODEL_LOG -   File ".../site-packages/ts/model_service_worker.py", line 250, in <module>
my_tc_1.0-stdout MODEL_LOG -     worker = TorchModelServiceWorker(
my_tc_1.0-stdout MODEL_LOG -   File ".../site-packages/ts/model_service_worker.py", line 69, in __init__
my_tc_1.0-stdout MODEL_LOG -     self.port = str(port_num) + LOCAL_RANK
my_tc_1.0-stdout MODEL_LOG - Traceback (most recent call last):
my_tc_1.0-stdout MODEL_LOG - TypeError: can only concatenate str (not "int") to str
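If I read the traceback right, the failing line concatenates a str and an int. A minimal reproduction of that error class, assuming LOCAL_RANK resolved to an int such as 0 at that point (the values below are made up):

```python
# Minimal reproduction of the TypeError from the worker traceback.
# Assumption: LOCAL_RANK was an int (e.g. 0), as the message
# 'can only concatenate str (not "int") to str' indicates.
port_num = 9009
local_rank = 0  # int stand-in for LOCAL_RANK

try:
    port = str(port_num) + local_rank  # mirrors: str(port_num) + LOCAL_RANK
except TypeError as exc:
    error_message = str(exc)  # the same message the worker logs

# Converting both operands to str avoids the crash:
port = str(port_num) + str(local_rank)
```

So the backend worker seems to die during construction, before my handler code even runs, which would explain why the curl request hangs with no output.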

I suppose my questions are:

  • When I run curl http://localhost:8080/predictions/my_tc/1.0 -T input_file.txt, what method is called? Which method in the custom handler or base handler receives the input file?
  • What should the input file look like? I think JSON was mentioned somewhere?
  • How can I provide the Context via the command line?
  • Are there any examples that create the Context and then pass it in for serving?
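For reference, my current understanding of the dispatch, pieced together from examples: the server calls the handler's handle(data, context), and the default BaseHandler runs preprocess, then inference, then postprocess. The skeleton below is a hypothetical stand-in with no torchserve import, just to show the call order and request shape I am assuming (a list of request dicts with the payload under "data" or "body"):

```python
# Hypothetical skeleton mirroring the BaseHandler call order I am assuming:
# handle() -> preprocess() -> inference() -> postprocess().
class SketchHandler:
    def preprocess(self, data):
        # data: list of request dicts; for `curl -T file` I believe the raw
        # bytes arrive under "data" or "body" (assumption, not verified)
        return [row.get("data") or row.get("body") for row in data]

    def inference(self, inputs):
        # stand-in for the real model call
        return [f"echo:{x!r}" for x in inputs]

    def postprocess(self, outputs):
        # must return a list with one entry per request
        return outputs

    def handle(self, data, context):
        return self.postprocess(self.inference(self.preprocess(data)))

result = SketchHandler().handle([{"body": b"hi"}], None)  # ["echo:b'hi'"]
```

Please correct me if any of that mental model is wrong.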

Thanks a lot!
