Leveraging AWS SageMaker Serverless Inference for Customized Model Serving - issue in endpoint creation


I am replicating this article:

https://medium.com/picus-security-engineering/customized-model-serving-via-aws-sagemaker-serverless-inference-a72879948321#b008

I have "model.tar.gzip" file in s3 bucket with the same hierarchy as explained in the image and article.

(Image: tar file structure)
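For reference, here is a minimal sketch (file names are placeholders following the article's layout) of how such an archive can be built and verified locally before uploading. If the re-open step below fails with the same ReadError, the archive itself is the problem (zero bytes, or not actually gzipped):

    import tarfile
    import boto3

    # Hypothetical local layout mirroring the article's structure:
    #   model.pkl             (the serialized model)
    #   code/inference.py     (custom inference handler)
    #   code/requirements.txt
    archive_name = "model.tar.gz"  # SageMaker expects a gzipped tar

    # Build the archive with gzip compression.
    with tarfile.open(archive_name, mode="w:gz") as tar:
        tar.add("model.pkl")
        tar.add("code/inference.py")
        tar.add("code/requirements.txt")

    # Sanity check: re-open the archive and list its members.
    # A 'tarfile.ReadError: empty file' here means the archive was
    # written incorrectly.
    with tarfile.open(archive_name, mode="r:gz") as tar:
        print(tar.getnames())

    # Upload to S3 (bucket name and key are placeholders).
    s3 = boto3.client("s3")
    s3.upload_file(archive_name, "my-bucket", "models/model.tar.gz")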

When I try to create an endpoint with the code below, it gets stuck at the "Creating" status and CloudWatch shows a 'tarfile.ReadError: empty file' error:

   "source": [
    "endpoint_name = 'DEMO-modelregistry-endpoint-' + str(round(time.time()))\n",
    "print(\"EndpointName={}\".format(endpoint_name))\n",
    "\n",
    "create_endpoint_response = sm_client.create_endpoint(\n",
    "    EndpointName=endpoint_name,\n",
    "    EndpointConfigName=endpoint_config_name)\n",
    "print(create_endpoint_response['EndpointArn'])"
   ]
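In case it helps, this is the kind of polling loop I use to watch the endpoint while it sits in "Creating"; once it moves to "Failed", describe_endpoint usually includes a FailureReason explaining why:

    import time

    # Poll the endpoint until it either comes up or fails.
    while True:
        resp = sm_client.describe_endpoint(EndpointName=endpoint_name)
        status = resp["EndpointStatus"]
        print("Status:", status)
        if status in ("InService", "Failed"):
            break
        time.sleep(30)

    if status == "Failed":
        print("FailureReason:", resp.get("FailureReason"))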
