Vertex AI returns a different result from the local tflite model


I uploaded my tflite model to Vertex AI, created an endpoint, and requested inference with some input values, but the endpoint returns a different result from my local tflite model's inference.

The input is a float32 array (sampled audio data, actually), and I used this code for the request. Even with the same input array, the local tflite model and the model uploaded to Vertex AI return quite different results.

Is it possible that the values get distorted in transit to the Vertex AI instance?
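(The original request code did not survive in this post. Below is a hedged sketch of what a typical predict request for a float32 audio array looks like; the `audio` variable and the endpoint path are placeholders, not taken from the original.)

```python
import json
import numpy as np

# Placeholder standing in for the sampled audio data from the question.
audio = np.zeros(16000, dtype=np.float32)

# .tolist() converts each float32 sample to a Python float (float64)
# before JSON encoding -- the float32 dtype is not carried in the payload.
body = json.dumps({"instances": [audio.tolist()]})

# With the google-cloud-aiplatform SDK the call would look roughly like:
# from google.cloud import aiplatform
# endpoint = aiplatform.Endpoint(
#     "projects/PROJECT/locations/REGION/endpoints/ENDPOINT_ID")
# response = endpoint.predict(instances=[audio.tolist()])
```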


There is 1 answer below

Ricky Nguyen

Keep in mind that the encoding for request payloads is just "numbers": JSON has no separate int or float types. That is how JSON is defined, and the gRPC equivalent of the predict API behaves the same way (numbers are carried as generic doubles).
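You can see the effect locally: serializing a float32 value widens it to a float64 in the JSON payload, and the server parses it back as a float64, not as your original dtype.

```python
import json
import numpy as np

# A float32 value as produced by an audio preprocessing pipeline.
x32 = np.float32(0.1)

# JSON has a single "number" type, so the value is widened to a
# Python float (float64) before serialization.
payload = json.dumps({"instances": [[float(x32)]]})

# The server parses it back as a float64.
x64 = json.loads(payload)["instances"][0][0]

# It is NOT the float64 closest to 0.1 ...
print(x64 == 0.1)              # False
# ... but casting back to float32 recovers the original value exactly.
print(np.float32(x64) == x32)  # True
```

So a correct server-side cast back to float32 is lossless; large discrepancies usually mean the two pipelines are not feeding bit-identical inputs (or dtypes) into the model.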

You could also try the rawPredict API instead of predict: rawPredict forwards your request body to the serving container as-is, so you control the payload format end to end.
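A minimal sketch of a rawPredict call, assuming the google-cloud-aiplatform Python SDK (the endpoint path and `audio` array are placeholders):

```python
import json
import numpy as np

# Placeholder audio input, matching the float32 array in the question.
audio = np.zeros(16000, dtype=np.float32)

# rawPredict takes the request body as raw bytes; the service passes it
# through to the model server without re-wrapping the instances.
body = json.dumps({"instances": [audio.tolist()]}).encode("utf-8")
headers = {"Content-Type": "application/json"}

# from google.cloud import aiplatform
# endpoint = aiplatform.Endpoint(
#     "projects/PROJECT/locations/REGION/endpoints/ENDPOINT_ID")
# response = endpoint.raw_predict(body=body, headers=headers)
```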