model.predict(...).dataSync() takes time to return results, which causes lag in the camera. How can I get results/predictions instantly?


I'm using TensorFlow.js in React Native and I'm getting correct predictions from my model, but it takes a long time to return results. For example, with a custom model I created in Google's Teachable Machine, the .dataSync() call takes roughly a whole second to return, which causes a visible lag in the camera feed. I want to get the results instantly. This is my code below:

<TensorCamera
  style={styles.camera}
  flashMode={Camera.Constants.FlashMode.off}
  type={Camera.Constants.Type.back}
  resizeWidth={224}
  resizeHeight={224}
  resizeDepth={3}
  onReady={handleCameraStream}
  autorender={true}
/>
//
const handleCameraStream = (imageAsTensors) => {
    // Loop infinitely: pull a frame from the camera tensor stream and run a
    // prediction every `makePredictionsEveryNFrames` frames.
    const loop = async () => {
      if (model !== null) {
        if (frameCount % makePredictionsEveryNFrames === 0) {
          const imageTensor = imageAsTensors.next().value;
          await getPrediction(imageTensor);
        }
      }

      frameCount = (frameCount + 1) % makePredictionsEveryNFrames;
      requestAnimationFrameId = requestAnimationFrame(loop);
    };
    loop();
  };
//
const getPrediction = async (tensor) => {
    if (!tensor) {
      console.log("Tensor not found!");
      return;
    }

    // Resize to the model's 224x224 input, normalize to [-1, 1], add a batch dimension
    const imageData2 = tensor.resizeBilinear([224, 224]);
    const normalized = imageData2.cast("float32").div(127.5).sub(1);
    const final = tf.expandDims(normalized, 0);

    console.time("predict");
    const prediction = model.predict(final).dataSync();
    console.timeEnd("predict");
    console.log("Predictions:", prediction);
  };

I've heard about using .data() instead of .dataSync(), but I don't know how to implement .data() in my current code. Please help.

Answer by Vladimir Mandic:

model.predict() is what takes the time, and that is really up to your model. It might run faster on a different backend (I have no idea which backend you're using; the default in browsers would be webgl), but realistically it is what it is without rearchitecting the model itself.
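
To check or switch the backend in a React Native app, something like the sketch below should work. This is an illustration rather than code from the question; it assumes the GPU-accelerated backend registered by @tensorflow/tfjs-react-native is named "rn-webgl", with "cpu" as the slow fallback.

import * as tf from "@tensorflow/tfjs";
import "@tensorflow/tfjs-react-native"; // registers the React Native backends

const checkBackend = async () => {
  await tf.ready(); // wait until a backend has been initialized
  console.log("Active backend:", tf.getBackend());

  // If you ended up on the slow cpu fallback, try forcing the GPU backend:
  // await tf.setBackend("rn-webgl");
};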

dataSync() simply downloads the results from wherever the tensors live (e.g. GPU VRAM) into your variable in JS.

Yes, you could use data() instead, which is an async call, but the difference is a couple of milliseconds at best; it's not going to speed up model execution at all.
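
For illustration, the switch inside the asker's getPrediction would look roughly like this (output is just a local name introduced here for the prediction tensor):

// instead of:
//   const prediction = model.predict(final).dataSync(); // blocks the JS thread

// use the asynchronous variant:
const output = model.predict(final);
const prediction = await output.data(); // resolves once the backend has finished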

By the way, you're not releasing tensors anywhere, so your application has some serious memory leaks.
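
One way to fix that, sketched against the asker's getPrediction (resized, normalized, batched and output are placeholder names introduced here): wrap the synchronous preprocessing and the predict() call in tf.tidy(), which disposes every intermediate tensor created inside its callback, and explicitly dispose() the model output and the camera frame afterwards. The async data() call has to stay outside tf.tidy(), since tidy cannot wrap asynchronous code.

const getPrediction = async (tensor) => {
  if (!tensor) return;

  // tf.tidy() frees all intermediate tensors (resize, cast, div, sub, expandDims)
  // as soon as the callback returns; only the returned tensor survives.
  const output = tf.tidy(() => {
    const resized = tensor.resizeBilinear([224, 224]);
    const normalized = resized.cast("float32").div(127.5).sub(1);
    const batched = tf.expandDims(normalized, 0);
    return model.predict(batched);
  });

  const prediction = await output.data();
  console.log("Predictions:", prediction);

  // Release the model output and the camera frame tensor itself.
  output.dispose();
  tensor.dispose();
};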