React Native MobileNet always makes the same prediction


I am working on a React Native Expo app that uses the TensorFlow MobileNet model.

I am able to load the model and run a prediction on an image, but regardless of which image I supply, the predictions are always the same.

Has anyone been able to resolve a similar issue?

Output (identical for every image):

[{"className": "tench, Tinca tinca", "probability": 0}, {"className": "goldfish, Carassius auratus", "probability": 0}, {"className": "great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias", "probability": 0}]
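Note that the output above is the first three ImageNet class labels in order, each with a probability of exactly 0, which is consistent with the model receiving an input tensor containing no real pixel data. A minimal sketch of a check for this degenerate case (the helper name is hypothetical, not from the post):

```javascript
// Hypothetical helper: flag the degenerate output shown above, where every
// returned class has probability exactly 0.
function isDegeneratePrediction(predictions) {
  return predictions.length > 0 &&
    predictions.every((p) => p.probability === 0);
}

const output = [
  { className: 'tench, Tinca tinca', probability: 0 },
  { className: 'goldfish, Carassius auratus', probability: 0 },
];
console.log(isDegeneratePrediction(output)); // true
```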

TensorFlow imports:

import * as tf from '@tensorflow/tfjs';
import { fetch, decodeJpeg } from '@tensorflow/tfjs-react-native';
import * as mobilenet from '@tensorflow-models/mobilenet';

Code to classify the image:

useEffect(() => {
  const fetchData = async () => {
    // Wait for the tfjs backend, then load the MobileNet model
    await tf.ready();
    const model = await mobilenet.load();
    // console.log(model);

    // Resolve the bundled asset and fetch its raw JPEG bytes
    // (Image here comes from 'react-native')
    const image = require('./cat_224_224.jpg');
    const imageAssetPath = Image.resolveAssetSource(image);
    const response = await fetch(imageAssetPath.uri, {}, { isBinary: true });
    const imageDataArrayBuffer = await response.arrayBuffer();
    const imageData = new Uint8Array(imageDataArrayBuffer);

    // Decode the JPEG into an int32 tensor, cast to float32, and classify
    const imageTensor = decodeJpeg(imageData);
    const floatImage = tf.cast(imageTensor, 'float32');
    const predictions = await model.classify(floatImage);
    console.log(predictions);
  };
  fetchData().catch(console.error);
}, []);
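One way to narrow this down is to inspect the decoded tensor before classifying. If `fetch` returned something other than the JPEG bytes (for example an error page for the asset URI), `decodeJpeg` can produce a tensor with an unexpected shape or all-zero values, which would make every prediction identical. A minimal sketch, with an assumed helper name and assumed validity criteria (3-channel tensor with pixel values in 0–255):

```javascript
// Hypothetical sanity check on the decoded tensor's shape and value range.
function looksLikeValidImageTensor(shape, min, max) {
  const is3Channel = shape.length === 3 && shape[2] === 3;
  const hasPixelRange = min >= 0 && max > 0 && max <= 255;
  return is3Channel && hasPixelRange;
}

// Possible usage inside fetchData(), after decodeJpeg:
// const [min, max] = await Promise.all([
//   imageTensor.min().data(),
//   imageTensor.max().data(),
// ]);
// console.log(imageTensor.shape, min[0], max[0]);
// if (!looksLikeValidImageTensor(imageTensor.shape, min[0], max[0])) {
//   console.warn('Decoded image tensor looks wrong - check the asset URI');
// }
```

A valid 224x224 RGB photo would typically report a shape of `[224, 224, 3]` and a max well above 0; a max of 0 would point to the fetch or decode step rather than the model.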
