TensorFlow Hub SavedModel online accessibility?


I've followed the TensorFlow Hub documentation to retrain one of their models to identify 20 cat breeds from an image. I'm doing this for school, and part of the assignment requires making the app available. Due to file size restrictions I store the model in the git repo via Git LFS, but when the app tries to load it from that path, it says the file doesn't exist, even though it does: [Screenshot: model in root of repo]

Code attempting to load this model directory: `model = tfh.load('tmp/cat_breed_model_efficientnetv2-xl-21k')` (I originally wrote the path with a backslash, which only works on Windows; Streamlit Cloud runs Linux, so I've switched to forward slashes.)

This is a Python application hosted on Streamlit Cloud, and I'm loading the model with `tensorflow_hub.load()`.

When I run this code locally in PyCharm, it works just fine, so I assume the problem has to do with the way Git LFS stores the large model file within that folder.

Now I'm trying to host the model in a Google Cloud Storage bucket, but I've read that directories can't be loaded via URL. That confuses me, because GCS seems to be exactly where the tensorflow_hub models are stored to begin with. I've set the permissions on the whole bucket to public, but even when I open the bucket/object URL in a browser it doesn't work:

[Screenshot: Google Cloud bucket search result]

[Screenshot: Cloud bucket object permissions]

Basically, I have the model, and the rest of the code runs fine once the model actually loads. Can anyone point me toward a way to host my model directory so that it can be loaded at runtime?

**Additional note:** I'm using Streamlit Cloud with my repo linked to run the app; I just need the app to be able to access the model directory.

I've not done a lot of this before, but I think I'm very close, please be kind!
