How do I load a YOLO model when my .pt file is stored in firebase storage?


I have run some machine learning tasks locally to detect objects in images. To do so, I stored .pt files locally. When deploying, I realized these files were too large, at least according to Vercel's limits. I therefore decided to store the .pt files in the cloud with Firebase Storage instead. My approach now looks like this:

import tempfile

from firebase_admin import storage
from ultralytics import YOLO

bucket = storage.bucket()

# Create a temporary file
with tempfile.NamedTemporaryFile() as temp_file:
    # Download the model file from Firebase Storage
    blob = bucket.blob("yolov8n.pt")
    blob.download_to_filename(temp_file.name)

    # Load the model
    model = YOLO(temp_file.name)

as opposed to the previous

model = YOLO("/localpath.pt")

When doing this, however, I get the following error:

 WARNING ⚠️ Unable to automatically guess model task, assuming 'task=detect'. Explicitly define task for your model, i.e. 'task=detect', 'segment', 'classify', or 'pose'.
[1] [2023-12-20 16:09:51,035] ERROR in app: Exception on /api/analyse [POST]
[1] Traceback (most recent call last):
[1]   File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/flask/app.py", line 2525, in wsgi_app
[1]     response = self.full_dispatch_request()
[1]                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[1]   File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/flask/app.py", line 1822, in full_dispatch_request
[1]     rv = self.handle_user_exception(e)
[1]          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[1]   File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/flask/app.py", line 1820, in full_dispatch_request
[1]     rv = self.dispatch_request()
[1]          ^^^^^^^^^^^^^^^^^^^^^^^
[1]   File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/flask/app.py", line 1796, in dispatch_request
[1]     return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
[1]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[1]   File "/Users/name/Desktop/name-project/api-name/api/index.py", line 63, in analyse
[1]     results = model(image)
[1]               ^^^^^^^^^^^^
[1]   File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/ultralytics/engine/model.py", line 98, in __call__
[1]     return self.predict(source, stream, **kwargs)
[1]            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[1]   File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/ultralytics/engine/model.py", line 232, in predict
[1]     self.predictor.setup_model(model=self.model, verbose=is_cli)
[1]   File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/ultralytics/engine/predictor.py", line 317, in setup_model
[1]     self.model = AutoBackend(model or self.args.model,
[1]                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[1]   File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
[1]     return func(*args, **kwargs)
[1]            ^^^^^^^^^^^^^^^^^^^^^
[1]   File "/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/ultralytics/nn/autobackend.py", line 97, in __init__
[1]     fp16 &= pt or jit or onnx or xml or engine or nn_module or triton  # FP16
[1]     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[1] TypeError: unsupported operand type(s) for &=: 'bool' and 'str'

I first thought this was due to a compatibility issue of some kind, as ChatGPT suggested that might be the case. I therefore did a deep dive and found no compatibility issues. I now think the problem is either that I have never used the tempfile module before and am doing something wrong, or that I have a fundamental misunderstanding of how .pt files work.
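One detail that may matter (an assumption on my part, not confirmed): Ultralytics appears to infer the model format and task from the file extension, and `tempfile.NamedTemporaryFile()` creates a file with no extension, which would explain the "Unable to automatically guess model task" warning. Passing `suffix=".pt"` would keep the extension on the temporary file:

```python
import tempfile

# Without a suffix, the temporary file has no extension, so a loader
# that inspects the filename cannot tell it is a PyTorch checkpoint.
with tempfile.NamedTemporaryFile() as tmp:
    print(tmp.name)  # e.g. /tmp/tmpa1b2c3d4 -- no ".pt"

# With suffix=".pt", the path keeps the extension:
with tempfile.NamedTemporaryFile(suffix=".pt") as tmp:
    print(tmp.name)  # e.g. /tmp/tmpa1b2c3d4.pt
    # blob.download_to_filename(tmp.name)  # same Firebase download as above
    # model = YOLO(tmp.name)               # would now see a ".pt" file
```

The Firebase lines are commented out here since they need a configured bucket; the point is only the filename that `NamedTemporaryFile` produces.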
