Handling Image Size Adjustment Errors in FastSAM for Object Segmentation


I am using FastSAM (model file: FastSAM-x.pt) for object segmentation in a project where the input images vary in size. FastSAM automatically adjusts the inference image size to a multiple of its maximum stride (32), but this adjustment occasionally fails and raises a runtime error.

Here's the typical warning and adjustment process:

WARNING ⚠️ imgsz=[453] must be multiple of max stride 32, updating to [480]

The model updates the image size on its own and usually continues without issue. However, when the original image size is close to, but not exactly, a multiple of 32, FastSAM raises a RuntimeError, even after having processed hundreds of images successfully. For example:

Image Shape before SAM: (374, 144, 3)
WARNING ⚠️ imgsz=[374] must be multiple of max stride 32, updating to [384]
RuntimeError: expand(torch.FloatTensor{[6]}, size=[]): the number of sizes provided (0) must be greater or equal to the number of dimensions in the tensor (1)

Interestingly, manually resizing the image to the suggested dimensions (e.g., 384 here) before passing it to the model does not help; the error persists. Has anyone encountered a similar problem, or found a solution? Any insights or suggestions would be greatly appreciated.
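For reference, here is a minimal sketch of the kind of pre-processing I tried, zero-padding both dimensions up to the next multiple of the stride rather than resizing (the FastSAM call itself is omitted; the input shape matches the failing case above):

```python
import numpy as np

STRIDE = 32  # FastSAM's max stride, per the warning message above

def round_up(x, stride=STRIDE):
    # Smallest multiple of `stride` that is >= x
    return ((x + stride - 1) // stride) * stride

def pad_to_stride(img):
    # Zero-pad height and width up to the next stride multiple.
    # Unlike cv2.resize, padding does not distort the aspect ratio.
    h, w = img.shape[:2]
    pad_h, pad_w = round_up(h) - h, round_up(w) - w
    return np.pad(img, ((0, pad_h), (0, pad_w), (0, 0)))

img = np.zeros((374, 144, 3), dtype=np.uint8)  # shape from the failing case
padded = pad_to_stride(img)
print(padded.shape)  # (384, 160, 3)
# passing `padded` to FastSAM still reproduces the RuntimeError
```

Whether I resize or pad, both dimensions end up as exact multiples of 32 (374 → 384, 144 → 160, matching FastSAM's own 453 → 480 adjustment), so I don't understand why the expand() call inside the model still fails.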
