I am developing a program to read a time series of NIfTI format images into a 4D matrix in MATLAB. There are about 60 images in the stack, and the program runs without problems until the 28th image (all the images are approximately the same size, with the same details). But after that, the reading gets slower and slower.
In fact, the delay is accumulating. I checked the program again and there are no open files. Everything looks fine.
Can someone give me some advice?
Size of current array (double)
Unless you are running on a machine with more than ~20GB of RAM, your matrix simply becomes too large to handle.
To check the size of the first three dimensions of your matrix (the 512x512x160 volume below is an assumed example, since the question does not state the actual dimensions):
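```matlab
% Hypothetical example: one 512x512x160 volume stored as double
% (the question does not give the actual dimensions)
I = zeros(512, 512, 160);
whos I
```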
Output:
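```
  Name        Size                    Bytes  Class     Attributes

  I         512x512x160           335544320  double
```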
Now multiply by 60 to obtain the size of your 4D matrix, and divide by 1024^3 to convert to GB:
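```matlab
% total size of the 4D matrix, for the assumed volume size
bytesPerVolume = 335544320;             % from whos above
totalGB = bytesPerVolume * 60 / 1024^3  % = 18.75 GB
```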
So yes, your matrix is most likely too large to handle efficiently.
A matrix exceeding your RAM forces MATLAB to use the swap file (HDD/SSD), which is orders of magnitude slower than random access memory (even if you have an SSD).
Switch to different data types
If you do not require double precision, i.e. roughly 16 significant digits of accuracy, you can always switch to single-precision floating point numbers and halve the size. You can reduce the size even further if the values are, for example, unsigned integers in the range 0-255, which fit in uint8. See the code below (again using the assumed dimensions):
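```matlab
% Memory footprint of 60 volumes of the assumed 512x512x160 size,
% for three different data types
nElements = 512 * 512 * 160;
gbDouble = nElements * 8 * 60 / 1024^3  % double: 8 bytes per element
gbSingle = nElements * 4 * 60 / 1024^3  % single: 4 bytes per element
gbUint8  = nElements * 1 * 60 / 1024^3  % uint8:  1 byte per element
```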
Output:
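```
gbDouble =
   18.7500

gbSingle =
    9.3750

gbUint8 =
    2.3438
```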
Which may just fit inside your RAM. Make sure you indeed pre-allocate memory first, i.e. create the matrix at its full final size using the zeros() function rather than growing it inside the loop.
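A minimal sketch of such a pre-allocated reading loop, again assuming 512x512x160 volumes and with load_volume as a placeholder for whatever function you use to read each NIfTI file:

```matlab
nVolumes = 60;
% allocate the full 4D matrix once, directly in single precision
img4D = zeros(512, 512, 160, nVolumes, 'single');
for t = 1:nVolumes
    % load_volume stands in for your NIfTI reader;
    % cast each volume to single as it is read
    img4D(:, :, :, t) = single(load_volume(t));
end
```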