np.diff on big-endian data seems slow


I am selecting arrays from a FITS file like so:

import numpy as np
from astropy.io import fits

path = "cube.fits"
f = fits.open(path)
cube = f[0].data

print(cube.shape)

out: (4608, 3072, 3072)

I've noticed that the dtype of this cube is big-endian:

print(cube.dtype)

out: dtype('>f4')
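
For what it's worth, a quick sanity check with dtype.isnative confirms the cube is not in my machine's native byte order (my machine is little-endian, so native float32 would be '<f4'):

print(cube.dtype.isnative)

out: False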

When I grab a single array from the cube and perform NumPy operations on it, then (based on some Stack Overflow browsing) I can only assume NumPy first converts the data to little-endian (is that true?). In any case, np.diff on that array seems very slow:

test_data = cube[:, np.random.randint(3072), np.random.randint(3072)]
diff_array = np.diff(test_data)
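
I haven't done any careful benchmarking; roughly, I'm just wrapping those two lines with perf_counter like this (the exact numbers will of course depend on my disk and machine):

import time

start = time.perf_counter()
test_data = cube[:, np.random.randint(3072), np.random.randint(3072)]
diff_array = np.diff(test_data)
print(f"elapsed: {time.perf_counter() - start:.2f} s")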

Converting the entire cube up front with cube = f[0].data.astype(np.float32) also looks like it would take an exceptionally long time, given how long computing diff_array alone takes.
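
One workaround I considered is converting just the extracted slice (rather than the whole cube) to native byte order before the diff, along these lines, though I don't know whether endianness is actually the bottleneck:

# astype with a native-order dtype copies the slice and reorders the bytes
native = cube.dtype.newbyteorder("=")
test_data = cube[:, np.random.randint(3072), np.random.randint(3072)].astype(native)
diff_array = np.diff(test_data)

# alternatively, byteswap the values and reinterpret the dtype:
# test_data = test_data.byteswap().view(test_data.dtype.newbyteorder())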

Is this common behaviour with no workaround, or am I mishandling these operations entirely, and is my hypothesis that this is due to endianness conversion simply wrong?

Any insight into this would be really appreciated, thank you!
