Question

I'm using Python 2.7.3 with numpy and pyfits to process scientific FITS files. I would like to work on the images at half or one quarter resolution for the sake of speed, and have this code:

import numpy
import pyfits

# Read red image
hdulist = pyfits.open(red_fn)
img_data = hdulist[0].data
hdulist.close()
img_data_r = numpy.array(img_data, dtype=float)
# Scale it down to one quarter size
my=[]
for line in img_data_r[::4]:
    myline=[]
    for item in line[::4]:
        myline.append(item)
    my.append(myline)
img_data_r = my

This works, but I wonder if there is a faster, more native way to reduce the array. The reduction should happen as early as possible, so that all subsequent processing works on the smallest acceptable amount of data. If there were a way of reading a reduced dataset with pyfits, that would be ideal. But such a method doesn't seem to exist (correct me if I'm wrong). How about numpy? Or scipy/math/anything else?

Solution

The data array you get from pyfits already is a NumPy array, so you don't need to create a new one from it (numpy.array makes a copy). Moreover, you can do the downsampling in a single step:

img_data_r = hdulist[0].data[::4, ::4]

This won't copy the data; it simply creates a new view with different strides. If you need the down-sampled image as a contiguous array, use numpy.ascontiguousarray().
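A quick way to convince yourself of this, using a small stand-in array in place of the real image data (the names here are illustrative, not from the original code):

import numpy

a = numpy.arange(64, dtype=float).reshape(8, 8)  # stand-in for the image
v = a[::4, ::4]                  # a view: no pixel data is copied
print v.base is a                # True: v shares a's memory
print v.strides, a.strides       # v's strides are 4x larger in each axis
c = numpy.ascontiguousarray(v)   # forces a contiguous copy
print c.flags['C_CONTIGUOUS']    # True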

This method of downsampling keeps only one pixel in sixteen and completely discards the information in all the others. If you need higher-quality downsampling, rather than doing it in your code, you are probably better off downsampling your FITS files with ImageMagick. This will also reduce the time it takes to read the files from disk.
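If you would rather keep everything in NumPy, block averaging is a common middle ground: it averages each 4x4 block instead of discarding fifteen of every sixteen pixels. A minimal sketch (block_mean is a hypothetical helper; it assumes both image dimensions are exact multiples of the factor):

import numpy

def block_mean(img, factor=4):
    # Average each factor x factor block of a 2-D array; assumes both
    # dimensions of img are exact multiples of factor.
    h, w = img.shape
    return img.reshape(h // factor, factor,
                       w // factor, factor).mean(axis=3).mean(axis=1)

# e.g. img_small = block_mean(img_data_r) on the full-resolution float array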

To convert all your FITS files in the current directory in place (warning: the full-size versions get overwritten), you could use:

mogrify -resize 25% *.fits
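
If you would rather not overwrite the originals, mogrify can also write the resized copies into a separate directory with its -path option (the target directory must already exist), for example:

mkdir small
mogrify -path small -resize 25% *.fits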