Question

I'm building an application that handles a lot of scanned images. These images are imported as multi-page TIFF files with JPEG compression. I split those TIFFs into single-page LZW-compressed TIFF files because the application can't handle JPEG-compressed TIFFs. The LZW files are a lot larger than the original JPEG TIFFs and load really slowly. I'm wondering if I should switch to single-page JPEG files instead of the LZW files. I'm a little scared to do this because JPEG compression is lossy and LZW is not, but I'm also not sure I gained any quality by going from JPEG TIFF to LZW TIFF in the first place. So my question is: do I lose quality by switching to JPEG instead of LZW TIFF?

Solution

By definition, switching from a lossless compression algorithm to a lossy one can lose quality. However, the fact that your source files are JPEG-compressed TIFFs means the image data has already been through lossy JPEG compression once; your LZW TIFFs can only preserve whatever survived that step, so they are no higher quality than the originals. Re-encoding to JPEG does introduce a second generation of loss, but there are other factors to take into account which will minimise the issues.
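As a first step, it's worth confirming what compression your source TIFFs actually use. Here is a minimal sketch, assuming Python with the Pillow library and a hypothetical file name scan.tif, that reads TIFF tag 259 (the compression field) for each page:

    from PIL import Image

    # TIFF tag 259 holds the compression scheme; a few common codes:
    COMPRESSION_NAMES = {1: "uncompressed", 5: "LZW", 6: "JPEG (old-style)", 7: "JPEG"}

    with Image.open("scan.tif") as tif:  # hypothetical input file
        for page in range(getattr(tif, "n_frames", 1)):
            tif.seek(page)
            code = tif.tag_v2.get(259)
            print(f"page {page}: {COMPRESSION_NAMES.get(code, f'code {code}')}")

If this reports JPEG for your imports, converting them to LZW was only preserving data that had already been lossily compressed.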

  1. Increase the quality setting of the JPEG compression. You can control how much detail is lost by changing the encoder's quality value: the higher the value, the less data is thrown away, but the less you gain in the other areas (disk space, load times, etc.). A conversion sketch using this setting follows the list.

  2. Photographs cope better with lossy compression than synthetic images do. Compression artefacts are most noticeable around hard edges, such as text, lines and flat blocks of colour in computer-generated images. A photo of a real scene has fewer of these, so the artefacts are less noticeable.
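As a minimal sketch of the conversion itself (again assuming Pillow, with hypothetical file names), this splits a multi-page TIFF into single-page JPEGs with an explicit quality setting:

    from PIL import Image, ImageSequence

    # Split a multi-page TIFF into single-page JPEGs.
    # quality=90 keeps most detail; lower values shrink files further.
    with Image.open("scan.tif") as tif:  # hypothetical input file
        for i, page in enumerate(ImageSequence.Iterator(tif)):
            # JPEG has no alpha or palette support, so normalise to RGB first
            page.convert("RGB").save(f"page_{i:03d}.jpg", "JPEG", quality=90)

Trying a few quality values (say 75, 85, 95) on representative scans is a quick way to find the size/quality trade-off that suits you.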

However, you really need to run some tests to check that the quality of the resulting images is acceptable to your customer; one way to quantify the difference is sketched below.
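For instance, here is a rough sketch (assuming Pillow and hypothetical file names) that computes the root-mean-square pixel difference between a JPEG and its lossless LZW counterpart. The number is only a crude guide, so visual inspection still matters:

    import math
    from PIL import Image, ImageChops

    def rms_difference(path_a, path_b):
        """Root-mean-square pixel difference between two same-sized images."""
        with Image.open(path_a) as a, Image.open(path_b) as b:
            diff = ImageChops.difference(a.convert("RGB"), b.convert("RGB"))
            hist = diff.histogram()  # 256 bins per band, 768 total for RGB
            n = diff.size[0] * diff.size[1] * 3
            return math.sqrt(sum(c * (i % 256) ** 2 for i, c in enumerate(hist)) / n)

    # 0 means identical pixels; small values are usually invisible in practice.
    print(rms_difference("page_000_lzw.tif", "page_000.jpg"))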
