[quote MacMagus]> For graphics LZW might not be appropriate, a scanned photo will not compress, and may even get larger (and slower).
Not true. In the hundreds of thousands of LZW TIFF files that I have worked with over the years, I have never seen an image get larger due to LZW compression. An image *might* not get significantly smaller using LZW compression, but in all likelihood it will.
Well, maybe you haven't seen these, but I have, with TIFFs that were scanned photos at very high resolution with 16 bits per sample. At that kind of resolution you really don't get the repeated strings of 8-bit bytes that LZW needs in order to be effective. Besides coming out larger, they were also very slow.
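To make that concrete, here is a minimal LZW encoder sketch in Python (just the core algorithm, not the exact TIFF variant with its Clear codes and 9- to 12-bit variable code widths; the names here are mine). Feeding it highly repetitive bytes versus noisy bytes (a stand-in for 16-bit-per-sample scan data with few repeats) shows the difference: the repetitive input shrinks dramatically, while the noisy input typically comes out larger than it went in.

    import os

    def lzw_encode(data: bytes) -> list[int]:
        """Return LZW codes; the dictionary starts with all 256 single-byte strings."""
        table = {bytes([i]): i for i in range(256)}
        next_code = 256
        codes = []
        w = b""
        for b in data:
            wc = w + bytes([b])
            if wc in table:
                # Keep extending the current phrase while it's in the dictionary.
                w = wc
            else:
                # Emit the code for the longest known phrase, then learn the new one.
                codes.append(table[w])
                table[wc] = next_code
                next_code += 1
                w = bytes([b])
        if w:
            codes.append(table[w])
        return codes

    repetitive = b"ABAB" * 4096          # 16 KB full of repeated strings
    noisy = os.urandom(16384)            # 16 KB with essentially no repeats

    for name, data in (("repetitive", repetitive), ("noisy", noisy)):
        codes = lzw_encode(data)
        # Rough size estimate at a fixed 12 bits per code (real TIFF LZW grows
        # its code width from 9 to 12 bits, so this slightly overstates output).
        print(f"{name}: {len(data) * 8} bits in, about {len(codes) * 12} bits out")

The repetitive input compresses to a tiny fraction of its size because the dictionary phrases keep getting longer; the noisy input emits roughly one 12-bit code per input byte, so it expands.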
LZW is a lossless compression algorithm, so it can't do nearly as well as something like JPEG, where you can trade off quality for size.
All lossless compression algorithms have cases where the output is larger than the input. If a compression pass were guaranteed to always produce smaller output, you could apply it repeatedly and shrink any amount of data down to zero bits, which is clearly impossible: there are more possible inputs of a given length than there are shorter outputs, so some inputs must map to equal or larger outputs. Take a look at section 9.1 of the compression FAQ.
http://www.faqs.org/faqs/compression-faq/part1/
"It is mathematically impossible to create a program compressing without loss
*all* files by at least one bit (see below and also item 73 in part 2 of this
FAQ). Yet from time to time some people claim to have invented a new algorithm
for doing so. Such algorithms are claimed to compress random data and to be
applicable recursively, that is, applying the compressor to the compressed
output of the previous run, possibly multiple times. Fantastic compression
ratios of over 100:1 on random data are claimed to be actually obtained."