Depends on the source. If you're getting random JPEGs from users, for example, you can definitely save 30-50% on average. There are some very good compression tools out there these days.
Not the answer to your original question, but if you want to reduce your image file size without losing much (or any) quality, I'd recommend using either Google's algorithm called Guetzli or ImageMagick. Either one of these should reduce your file size by quite a large amount. I usually run them at 85% quality and don't notice a difference.
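Roughly speaking, that translates to something like this on the command line (filenames are just placeholders); ImageMagick's convert takes -quality and Guetzli takes --quality:
convert input.jpg -quality 85 output.jpg
guetzli --quality 85 input.jpg output.jpg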
JPGs and PNGs need to be optimized because their formats are basically "antique" at this point, and most encoders, even the ones we have now, prioritize speed over file size.
For example, you can pass your JPGs through Google's Guetzli - https://github.com/google/guetzli. It provides insanely good results, but it's obviously insanely slow (some large images can take 10+ minutes to optimize).
I can't speak for JPEG2000 and XR, but at least WebP was designed from the ground up with new optimization techniques.
I don't believe there's another encoder/optimizer for WebP right now, other than Google's, so you can't exactly optimize them further - unless you count decreasing the quality of the image.
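For what it's worth, if you do go the quality route, Google's reference cwebp encoder is the usual tool (filenames here are placeholders; -q is the 0-100 quality setting):
cwebp -q 80 input.png -o output.webp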
> Google's Guetzli - https://github.com/google/guetzli
Wow, I could have used this for my class! I had trouble shrinking my image file size without resorting to shrinking the ppi and resolution :/
It's non-destructive if you supply two different filenames as shown below.
guetzli [--quality Q] [--verbose] original.jpg output.jpg
I copied that example from https://github.com/google/guetzli
It took longer than I expected to compress photos that were 3 to 5 MB in size but I'm pleased with the results.
That's a good point! But also consider the CPU cost:
> I tried compressing a 7.8MB JPG with --quality 84 and it took nearly 20 minutes.
> I also tried a 1.4MB JPG with --quality 85 and it took nearly 10 minutes.
> I am on Ubuntu 16.04 LTS, Intel Core i7-4790K CPU @ 4.00GHz
https://github.com/google/guetzli/issues/50
I'd say storage would be cheaper. Or third-party compression may provide better cost savings... it all depends on the situation!
I agree on the PNG optimization part, zopflipng is the best. And it is lossless, unless you use color space optimization. You can reduce PNG files even further by running e.g. pngquant first and then zopflipng. There is a tool called Crunch that automates these PNG optimizations.
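As a rough sketch, that two-step pipeline looks something like this (filenames are placeholders; pngquant does the lossy palette reduction, zopflipng the final lossless squeeze):
pngquant --quality=65-80 --output temp.png input.png
zopflipng -m temp.png output.png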
Now for JPEG, guetzli looks interesting, but it is lossy compression. You'd be compressing a JPEG, which already has some artefacts, a second time, so there will be more artefacts.
For JPEG there is another way to reduce the file size: progressive JPEG. That's lossless; you can convert it back to baseline with no loss. On the command line you can use
jpegtran -copy all -progressive input.jpg > output.jpg
and to revert to optimized baseline:
jpegtran -copy all -optimize input.jpg > output.jpg
Fun fact: guetzli produces only baseline JPEGs, so making them progressive afterwards saves even more!
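In practice that means a two-step pipeline like this (placeholder filenames): let guetzli do the re-encode, then have jpegtran losslessly rewrite it as progressive:
guetzli --quality 84 input.jpg temp.jpg
jpegtran -copy all -progressive temp.jpg > output.jpg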
Let's take some random picture. It is 5195x3269 and it is already progressive. It has been compressed with quality 73, as evidenced by identify -format '%Q' file.jpg. Guetzli cannot be used with a quality lower than 84; it says:
Guetzli should be called with quality >= 84, otherwise the
output will have noticeable artifacts. If you want to
proceed anyway, please edit the source code.
Guetzli processing failed
OK, we use 84.
Original size: 2,994,140 bytes (progressive)
Baseline optimized: 3,066,729
Progressive: 2,989,290
Recompress 84 baseline: 2,766,614
Recompress 84 progressive: 2,702,589
Guetzli 84:
Guetzli 84 progressive:
...it takes too long, will post tomorrow
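For scale, the best result posted so far (recompress at 84, progressive) works out to 2,702,589 / 2,994,140 ≈ 0.903 of the original, i.e. roughly a 9.7% saving, even before the Guetzli numbers come in.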
I want to re-emphasize this to anyone reading, because it's really goddamn important: Go online and look at some of the comparisons.
What Google is doing isn't a downsample, and it isn't a reduced-fidelity re-compression.
What they're doing is trading a fast, minimum-returns compression algorithm (great for a phone!) for a slow, maximum-returns one (only possible on a server!).
The actual difference in image quality is negligible.
When the show started, they were looking for something that could plausibly be done but hadn't been figured out yet. It's possible to get better lossless compression, but no one knows when.
However, last year this happened: https://github.com/google/guetzli/ - Google figured out how to get 20-30% more compression out of JPG images (which is lossy). So I guess it did happen, just lossy instead of lossless. Lossless is harder and means the compression doesn't lose any data. You don't need an image to be pixel-perfect, but you do need a file to be byte-perfect.
No, the encoder is based on mozjpeg with jpeg-archive implementing several different evaluation methods (rather than only butteraugli).
There is some discussion here comparing them; apparently Guetzli is only optimized for Google's own scoring method, which doesn't necessarily reflect how humans judge image quality.
But most importantly, Guetzli is super slow, while mozjpeg gives you similar (or even better) quality with good performance, which fits much better into a Lightroom workflow.
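For anyone wanting to try mozjpeg outside of Lightroom, a rough command-line equivalent (placeholder filenames; djpeg and cjpeg are the tools mozjpeg ships) would be to decode and re-encode at your chosen quality:
djpeg input.jpg | cjpeg -quality 80 > output.jpg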
Nothing is forever. I never did believe it in the beginning, as "store a lifetime of photos" was a clause with asterisks (limited to 16Mpix; lossy compression using their Guetzli perceptual JPEG encoder).
> I did some research and found, notably, the new Facebook compression algorithm, zstd. What do you think?
General-purpose lossless compression isn't useful for photos -- JPEG data is already entropy-coded, so there's almost no redundancy left to remove, and in some cases it can even increase the file size.
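If you want to check for yourself, run something like the following (filename is a placeholder) and compare the sizes; on a typical JPEG the .zst output will be about the same size, sometimes even a touch larger:
zstd -19 photo.jpg -o photo.jpg.zst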
If they're in an archival format, and you don't want to compromise the image quality, you might get some very marginal benefit from general-purpose compression, but you're probably better off just archiving them to a NAS or even optical media -- optical discs have no moving parts and are electrically/magnetically inert, so they're still the best format for long-term archiving, and BD-R discs are fairly cheap.
If you don't mind potentially losing small amounts of quality, and the photos are already JPEGs, you can try using a high-efficiency JPEG encoder like guetzli to recompress your images.
What about JPEG 2000? It also offers improved compression over JPEG, plus transparency, which should be a much better fit for the PNGs.
Here is a library with no further dependencies: https://github.com/faceless2/jpeg2000
Also, when sticking with JPEG, there are now much better JPEG compressors available than the standard old ones, offering around 20 percent smaller size for the same quality (e.g. https://github.com/google/guetzli).
If you have uncompressed originals that you want to encode into JPEGs and lots of CPU time to burn, you should look into guetzli. It has some funky weird aspects to it, though. From the README:
> Note: Guetzli uses a large amount of memory. You should provide 300MB of memory per 1MPix of the input image.
>
> Note: Guetzli uses a significant amount of CPU time. You should count on using about 1 minute of CPU per 1 MPix of input image.
>
> Note: Guetzli assumes that input is in sRGB profile with a gamma of 2.2. Guetzli will ignore any color-profile metadata in the image.
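To put those rules of thumb in perspective: a single 24 MPix photo would want roughly 7 GB of RAM (24 x 300MB) and about 24 minutes of CPU time.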
For JPEGs I can highly recommend https://github.com/google/guetzli - although it takes quite a bit of processing power and time, the results are really sweet. I built it on my build server and have it run a queue. New images in my applications get put into the queue, and the build server compresses them in small batches and puts them back into the applications.
In the future I want to have a dedicated server for it, but for now it does just fine.
For PNGs I use pngquant the same way as guetzli.
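In case it's useful to anyone, a bare-bones sketch of such a queue loop could look like this (the queue/ and done/ directories are hypothetical paths; --quality 84 is just guetzli's minimum):
for f in queue/*.jpg; do guetzli --quality 84 "$f" "done/$(basename "$f")" && rm "$f"; done
for f in queue/*.png; do pngquant --force --output "done/$(basename "$f")" -- "$f" && rm "$f"; done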
Others have mentioned using image sets and I suggest you try them out! Lazy loading was also mentioned before! Furthermore, look into more modern image formats as well :) WebP, JPEG 2000, etc.
Not file formats, but tools to help you losslessly compress images...
Guetzli is probably one of the best, if not the best, at getting as close to the theoretical minimum size as possible, but it takes the longest time.
https://github.com/google/guetzli/
ImageOptim is a nice GUI for macOS that implements guetzli (disabled by default) as well as many other image tools to get the image size as low as you can, both lossless and lossy (lossless by default).
Don't use macOS? No problem. Instead of listing a bunch more, why don't I just link you to the ImageOptim page that already has them listed for me!
https://imageoptim.com/versions
I hope this helps!
Someone needs to tell Nelson that there's a handy open-source utility on SourceForge called "File Optimizer" that stacks some of the best ported image optimizers to get nearly unbeatable results.
The only thing that it was missing last time I looked was support for running guetzli (Google's util) on JPG files at maximum lossy settings (80% quality?).
So if you go for JPG, I'd run guetzli first:
guetzli_windows_x86-64.exe --quality 84 input.jpg output.jpg
... and then pass the output file off to File Optimizer to clean up any metadata and apply the classic optimizations that guetzli skips.