
Have you played with Mozjpeg? (image optimization)
02-02-2015, 12:46 AM
Post: #1
Have you played with Mozjpeg? (image optimization)
If you don't know, Mozjpeg is a modified version of libjpeg-turbo whose tools compress JPEG images more than standard jpegtran/jpegoptim without any loss of quality (lossless).

But what's really magical about Mozjpeg is the cjpeg utility, which encodes the JPEG files - this is where you can really save bytes without losing much quality.

You can read more about mozjpeg here:

I've been playing around with mozjpeg (mostly cjpeg) to see if I can optimize my clients' images more or less automatically without much loss of quality.

In this example, I'm only using one image for quality checks.
This JPEG was saved at a 99% quality factor and weighs about 500kB (huge).

The JPEG is a huge background image in a "hero" area above the fold.

Here are the results of my experiments:
-rw-r--r--  1 sigurdur sigurdur  480142 feb  1 11:34 home-page-image-big-1jpegtranmozjpeg.jpg
-rw-r--r--  1 sigurdur sigurdur  481715 feb  1 11:34 home-page-image-big-1jpegtranorig.jpg
-rw-r--r--  1 sigurdur sigurdur   74494 feb  1 02:26 home-page-image-big-1mozjpegcjpeg75.jpg
-rw-r--r--  1 sigurdur sigurdur   79487 feb  1 12:27 home-page-image-big-1mozjpegcjpeg80.jpg
-rw-r--r--  1 sigurdur sigurdur   95119 feb  1 12:28 home-page-image-big-1mozjpegcjpeg85.jpg
-rw-r--r--  1 sigurdur sigurdur  126882 feb  1 12:28 home-page-image-big-1mozjpegcjpeg90.jpg
-rw-r--r--  1 sigurdur sigurdur  183337 feb  1 12:29 home-page-image-big-1mozjpegcjpeg95.jpg
-rw-r--r--  1 sigurdur sigurdur   83587 feb  1 12:51 home-page-image-big-1mozjpeg75-notrellis.jpg
-rw-r--r--  1 sigurdur sigurdur  102665 feb  1 12:51 home-page-image-big-1mozjpeg75-notrellis-sample1x1.jpg
-rw-r--r--  1 sigurdur sigurdur   90285 feb  1 02:32 home-page-image-big-1mozjpeg75-psnr.jpg
-rw-r--r--  1 sigurdur sigurdur   91033 feb  1 12:48 home-page-image-big-1mozjpeg75-sample1x1.jpg
-rw-r--r--  1 sigurdur sigurdur   96669 feb  1 02:33 home-page-image-big-1mozjpeg75-ssim.jpg
-rw-r--r--  1 sigurdur sigurdur   64634 feb  1 02:33 home-page-image-big-2mozjpeg75-ms-ssim.jpg
-rw-------  1 sigurdur sigurdur   85863 feb  1 02:04 home-page-image-big-75.jpg
-rw-------  1 sigurdur sigurdur   99570 feb  1 02:04 home-page-image-big-80.jpg
-rw-------  1 sigurdur sigurdur  118615 feb  1 13:45 home-page-image-big-85.jpg
-rw-------  1 sigurdur sigurdur  152589 feb  1 13:45 home-page-image-big-90.jpg
-rw-r--r--  1 sigurdur sigurdur  502793 feb  1 01:36 home-page-image-big.jpg
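For reference, the cjpeg variants in that listing come from invocations along these lines (a dry-run sketch: the echo prints each command instead of running it, so drop the echo once mozjpeg's cjpeg is on your PATH):

```shell
# dry-run sketch of the quality sweep: echo prints each cjpeg command;
# remove the echo to actually run mozjpeg's cjpeg
src=home-page-image-big.jpg
for q in 75 80 85 90 95; do
  echo "cjpeg -quality $q -optimize -progressive" \
       "-outfile home-page-image-big-1mozjpegcjpeg$q.jpg $src"
done
```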

So originally the image is 502793 bytes, and if we use jpegtran/jpegoptim to losslessly optimize it, it shrinks to 481715 bytes.
If we use Mozjpeg 3.0's jpegtran, the image losslessly shrinks to 480142 bytes (~1.5kB smaller than with the original jpegtran).

The image is still friggin' huge (because of the 99% quality factor).

Well, after testing how various settings affect this image's size, quality and download speed, I've found that 90% quality with mozjpeg (or 85% quality with "normal" compression tools) probably gives the best quality/size trade-off.
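To put numbers on that trade-off, here's a quick sketch that turns the cjpeg byte counts from the listing above into percentages of the 502793-byte original:

```shell
# size of each mozjpeg cjpeg variant as a percentage of the 502793-byte original
awk 'BEGIN {
  orig = 502793
  n = split("75:74494 80:79487 85:95119 90:126882 95:183337", rows, " ")
  for (i = 1; i <= n; i++) {
    split(rows[i], kv, ":")
    printf "q%s: %6d bytes = %4.1f%% of original\n", kv[1], kv[2], kv[2] / orig * 100
  }
}'
```

Even the Q95 file is only about a third of the original's size.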

The quality difference between "normal" JPEG at quality 85 and mozjpeg at quality 90 is huge.
Even comparing Mozjpeg Q85 against normal Q85, the mozjpeg file looks to me like it has fewer artifacts (due to some smoothing algorithm, or my internal bias).

If we calculate the download-time difference between Q85 and Q90 (31kB) on Cable (5Mbps), we find that we only add about 60ms:
31 ÷ (5000 × 0.85 ÷ 8) ≈ 0.058 s

Of course, if multiple files like this were needed for the initial download, the extra 60ms per file wouldn't be worth it, but since this is the only big image, it is.
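Spelled out, that back-of-the-envelope calculation looks like this (the 0.85 factor is the assumed effective throughput after protocol overhead):

```shell
# transfer time for 31 extra kilobytes on a 5 Mbps cable connection,
# assuming ~85% effective throughput
awk 'BEGIN {
  extra_kb = 31
  kbps     = 5000 * 0.85          # effective kilobits per second
  seconds  = extra_kb * 8 / kbps  # kilobytes -> kilobits, then divide
  printf "%.0f ms\n", seconds * 1000
}'
```

That comes out to about 58 ms, i.e. the "about 60ms" quoted above.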

Mozjpeg is better at lossless compression (0-10% smaller than jpegoptim) of JPEG files.
cjpeg from Mozjpeg is a fantastic way to do extreme lossy compression without losing much quality.

Have you played with Mozjpeg?

All websites deserve to load fast.
02-02-2015, 02:09 AM
Post: #2
RE: Have you played with Mozjpeg? (image optimization)


I'm modifying my automation script to optimize images that are above 90% quality, and was testing the "identify" command from ImageMagick..

Look at what it thinks about the 90% quality file from cjpeg vs a "normal" 90% quality file:
sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ identify -format "%Q" home-page-image-big-1mozjpegcjpeg90.jpg
sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ identify -format "%Q" home-page-image-big-90.jpg

The identify tool reverse-engineers the image to figure out the quality setting .. but it doesn't seem to know how to deal with mozjpeg's modifications .. (I need to post a msg to the ImageMagick group)

That sparked another test idea in my mind .. what would happen if I used the same quality setting as the original image ..? (99%)

sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ mozjpeg/cjpeg -quality 99 -optimize -progressive -outfile home-page-image-big-1mozjpegcjpeg99.jpg home-page-image-big.jpg
sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ ll home-page-image-big-1mozjpegcjpeg99.jpg home-page-image-big.jpg
-rw-r--r-- 1 sigurdur sigurdur 364389 feb  1 15:51 home-page-image-big-1mozjpegcjpeg99.jpg
-rw-r--r-- 1 sigurdur sigurdur 481715 feb  1 01:36 home-page-image-big.jpg
sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ identify -format "%Q" home-page-image-big-1mozjpegcjpeg99.jpg
sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ identify -format "%Q" home-page-image-big.jpg

Same quality factor at encoding time, but the calculated quality factor is different (-1)

What if we measure the actual difference?
sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ compare  home-page-image-big.jpg home-page-image-big-1mozjpegcjpeg99.jpg 99diff.png
sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ compare -verbose -metric mae home-page-image-big.jpg home-page-image-big-1mozjpegcjpeg99.jpg 99diff.png
home-page-image-big.jpg JPEG 1600x700 1600x700+0+0 8-bit DirectClass 482KB 0.080u 0:00.080
home-page-image-big-1mozjpegcjpeg99.jpg JPEG 1600x700 1600x700+0+0 8-bit DirectClass 364KB 0.060u 0:00.059
Image: home-page-image-big.jpg
  Channel distortion: MAE
    red: 175.566 (0.00267897)
    green: 103.89 (0.00158526)
    blue: 193.894 (0.00295863)
    all: 157.783 (0.00240762)
home-page-image-big.jpg=>99diff.png JPEG 1600x700 1600x700+0+0 8-bit DirectClass 1.356MB 0.770u 0:00.399

less than 0.3% difference
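For the curious, that percentage comes straight from ImageMagick's normalized MAE: the raw value is in 16-bit quantum units, and dividing by QuantumRange (65535) reproduces the fraction in parentheses:

```shell
# the "all" MAE of 157.783 is in 16-bit quantum units; normalizing by
# QuantumRange (65535) gives the fractional difference ImageMagick prints
awk 'BEGIN {
  mae = 157.783
  printf "%.2f%% difference\n", mae / 65535 * 100
}'
```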

Well .. what if we try creating a "normal" 98% quality image and measure the difference..

sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ cp home-page-image-big.jpg home-page-image-big-q98.jpg
sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ jpegoptim -m98 --strip-all home-page-image-big-q98.jpg
home-page-image-big-q98.jpg 1600x700 24bit P JFIF  [OK] 481715 --> 327266 bytes (32.06%), optimized.
sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ compare  home-page-image-big.jpg home-page-image-big-q98.jpg 99-98diff.png
sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ compare -verbose -metric mae home-page-image-big.jpg home-page-image-big-q98.jpg 99-98diff.png
home-page-image-big.jpg JPEG 1600x700 1600x700+0+0 8-bit DirectClass 482KB 0.080u 0:00.070
home-page-image-big-q98.jpg JPEG 1600x700 1600x700+0+0 8-bit DirectClass 327KB 0.050u 0:00.059
Image: home-page-image-big.jpg
  Channel distortion: MAE
    red: 203.366 (0.00310317)
    green: 123.291 (0.00188131)
    blue: 224.797 (0.00343019)
    all: 183.818 (0.00280489)
home-page-image-big.jpg=>99-98diff.png JPEG 1600x700 1600x700+0+0 8-bit DirectClass 1.343MB 0.780u 0:00.410
sigurdur@sigurdur-ThinkPad-W510 ~/tmp $ identify -format "%Q" home-page-image-big-q98.jpg

So that produces a slightly smaller file with slightly more change than the mozjpeg 99% conversion.

So .. mozjpeg is effectively reducing the quality setting (judging by the reverse-engineered quantization tables), but keeps visual quality high compared to a "normal" quality conversion.

Well, I see no reason not to use mozjpeg - in fact, I see every reason to use it to optimize images .. maybe a mozjpeg developer can chime in?

02-03-2015, 10:36 PM
Post: #3
RE: Have you played with Mozjpeg? (image optimization)
Kornel (the creator of ImageOptim and pngquant) replied to my question about JPEG quality (as estimated by ImageMagick).

Hats off to him for doing that.

He also created a utility to compare the human-perceived visual quality of images.

Quote:Internally JPEG doesn't actually use a single number for quality. It uses quantization tables which are like 64 individual quality settings per channel, affecting different kinds of details in the image.

How a single quality setting is translated to quantization tables (and back) is arbitrary, and a bit of an art. There isn't a single perfect way (that we know of).

My guess is that ImageMagick assumes JPEGs are encoded with libjpeg's quantization table, but mozjpeg by default uses a differently-tuned quantization table.

In this case lower number doesn't necessarily mean actually lower quality, but may mean that ImageMagick's quality-guessing algorithm doesn't do a good job on quantization tables it wasn't designed for.

And finally, the JPEG quality setting is a very blunt tool and only a rough approximation of actual perceived quality. Some images look good with quality set to 30, some images look awful at a quality setting of 70. There are even edge cases where lowering the JPEG quality gives a nicer-looking image.

To compare the quality of very similar images you need to use a tool that tries to approximate human vision. I wrote a tool and a guide for this.

regards, Kornel
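To make Kornel's point concrete: classic libjpeg (whose tables ImageMagick's %Q guess presumably assumes) maps the quality setting to a single scale factor applied to its base quantization tables, roughly like this sketch:

```shell
# classic libjpeg quality -> quantization scaling (jpeg_quality_scaling),
# applied here to 16, the first entry of the base luminance table
q=90
awk -v q="$q" 'BEGIN {
  scale = (q < 50) ? 5000 / q : 200 - 2 * q
  v = int((16 * scale + 50) / 100)   # round, then clamp to 1..255
  if (v < 1) v = 1
  if (v > 255) v = 255
  printf "quality %d -> scale %d, first luma quant value %d\n", q, scale, v
}'
```

An encoder like mozjpeg that starts from differently tuned tables will produce values this mapping can't recover exactly, which is presumably why identify's guess comes out a notch low.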

04-04-2015, 01:59 PM
Post: #4
RE: Have you played with Mozjpeg? (image optimization)
Hi sig, have you played around with jpegmini at all? It tends to do a great job, but it hides the settings it uses.
04-04-2015, 09:19 PM
Post: #5
RE: Have you played with Mozjpeg? (image optimization)
Hey avalanch,

Yeah .. they use a patent-pending algorithm to change the quantization tables somehow.
I think they pick the q-tables on the fly by analyzing the image: high-quality compression in the areas where the eye is likely to look, and lower-quality compression where it won't.

So this is kinda like what mozjpeg is doing, but it seems to go further than mozjpeg.

It's kinda like mixing Kornel's dssim tool with mozjpeg to find the ultimate mix of quality and compression.
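A dry-run sketch of that mix (echo prints the commands instead of running them; the 0.01 threshold is a made-up cut-off, and a dssim build that accepts JPEG input is assumed):

```shell
# walk cjpeg quality downward, scoring each result against the original with
# dssim; printed as a dry run - drop the echos to run it for real
src=home-page-image-big.jpg
for q in 95 90 85 80 75; do
  echo "cjpeg -quality $q -optimize -outfile try-$q.jpg $src"
  echo "dssim $src try-$q.jpg   # keep going while the score stays under ~0.01"
done
```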

Here are some very intelligent answers about the jpegmini compression on Quora:
