WebPagetest Forums - JPEGs optimization


Hi all.

I have found these two papers:
which present a compression method, fully compatible with existing JPEG decoders, that achieves a 30% reduction in file size without noticeable degradation and is fast (about 0.2 seconds for typical images).

It is also claimed that this algorithm achieves the maximum compression possible for encoders that remain fully compatible with existing JPEG decoders.

As Photoshop does not offer it, do you know of any software that implements this algorithm?
You couldn't have asked at a better time - this just rolled across TechCrunch:
(09-07-2011 02:21 AM)pmeenan Wrote: You couldn't have asked at a better time - this just rolled across TechCrunch:

I'm not sure that jpegmini is using those kinds of "free" (i.e. encoding-level) optimizations (joint optimization of the quantization table + run-length encoding + Huffman table). Last time I tried it, it seemed to me that it was optimizing the image itself, not the encoding, to get the ~30% size reduction. I'll give it another try and see.
Yeah, I haven't seen any of the advanced optimizations applied to libjpeg (which is what just about everything except Photoshop is based on). jpegtran can do some basic Huffman table optimization to squeeze a little space out of images, but not on the order of what the researchers were seeing.
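For anyone curious what "Huffman table optimization" means here: jpegtran's -optimize pass replaces libjpeg's default tables with ones derived from the image's own symbol statistics. A minimal Python sketch of the underlying idea (this is not jpegtran's code; the symbol stream below is made up to be skewed, like a real JPEG scan's run-length symbols):

```python
# Why a per-image Huffman table saves space: build code lengths from the
# actual symbol frequencies and compare against a fixed-length code.
import heapq
from collections import Counter

def huffman_code_lengths(freqs):
    """Return {symbol: code length} for a Huffman code over `freqs`."""
    heap = [(f, i, (sym,)) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    lengths = {sym: 0 for sym in freqs}
    i = len(heap)
    while len(heap) > 1:
        f1, _, syms1 = heapq.heappop(heap)
        f2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:     # every symbol in a merged subtree
            lengths[s] += 1         # sinks one level deeper
        heapq.heappush(heap, (f1 + f2, i, syms1 + syms2))
        i += 1                      # unique tiebreaker, avoids comparing tuples
    return lengths

# Skewed symbol stream, standing in for a JPEG scan's coded symbols.
data = "a" * 60 + "b" * 25 + "c" * 10 + "d" * 5
freqs = Counter(data)
lengths = huffman_code_lengths(freqs)

optimized_bits = sum(freqs[s] * lengths[s] for s in freqs)
fixed_bits = len(data) * 2   # a fixed 2-bit code for 4 symbols

print(optimized_bits, fixed_bits)  # prints 155 200
```

The more skewed the symbol distribution, the bigger the win; on real images the gain from this pass alone is small (a few percent), which matches the "little space" observation above.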
I believe that jpegmini is not using any joint optimization of quantization tables + run-length encoding + Huffman encoding, because by using optimized quantization tables alone I get significantly smaller images without visible artifacts. At best, I believe jpegmini is applying some smoothing.

To reach this conclusion, I used the following quantization table optimizer:

It is described in this paper: Extending RD-OPT with Global Thresholding for JPEG Optimization
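To see why the quantization table has such leverage over file size, here is a toy example (the coefficients and both tables below are invented for illustration, not taken from the RD-OPT paper): a coarser table zeroes out more DCT coefficients, and long runs of zeros are exactly what the run-length + Huffman stage compresses best.

```python
# Toy illustration of quantization-table impact on JPEG size.
def quantize(coeffs, table):
    """Divide each DCT coefficient by its quantizer step and round."""
    return [round(c / q) for c, q in zip(coeffs, table)]

# One 16-coefficient slice of a hypothetical DCT block, low to high frequency.
coeffs = [1015, -120, 86, -57, 31, -14, 9, -4, 3, -2, 1, 1, 0, 0, 0, 0]

fine   = [16] * 16                              # uniform, fine steps
coarse = [16, 18, 24, 32, 40, 48, 56, 64,
          72, 80, 88, 96, 104, 112, 120, 128]   # coarser at high frequencies

fine_zeros   = quantize(coeffs, fine).count(0)
coarse_zeros = quantize(coeffs, coarse).count(0)
print(fine_zeros, coarse_zeros)  # prints 9 11
```

An RD-OPT-style optimizer searches for table entries that maximize this zeroing (rate) while keeping the rounding error (distortion) below a visibility threshold, which is why tuned tables can shrink files with no visible artifacts.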

Curiously, the authors of the two papers I mentioned before have published another paper in which they achieve a 40-45% file size reduction using joint optimization of quantization tables + run-length encoding + arithmetic encoding:

Given that the respective patents are 8 and 6 years old:

I am starting to wonder whether this research, funded with public money, will ever be turned into a JPEG compressor for sale on the market that uses those algorithms.
Of the open source tools, I am still a big fan of jpegoptim and jpegtran.

Sorry if I am hijacking this thread, but it's a very closely related question: what is the best image resizing and compression library that has been exposed via PHP? Currently we're using Imagick; are there any better ones out there?

P.S. Yes, I know you can call any on-system executable via PHP, but I have that ability disabled for security's sake, so it has to be exposed through PHP.
We all know too well that JPEG images are a main cause of page bloat, due to the increased use of images and the inefficiency of the JPEG compression algorithm.

On the other hand, PackJPG is lossless compression software, now open source under the terms of the LGPL v3, that typically reduces the file size of a JPEG by 23%-24% [cf. packJPG Usage Information].
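It's worth noting why a format-aware recompressor like PackJPG is needed at all: JPEG output is already entropy-coded, so the generic DEFLATE used by the gzip Content-Encoding gains essentially nothing on it. A quick simulation with high-entropy bytes standing in for a JPEG's Huffman-coded payload (seeded pseudo-random data, not a real JPEG):

```python
# Generic DEFLATE cannot shrink already entropy-coded data; a JPEG-aware
# model (as in PackJPG) is required to recover the remaining ~23%.
import random
import zlib

random.seed(42)
payload = bytes(random.randrange(256) for _ in range(100_000))

deflated = zlib.compress(payload, 9)
ratio = len(deflated) / len(payload)
print(f"deflate ratio on entropy-coded data: {ratio:.3f}")
```

The ratio comes out at or slightly above 1.0, i.e. gzip actually adds overhead, which is why serving JPEGs with Content-Encoding: gzip is pointless today.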

Some questions follow:
  • Does anyone know of browsers already implementing this type of compression for JPEG images?
  • Are there ways to avoid the IE Vary problem?
  • Is there an HTTP 2.0 way of allowing cacheable content negotiation that includes this [PackJPG] lossless compression algorithm?
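The questions above boil down to standard content negotiation. A hedged sketch of how a server could negotiate a hypothetical "packjpg" Content-Encoding (no browser actually sends this token; the header values here are purely illustrative):

```python
# Sketch: choose a response encoding from Accept-Encoding and emit the
# Vary header that shared caches need to store both variants safely.
def negotiate(request_headers):
    """Return response headers for a JPEG under hypothetical packjpg negotiation."""
    accepted = [t.strip().split(";")[0]
                for t in request_headers.get("Accept-Encoding", "").split(",")]
    encoding = "packjpg" if "packjpg" in accepted else "identity"
    return {
        "Content-Type": "image/jpeg",
        "Content-Encoding": encoding,
        "Vary": "Accept-Encoding",  # key the cache entry on the request header
    }

print(negotiate({"Accept-Encoding": "gzip, packjpg"})["Content-Encoding"])  # packjpg
print(negotiate({"Accept-Encoding": "gzip, deflate"})["Content-Encoding"])  # identity
```

The Vary: Accept-Encoding line is exactly what triggers the IE caching problem mentioned above: older IE versions refuse to cache responses carrying Vary values they don't special-case, so the negotiation cost falls on those clients.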
IMO, JPEGs are the elephant in the room: paradoxically, they remain unoptimized while time is wasted on higher-order optimizations that yield lower returns.
Lena... I am not sure I agree. JPEG is actually, I think, the most optimized lossy compression format. The only way to reduce it further is through binary compression; that may shrink the file size, but it increases the CPU cost of decompression, which only serves to increase browser rendering time. Paletted formats like .gif and 8-bit PNG can be more efficient, but only because they encode the image data with a limited number of colors in the palette. And let's not forget vector-based formats; but as you are probably aware, neither serves well for photographs or graphics containing gradations. I would say it's more about choosing the best tool for the job than anything else.
The problem with PackJPG (as best as I can tell) is that the output is a compressed file that is no longer a JPEG and needs to be decompressed with PackJPG. At that point you're either trying to get a new image format adopted or a new content-encoding (akin to gzip for text).

At that point you're better off using WebP, which already has decent browser support and shows better improvement than losslessly compressing JPEGs.