JPEGs optimization
09-03-2011, 08:44 PM
Post: #1
JPEGs optimization
Hi all.

I have found these two papers:
  • http://ieeexplore.ieee.org/iel5/83/4711346/04682736.pdf?arnumber=4682736
  • http://ieeexplore.ieee.org/iel5/4378863/4379219/04379276.pdf?arnumber=4379276
which present a compression method, fully compatible with existing JPEG decoders, that achieves roughly a 30% reduction in file size without noticeable degradation; the method is also fast (about 0.2 seconds for typical images).

It is also claimed that this algorithm reaches the maximum compression attainable by any JPEG encoder that remains fully compatible with existing decoders.

As Photoshop does not have it, do you know of any software that implements this algorithm?
09-07-2011, 02:21 AM
Post: #2
RE: JPEGs optimization
You couldn't have asked at a better time - this just rolled across TechCrunch: http://techcrunch.com/2011/09/06/new-sta...r-quality/
09-07-2011, 09:59 PM
Post: #3
RE: JPEGs optimization
(09-07-2011 02:21 AM)pmeenan Wrote:  You couldn't have asked at a better time - this just rolled across TechCrunch: http://techcrunch.com/2011/09/06/new-sta...r-quality/

I'm not sure that jpegmini is using those kinds of "free" (i.e. encoding-side) optimizations (joint optimization of the quantization table + run-length encoding + Huffman table). Last time I tried it, it seemed to be optimizing the image itself rather than the encoding (+30% size reduction). I'll give it another try and see.
09-08-2011, 12:03 AM
Post: #4
RE: JPEGs optimization
Yeah, I haven't seen any of the advanced optimizations applied to libjpeg (which is what just about everything except Photoshop is based on). jpegtran can do some basic Huffman-table optimization to squeeze a little space out of images, but not on the order of what the researchers were seeing.
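
To make the Huffman point concrete: the default tables in JPEG's Annex K are tuned for a generic "typical" image, while jpegtran's optimization recomputes a table from the actual symbol statistics of the file at hand. A minimal Python sketch of why per-image tables win (the symbol frequencies here are made up for illustration):

```python
import heapq

def huffman_lengths(freqs):
    """Build a Huffman tree and return {symbol: code length in bits}."""
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        # Every symbol under the merged node sits one bit deeper in the tree.
        merged = {s: d + 1 for s, d in {**left, **right}.items()}
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

# Hypothetical, heavily skewed symbol counts for one particular image.
freqs = {"a": 90, "b": 5, "c": 3, "d": 2}
lengths = huffman_lengths(freqs)
optimized_bits = sum(freqs[s] * lengths[s] for s in freqs)
generic_bits = sum(freqs.values()) * 2  # a generic fixed 2-bit code for 4 symbols
print(optimized_bits, generic_bits)  # 115 200
```

The more skewed the statistics of a given image are, the more a per-image table saves over a one-size-fits-all one, which is exactly the gap jpegtran's re-optimization exploits.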
09-15-2011, 02:49 AM
Post: #5
RE: JPEGs optimization
I believe that jpegmini is not using any joint optimization of quantization tables + run-length encoding + Huffman encoding, because by using optimized quantization tables alone I get images that are significantly smaller and show no visible artifacts. At best, I believe jpegmini is only applying some smoothing.

To reach this conclusion, I used the following quantization table optimizer: http://pages.cs.wisc.edu/~ratnakar/rdopt.tar.gz

which is described in this paper: Extending RD-OPT with Global Thresholding for JPEG Optimization
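
For contrast, most encoders do not search for a per-image table at all: libjpeg (and therefore most tools built on it) simply scales JPEG's Annex K base luminance table by a quality factor. A sketch of that IJG-style scaling, to show what a rate-distortion optimizer like RD-OPT is improving on:

```python
# JPEG Annex K base luminance quantization table, row-major 8x8.
BASE_LUMA = [
    16, 11, 10, 16, 24, 40, 51, 61,
    12, 12, 14, 19, 26, 58, 60, 55,
    14, 13, 16, 24, 40, 57, 69, 56,
    14, 17, 22, 29, 51, 87, 80, 62,
    18, 22, 37, 56, 68, 109, 103, 77,
    24, 35, 55, 64, 81, 104, 113, 92,
    49, 64, 78, 87, 103, 121, 120, 101,
    72, 92, 95, 98, 112, 100, 103, 99,
]

def scale_qtable(base, quality):
    """Scale a base table by an IJG-style quality factor (1..100)."""
    quality = max(1, min(100, quality))
    scale = 5000 // quality if quality < 50 else 200 - 2 * quality
    # Larger entries mean coarser quantization and smaller files.
    return [max(1, min(255, (q * scale + 50) // 100)) for q in base]

print(scale_qtable(BASE_LUMA, 50)[:8])  # quality 50 reproduces the base table
```

The point is that this scaling knows nothing about the image being encoded; RD-OPT instead searches for table entries that minimize distortion for the actual DCT statistics at a target rate, which is why a tuned table alone can beat the default noticeably.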

Curiously, the authors of the two papers I mentioned before have published another one in which they get a 40-45% file size reduction by jointly optimizing quantization tables + run-length encoding + arithmetic encoding:

http://ieeexplore.ieee.org/ielx5/5075165/5090078/05090215.pdf?arnumber=5090215

Given that the respective patents are 8 and 6 years old, I am starting to wonder whether this research, funded by public money, will ever be turned into a JPEG compressor for sale on the market.
09-15-2011, 02:56 AM (This post was last modified: 09-15-2011 02:57 AM by Aaron Kulick.)
Post: #6
RE: JPEGs optimization
Of the open source tools, I am still a big fan of jpegoptim and jpegtran.

Aaron
09-17-2011, 01:45 AM
Post: #7
RE: JPEGs optimization
Sorry if I am hijacking this thread, but this is a very closely related question: what is the best image resizing and compression library exposed via PHP? Currently we're using Imagick; are there any better ones out there?

P.S. Yes, I know you can call any on-system executable from PHP, but I have that ability disabled for security's sake, so it has to be exposed through PHP.
12-03-2013, 11:52 PM (This post was last modified: 12-03-2013 11:55 PM by lena.)
Post: #8
RE: JPEGs optimization
We all know too well that JPEG images are a main cause of page bloat, due both to the increasing use of images and to the inefficiency of the JPEG compression algorithm.

On the other hand, PackJPG is lossless recompression software, now open source under the terms of the LGPL v3, and it typically reduces the size of a JPEG file by 23-24% [cf. packJPG Usage Information].
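
The reason PackJPG has to decode and re-model the DCT coefficients, rather than just squeezing the file as-is, is that JPEG's entropy-coded payload already looks nearly random to a general-purpose compressor. A quick Python illustration (the byte streams are stand-ins, not real JPEG data):

```python
import random
import zlib

random.seed(42)
# Stand-in for a JPEG's Huffman-coded payload: near-uniform bytes.
entropy_coded = bytes(random.randrange(256) for _ in range(10_000))
# Stand-in for redundant data that DEFLATE handles well.
text_like = b"the quick brown fox jumps over the lazy dog " * 250

print(len(zlib.compress(entropy_coded, 9)) / len(entropy_coded))  # ~1.0: no gain
print(len(zlib.compress(text_like, 9)) / len(text_like))          # far below 1.0
```

This is also why gzip Content-Encoding is applied to text but not to images: to shrink a JPEG losslessly you have to undo its entropy coding and re-encode with a stronger model, which is exactly what PackJPG does.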

Some questions follow:
  • Does anyone know whether any browsers already implement this type of compression for JPEG images?
  • Are there ways to avoid the IE Vary-header problem?
  • Is there an HTTP 2.0 way to allow cacheable content negotiation and include this [PackJPG] lossless compression algorithm?
IMO, JPEGs are the elephant in the room: paradoxically they remain unoptimized while time is wasted on higher-order optimizations that yield lower returns.
12-04-2013, 12:34 AM
Post: #9
RE: JPEGs optimization
Lena... I am not sure I agree. JPEG is actually, I think, the most optimized lossy compression format. The only way to reduce it further is through binary compression; that may shrink the file size, but it will increase the CPU needed to decompress it, which will only increase browser rendering time.

Paletted formats like GIF and 8-bit PNG can be more efficient, but only because they encode the image data with a limited number of colors in the palette. And let's not forget vector-based formats; but as you are probably aware, neither of these serves well for images like photographs or graphics containing gradations. I would say it's more a matter of choosing the best tool for the job than anything else.
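
To put rough numbers on the palette point, here is the raw (pre-compression) size arithmetic for a hypothetical 800x600 image; GIF/PNG then compress these buffers further:

```python
w, h = 800, 600
truecolor_bytes = w * h * 3          # 24-bit RGB: 3 bytes per pixel
paletted_bytes = w * h + 256 * 3     # 1 index byte per pixel + 256-entry RGB palette
print(truecolor_bytes, paletted_bytes)  # 1440000 480768
```

The factor-of-three head start is real, but it only exists when the image genuinely fits in 256 colors, which photographs almost never do.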
12-04-2013, 12:59 AM
Post: #10
RE: JPEGs optimization
The problem with PackJPG (as best I can tell) is that the output is a compressed file that is no longer a JPEG and has to be decompressed with PackJPG. At that point you're either trying to get a new image format adopted or a new content encoding (akin to gzip for text).

In that case you're better off using WebP, which already has decent browser support and shows more improvement than losslessly compressing JPEGs.
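
On the content-negotiation side, serving WebP only to browsers that support it is usually done off the request's Accept header, with Vary: Accept so caches keep the variants apart (the IE issue mentioned above). A minimal, framework-free Python sketch (file names hypothetical):

```python
def pick_image(accept_header, base="photo"):
    """Choose an image variant from the request's Accept header."""
    # Vary: Accept tells shared caches to key responses on the header we used,
    # so a WebP body is never served to a client that only asked for JPEG.
    if "image/webp" in accept_header:
        return base + ".webp", {"Content-Type": "image/webp", "Vary": "Accept"}
    return base + ".jpg", {"Content-Type": "image/jpeg", "Vary": "Accept"}

print(pick_image("image/webp,image/*,*/*;q=0.8")[0])  # photo.webp
print(pick_image("image/jpeg,*/*")[0])                # photo.jpg
```

Browsers that support WebP advertise image/webp in their Accept header, so no user-agent sniffing is needed; the same pattern would apply to any new encoding that clients can announce.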