JPEGs optimization
12-04-2013, 02:42 AM
Post: #11
RE: JPEGs optimization
@jarrod1937

I admire your sense of humor, calling JPEG "the most optimized lossy compression format", since it can be further compressed losslessly by another 23–24%.

@pmeenan

I was thinking along the lines of a new content-encoding. That's why I also mentioned the IE Vary problem.

Yeah, you have a good point concerning decent browser support for WebP.

Concerning WebP, is there a way to get content negotiation [JPEG <-> WebP] and cacheability at the same time without resorting to the self-defeating Vary: User-Agent header? Alternatively, is there a way to do it with HTTP 2.0?
12-04-2013, 03:04 AM
Post: #12
RE: JPEGs optimization
The current effort for negotiation is around Vary: Accept. Opera has always announced WebP support in its Accept headers, and Chrome added it recently, so both of the browsers that support WebP now announce it.
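
For illustration, a minimal server-side sketch of that negotiation (Flask here purely as an example stack; the paths and names are placeholders, not anything from this thread):

    import os
    from flask import Flask, request, send_file

    app = Flask(__name__)

    @app.route("/images/<name>.jpg")
    def image(name):
        # Serve the WebP variant when the browser advertises support
        # in its Accept header, and mark the response as varying on
        # Accept so caches keep the two variants apart.
        webp = os.path.join("images", name + ".webp")
        if "image/webp" in request.headers.get("Accept", "") and os.path.exists(webp):
            resp = send_file(webp, mimetype="image/webp")
        else:
            resp = send_file(os.path.join("images", name + ".jpg"),
                             mimetype="image/jpeg")
        resp.headers["Vary"] = "Accept"
        return resp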

CDN/proxy support for Vary: Accept is pretty weak right now, though it is improving (Akamai is one of the first that I know of to support it).
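
To see why proxy support matters, here is a toy sketch of how a Vary-aware cache builds its keys: each stored response is keyed on the URL plus the request's values for the headers named in Vary.

    # Toy illustration only; real caches normalize headers more carefully.
    def cache_key(url, request_headers, vary):
        varied = tuple(request_headers.get(h.strip().lower(), "")
                       for h in vary.split(","))
        return (url, varied)

    # Two requests for the same URL get separate cache entries:
    key_webp = cache_key("/images/photo.jpg", {"accept": "image/webp,*/*"}, "Accept")
    key_jpeg = cache_key("/images/photo.jpg", {"accept": "*/*"}, "Accept")
    assert key_webp != key_jpeg

This is also why Vary: User-Agent is so self-defeating: User-Agent strings have so many distinct values that the cache fragments into near-per-browser copies.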

HTTP 2.0 doesn't really add any mechanisms that help, other than being encrypted by default, so you can feel safe that you're bypassing any intermediaries.
12-04-2013, 04:29 AM (This post was last modified: 12-04-2013 04:31 AM by jarrod1937.)
Post: #13
RE: JPEGs optimization
(12-04-2013 02:42 AM)lena Wrote:  @jarrod1937

I admire your sense of humor, calling JPEG "the most optimized lossy compression format", since it can be further compressed losslessly by another 23–24%.

Indeed, which is why I said the most optimized lossy format. Any format can be compressed further losslessly, but it's a matter of the CPU utilization required to achieve that, on both the compression end and the decompression end. CPU utilization during browser rendering is not something to ignore, and lossless techniques tacked on top only add to it, not to mention the diminishing returns: it takes considerably more processing power to compress an image further and further once the low-hanging fruit of binary-level compression is taken. Technically you can have your server gzip images right now, but from my testing it will actually slow down your overall page display time for the little bit of download size it saves. I know there are better lossless compression techniques than those used by gzip, but I still doubt the tradeoff is worth it, and they would require adoption of a new format, as mentioned; not an easy task ;-)
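
If you want to check the gzip point yourself, a quick measurement along these lines works (the file path is a placeholder; try it on any already-optimized JPEG):

    import gzip

    path = "photo.jpg"  # placeholder: substitute a real JPEG
    raw = open(path, "rb").read()
    zipped = gzip.compress(raw, compresslevel=9)
    print("original: %d bytes" % len(raw))
    print("gzipped:  %d bytes (%.1f%% saved)"
          % (len(zipped), 100.0 * (len(raw) - len(zipped)) / len(raw)))

On typical photographic JPEGs the saving is tiny, which is the tradeoff described above.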
12-04-2013, 08:37 PM (This post was last modified: 12-04-2013 08:39 PM by lena.)
Post: #14
RE: JPEGs optimization
(12-04-2013 03:04 AM)pmeenan Wrote:  The current effort for negotiation is around Vary: Accept. Opera has always announced WebP support in its Accept headers, and Chrome added it recently, so both of the browsers that support WebP now announce it.

Why not use the HTTP response header Link? The URL would be that of a directory containing the images, and the rel attribute [with a chosen and agreed name, for instance jpg->webp] would be used as a hint for browsers to request "image.webp" instead of "image.jpg" for images located in a subdirectory of the aforementioned URL.

It seems to be past-proof [no side effects with current browsers, and I would guess that proxies already support this response header (i.e. Link)] and future-proof [it's up to the browsers to opt in].
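
A rough sketch of what I mean, as a plain WSGI app [the rel name "jpg->webp" is just my invented placeholder, nothing standardized]:

    def application(environ, start_response):
        # Hypothetical hint: browsers that understand the made-up
        # "jpg->webp" relation may request image.webp instead of
        # image.jpg for images under /images/; everything else
        # ignores the header, so caching is unaffected.
        headers = [("Content-Type", "text/html; charset=utf-8"),
                   ("Link", '</images/>; rel="jpg->webp"')]
        start_response("200 OK", headers)
        return [b"<html>...</html>"]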
12-05-2013, 03:15 AM
Post: #15
RE: JPEGs optimization
Not sure I understand where you'd pass the header or how you'd use it. Using it on the base page ties all sorts of conventions in with its use that are unlikely to work for a lot of sites, and applying it to individual resources implies that the image was already requested, so it would cause a double fetch.

We considered a srcset-style solution that required markup to define alternate image formats, but adoption would take forever and it's not all that clean.

Delivering different file formats for the same URI based on advertised capabilities opens up support for automatic image transcoding and delivery services without having to change the pages themselves at all.
12-05-2013, 07:17 AM
Post: #16
RE: JPEGs optimization
New compression algorithms will not help unless they are implemented in all browsers. Even if a new algorithm were added to the major browsers today, it would be many years before it was practical to use, because older browsers would not be able to render the image.

Yahoo Smush.it is hard to beat: http://www.smushit.com/ysmush.it/
12-06-2013, 02:04 AM
Post: #17
RE: JPEGs optimization
@pmeenan

I was thinking at the HTML document level, be it in the HTTP headers or in the head section of the HTML document. It is probably equivalent to the srcset proposal. The advantage is that it has no side effects with current browsers [i.e. the IE Vary problem] nor with current proxies, because the URL is different [cacheability without problems] and because proxies do not have to learn new tricks, such as Vary: Accept.

On the other hand, I concede that "Vary: Accept" is slightly simpler from the author's point of view. I might be wrong, but I would guess that IE <= 9 would have problems with it, i.e. the images would not be cached by those browsers.