Untraditional speed improvement achieved
05-31-2011, 06:17 PM
Post: #1
Untraditional speed improvement achieved
Hello,

Curiously, I recently ran a page speed test on the index page of http://www.background-checks-systems.com and noticed it generated 34 HTTP requests, due to lookups for many images that are not on that page but on other pages of the site. A bit perplexed, I dug in to see why.

It appears that when I merged all my CSS files into one global file, many of the rules referenced background images for divs that the index page never renders -- they belong to other pages of the site -- yet those images still produced unnecessary HTTP requests when the robot crawled the linked CSS file.

I ran an additional test by placing the CSS directly into the main page in a <style type="text/css"> block, after removing all unused styles from the global CSS file. Amazingly, my first-view HTTP requests dropped from 34 to 14, the total file size decreased by more than half, and the load time fell from 4.32s to 1.061s.

Perhaps this method will work for your high-traffic pages (or at least your index page): you keep linking your external CSS for general formatting while adding an in-page <style> block in the head for that page's background images, as in the sketch below.
Logistically it may seem like a bit of work, but keep in mind it only applies to pages that have their own distinct background images. If the same background is used throughout the site, by all means leave it in your main CSS file.
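
A rough sketch of the idea (global.css is just a placeholder name here, and the #logo rule is only an example lifted from my stylesheet):

<head>
  <!-- shared formatting stays in the external stylesheet -->
  <link rel="stylesheet" type="text/css" href="global.css">
  <!-- only this page's background-image rules are inlined in the head -->
  <style type="text/css">
    #logo { background: #000 url(images-home/index_07.png) repeat; height: 100px; }
  </style>
</head>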

Has anyone else experienced this, or was I just unaware of it?

Joe
06-01-2011, 12:30 AM
Post: #2
RE: Untraditional speed improvement achieved
Is it possible that something about how you specified the background images triggered them to be downloaded? I've never seen CSS-referenced resources get downloaded when they weren't actually used on the page.

If your CSS is small enough, there is no question that inlining it will be faster (at least for first-time visitors). It does mean that you can't serve it from a CDN and that you will also be serving the CSS to bots that may not need it.

-Pat
06-01-2011, 08:08 AM
Post: #3
RE: Untraditional speed improvement achieved
Hi Pat,

Thanks for your response. I'm also puzzled as to why, since I also ran it through the websiteoptimization.com speed test and Pingdom DNS and got the same results.

Here are a few of the background-image rules from my global CSS file:

#logo { color: #fff; background: url(images-home/index_07.png) #000; background-repeat: repeat; height: 100px; margin: 0; }

.blogstyle .colmid { float: left; width: 200%; margin-left: -200px; position: relative; right: 100%; background: url(images/BCSbackground3.jpg) #000; background-repeat: repeat; }

div.c7 { margin-top: 5px; background: #fff url(images/piecesfit.jpg) no-repeat 0 0; border: none; padding: 0 5px 0 188px; }
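
As a sanity check on the real-browser behavior, here is a small hypothetical test page (not my actual markup; global.css stands in for the merged stylesheet) using the rules above. In a real browser, only images/piecesfit.jpg should be requested, because only div.c7 matches an element on the page, while index_07.png and BCSbackground3.jpg should not be fetched at all since nothing matches #logo or .blogstyle .colmid.

<!DOCTYPE html>
<html>
<head>
  <link rel="stylesheet" type="text/css" href="global.css">
</head>
<body>
  <!-- only div.c7 exists here, so only its background image gets downloaded -->
  <div class="c7">Test content</div>
</body>
</html>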

Later on, I'll strip the index page's inline CSS and point it back at the global CSS link, so that I can see a waterfall view of which images the bots are crawling.

I'll post results.

Joe
06-02-2011, 01:14 AM
Post: #4
RE: Untraditional speed improvement achieved
FWIW, WebPagetest isn't really a "bot" and it isn't crawling. It just loads the page in an actual browser and observes what the browser does. I'm assuming the pages that were downloading unexpected resources didn't have the elements the CSS was referring to anywhere on the page, which is what has me confused.
06-02-2011, 03:18 PM
Post: #5
RE: Untraditional speed improvement achieved
I have not been able to duplicate this anomaly.

To be honest with you, those high HTTP request counts and file sizes were obtained using the websiteoptimization.com speed test. Since then, I've placed all styles back into the main CSS file and also changed the index page's doctype from XHTML Strict to HTML5. I got good numbers back from webpagetest.org and the other site mentioned above.

I'm starting to wonder whether the anomaly was caused by mod_pagespeed caching, since I had been editing the site just before.

Oddly enough, when I go to /var/www/mod_pagespeed/cache/http,3A I see all of the domains that I host plus a couple of unknown domains that I do not host on my server (e.g. 2Fdn-auctions.com, 2Fanswers.yahoo.com & 2Ftomomanddad.info); mind you, they are not in my vhosts.

Your thoughts?

Joe
06-03-2011, 03:10 AM
Post: #6
RE: Untraditional speed improvement achieved
Probably worth sending a question over to the mod_pagespeed discussion group. As far as the extra resources go, if the test was done with a site that emulates a browser instead of using a full browser then I wouldn't be surprised if it downloaded everything it saw in the CSS blindly. I'm a huge proponent of only testing with real browsers, because that's the only way you will see the actual site behavior (including JavaScript execution, DOM manipulation, etc.).
06-03-2011, 09:55 AM
Post: #7
RE: Untraditional speed improvement achieved
(06-03-2011 03:10 AM)pmeenan Wrote:  As far as the extra resources go, if the test was done with a site that emulates a browser instead of using a full browser then I wouldn't be surprised if it downloaded everything it saw in the CSS blindly

You are 100% correct about real full-browser behavior.
I just ran http://www.mod-page-speed.com through websiteoptimization.com (WSO) and got:

Total HTTP Requests: 16
Total Size: 184272 bytes

webpagetest.org correctly listed 9 HTTP requests.

Further down the page, I see that WSO listed all of the CSS image objects from the main stylesheet, including ones that are not on the index page.
Do Google Page Speed & Yahoo's YSlow use this same type of browser emulation when calculating page speed? If so, they would be counting objects that don't exist on the page, which leads me back to my original thought of inlining the background-image CSS on the page.

Joe
06-03-2011, 10:01 AM (This post was last modified: 06-03-2011 10:02 AM by pmeenan.)
Post: #8
RE: Untraditional speed improvement achieved
No, YSlow and Page Speed are browser plugins that run within a real browser (well, technically Page Speed has an SDK so you can hook it up to anything, but all of the implementations I'm aware of use a real browser).
06-03-2011, 10:11 AM
Post: #9
RE: Untraditional speed improvement achieved
I was actually referring to Googlebot & Yslurp measuring your site's load time as a ranking factor.

Joe
06-03-2011, 10:49 AM
Post: #10
RE: Untraditional speed improvement achieved
Not sure what Yslurp does. From what I've seen published on the web, the Google speed information comes from real users with real browsers visiting your site (users who have the Google Toolbar installed and have enabled PageRank information sharing).