Green-Watch.org Site Optimization
04-28-2010, 03:26 AM
Post: #11
RE: Green-Watch.org Site Optimization
(04-28-2010 02:02 AM)green-watch.org Wrote:  I did create a menu icon sprite for all the 16x16 icons in the javascript menu. My skill set is quite large but CSS is probably my least favorite thing to do. I did look at that SpriteMe application. It did not recommend making a sprite out of background images due to some being JPEG images and others repeated themselves. JPEGS can easily be converted but can a sprite handle repeat-x and repeat-y images?

It really depends on what's in the images. If we are talking about graphics as background images, then they should probably all be sprited into one PNG8 (and thus converted from JPEGs if that's what they are now). If we are dealing with photographs, then you need the larger color palette of a JPEG, and you probably don't want to sprite them (unless you have lots of small JPEGs that are all on the same page). You can repeat sprites easily, but it will repeat the entire sprite, not a specific section of it. So if you have repeatable gradients or something, your best bet is to put all of the gradients in a repeat sprite (it could be 1 pixel wide if they are vertical gradients).

(04-28-2010 02:02 AM)green-watch.org Wrote:  When people say javascript files should be merged together into one file, do they mean all synchronous javascript should?

People usually mean that you want to reduce the overall number of HTTP requests, but yes it typically applies to synchronous JS. JS files that are loaded asynchronously on demand can be split into feature specific files without any problems.

The issue with your example is that you might have some JS inline on the page that requires external JS to work. If this is the case you would run into a race condition where the external file is racing to download and the page is racing to execute the code on it. There are ways to tie external JS loading to a script block on the page, so you just have to be careful with how you manage your dependencies.

So yes, you could load all of your JS asynchronously and it would reduce blocking and make your page load faster, but you have to make sure that everything still works on the page when you do this.
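One common way around that race condition (a minimal sketch with made-up names, not code from the site being discussed) is to have inline script blocks queue their work, and let the external file flush the queue once it has fully loaded:

```javascript
// Defined inline on the page, before the external file arrives.
// All names (pendingCalls, runWhenLibReady, flushPending) are illustrative.
var pendingCalls = [];
function runWhenLibReady(fn) {
  // Queue work that depends on the external JS instead of calling it directly.
  pendingCalls.push(fn);
}

// An inline script block elsewhere on the page queues its dependent code:
var result = null;
runWhenLibReady(function () {
  result = "widget initialized"; // would normally call into the external library
});

// Placed at the very end of the external JS file, so it runs only after
// everything above it in that file has been evaluated:
function flushPending() {
  for (var i = 0; i < pendingCalls.length; i++) {
    pendingCalls[i]();
  }
  pendingCalls = [];
}
flushPending();
```

This is the same basic idea as tying external JS loading to a script block on the page: the inline code never assumes the library exists, it only registers what should happen once it does.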
04-28-2010, 03:52 AM
Post: #12
RE: Green-Watch.org Site Optimization
Hey there,

(04-28-2010 03:26 AM)jklein Wrote:  It really depends on what's in the images. If we are talking about graphics as background images, then they should probably all be sprited into one PNG8 (and thus converted from JPEGs if that's what they are now). If we are dealing with photographs, then you need the larger color palette of a JPEG, and you probably don't want to sprite them (unless you have lots of small JPEGs that are all on the same page). You can repeat sprites easily, but it will repeat the entire sprite, not a specific section of it. So if you have repeatable gradients or something, your best bet is to put all of the gradients in a repeat sprite (it could be 1 pixel wide if they are vertical gradients).

Here are the background images currently in use:

http://images1.green-watch.org/img/jpg/bodyBG.jpg
http://images2.green-watch.org/img/jpg/register1.jpg
http://images3.green-watch.org/img/jpg/register.jpg
http://images2.green-watch.org/img/jpg/bg_body3.jpg
http://www.green-watch.org/img/png/buttonBG.png
http://www.activegreen.net/img/png/bg_body11.png
http://www.green-watch.org/img/png/bg_input_1.png
http://www.green-watch.org/img/gif/bt_logout.gif
http://images1.green-watch.org/img/gif/b...t-over.gif
http://images2.green-watch.org/img/leaf.gif
http://images1.green-watch.org/img/menu/menuIcons.png

Main Style Sheet:
http://www.green-watch.org/inc/style.css

Style sheet called dynamically from the zapatec library for menu:
http://www.green-watch.org/zapatec/zpmen...arblue.css

Let me know what you think. If it is possible to create one or two sprites off these to reduce the number of requests, I do not mind paying for this job as long as the price is reasonable.

It would be nice to be able to merge style.css with the barblue.css file, but I do not know how difficult it would be to adjust the zapatec menu javascript so it would not call barblue.css. Maybe it's best not to worry about that file since it is so small?

(04-28-2010 03:26 AM)jklein Wrote:  The issue with your example is that you might have some JS inline on the page that requires external JS to work. If this is the case you would run into a race condition where the external file is racing to download and the page is racing to execute the code on it. There are ways to tie external JS loading to a script block on the page, so you just have to be careful with how you manage your dependencies.

I did a quick asynchronous test on the combined JavaScript file and I can see a drastic difference already. Some of the JavaScript does not work because of the race condition you mentioned, but I have no problem putting a little effort into fixing that. The good news is my coldfusion pages are built on templates so changing the templates accordingly fixes most of the pages at once.

I will share my results once everything is up and running. Webpagetest.org has really reduced the time it takes for my pages to load with just a few simple changes. I think with a little more documentation and more people in the community (which should be happening soon, with Google making speed a factor for SERPs), this will be an extremely nice website. I love it so far!

Sincerely,
Travis Walters
04-28-2010, 05:15 AM
Post: #13
RE: Green-Watch.org Site Optimization
(04-28-2010 02:02 AM)green-watch.org Wrote:  I did create a menu icon sprite for all the 16x16 icons in the javascript menu. My skill set is quite large but CSS is probably my least favorite thing to do. I did look at that SpriteMe application. It did not recommend making a sprite out of background images due to some being JPEG images and others repeated themselves. JPEGS can easily be converted but can a sprite handle repeat-x and repeat-y images?

Yes, but you will need to have different sprite images for the repeat-x and repeat-y sprites.

(04-28-2010 02:02 AM)green-watch.org Wrote:  When people say javascript files should be merged together into one file, do they mean all synchronous javascript should?

Let me shoot an idea and tell me what you think.

- Currently all JavaScript is contained in one huge file. It is minified, gzip compressed, and takes about one second to load on your FIOS test. It is contained in the head section, so it does block the page.

- This JavaScript file contains form functions, prototype.js, scriptaculous.js library, prototip library, swfobject, and zapatec.js menu.

- I could setup five functions: importFormJS(), importPrototypeJS(), importScriptaculousJS(), importPrototipJS(), and importZapatecMenuJS()

- In each function I could have an XMLHttpRequest that would request gzipped versions of the libraries. The XMLHttpRequest would be asynchronous, so it should not block other resources from being downloaded, correct?

Would this method work to stop (or reduce) blocking? Would creating these requests be a bad thing to do? Maybe I am thinking of the wrong way to do this.

This is where "advanced optimization" collides with the easy recommendations for typical sites :-)

The blog entries I linked to had samples for pulling down and executing the javascript dynamically (the key is in the execution). The loading of the javascript itself is easy; where it requires specific application knowledge is around what the javascript code does.

You cannot execute any javascript that expects the prototype library to have loaded before you actually load the library, for example, so inline script blocks can be troublesome when you try to delay loading of the libraries. You have to look at the code that uses the libraries, not just the libraries themselves, and figure out how best to execute that code only after the library has downloaded. The blog articles provide sample code that shows how that could be implemented, and it is supported somewhat out of the box by some of the UI libraries (YUI, for example).

When you start getting into async loading of javascript you need to have a developer look at it (unless you have the necessary development skills). It is well beyond the typical copy/paste of widget samples.
04-28-2010, 05:37 AM
Post: #14
RE: Green-Watch.org Site Optimization
I would make two sprites:

Sprite 1:
register.jpg
register1.jpg
bg_input_1.png
bt_logout.gif
bt_logout-over.gif
leaf.gif
menuIcons.png

Sprite 2:
buttonBG.png
bodyBG.jpg
...any other vertical gradients you have

Then if you really care about performance I would probably split bg_body3.jpg into two gradients and put them both in sprite 2. Sprite 2 could then be 1 pixel wide and would be extremely small file size wise. Both sprites should be PNG8's, and when they are done you should run them through smushit.

Combining the CSS files is probably not worth it like you say, since it would be a lot of work for little gain.

If you purchase a new domain (http://www.gwatchimg.com for example), host all of your static content there (images, JS, CSS), and never set cookies on it you will probably also see a performance improvement. Making all of these changes should have a significant impact on your page load time.

I would encourage you to do this yourself. It will help you do this kind of thing in the future without having to rely on others or pay anyone, and you will have a deeper understanding of how this kind of change affects your site performance (and you will be able to evaluate the ROI).

I'm heading out of town for a few days and won't be monitoring this thread, but I wish you the best of luck. I'm sure that Pat can answer any questions you have (I'm fairly new to this forum after all :-)).
04-29-2010, 12:30 AM
Post: #15
RE: Green-Watch.org Site Optimization
Hey there,

The necessary JavaScript changes are starting to come along. It looks like the inline JavaScript is causing a lot of race conditions. At least they are fixable, and in the long run this will all pay off.

(04-28-2010 05:15 AM)pmeenan Wrote:  Yes, but you will need to have different sprite images for the repeat-x and repeat-y sprites.

I think I will outsource this small project on rentacoder. I love hardcore programming, and I think this would be better left to a designer (:

(04-28-2010 05:15 AM)pmeenan Wrote:  This is where "advanced optimization" collides with the easy recommendations for typical sites

Let me make some statements, and others can write back if they want and say whether I am right or wrong in my assumptions.

#1 - Several JavaScript files loading asynchronously will load faster in most cases (where the files are large) as opposed to one huge JavaScript file loading asynchronously.

#2 - The exception to statement #1 would be when the initial connection or DNS lookup takes a large amount of time for one or more of the JavaScript files.

#3 - All asynchronous JavaScript files will be loaded by the time the "document complete" message gets sent. I am wondering this because I could use a "body onload" script that assumes libraries are loaded.

#4 - When loading JavaScript asynchronously, can XMLHttpRequest load scripts from different domains or does it have to be the local domain? What about subdomains? I am wondering this because I can get rid of cookie data in the request if either a different domain or subdomain can be used.

#5 - Do CSS style sheets block other resources from being downloaded like synchronous JavaScript does?

#6 - Can CSS style sheets be loaded asynchronously or would that cause the rendering to look very strange? I do realize I can GZip my style sheet but I am wondering about other possible tweaks with the CSS sheet.

Thanks once again for any information.

Sincerely,
Travis Walters
04-29-2010, 01:08 AM
Post: #16
RE: Green-Watch.org Site Optimization
(04-29-2010 12:30 AM)green-watch.org Wrote:  #1 - Several JavaScript files loading asynchronously will load faster in most cases (where the files are large) as opposed to one huge JavaScript file loading asynchronously.

#2 - The exception to statement #1 would be when the initial connection or DNS lookup takes a large amount of time for one or more of the JavaScript files.

Probably not. What it will give you is the ability to enable pieces of functionality separately, but the overall time will be longer. If they are all being served from the same domain as the rest of the resources, the additional requests will also consume the limited connections to the server.

(04-29-2010 12:30 AM)green-watch.org Wrote:  #3 - All asynchronous JavaScript files will be loaded by the time the "document complete" message gets sent. I am wondering this because I could use a "body onload" script that assumes libraries are loaded.

Possibly, but it depends on how you inject the javascript. If you use XMLHttpRequest then it will not guarantee delivery before doc complete (and you'll want to use a callback handler to execute code when it finishes).

You should probably be looking at the methods that manipulate the DOM and just insert a reference to the code, though. Then you can include whatever you want executed at the end of the code that gets loaded asynchronously, and it is guaranteed to execute when the file is loaded and evaluated.

(04-29-2010 12:30 AM)green-watch.org Wrote:  #4 - When loading JavaScript asynchronously, can XMLHttpRequest load scripts from different domains or does it have to be the local domain? What about subdomains? I am wondering this because I can get rid of cookie data in the request if either a different domain or subdomain can be used.

I wouldn't recommend using XMLHttpRequest to do your javascript loading. The Google analytics snippet code has a good (well tested) example here: http://code.google.com/apis/analytics/do...cking.html

Just modify it to load your code instead of the analytics code (and remove the analytics-specific variables).
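The core of that pattern is just creating a script element and insertingting it into the DOM so the browser fetches it without blocking the parser. A rough sketch follows; the function name is mine, the document is passed as a parameter only to make the flow easy to follow, and this is not the exact Analytics snippet:

```javascript
// Hedged sketch of dynamic script insertion, the same basic technique the
// Google Analytics async snippet uses. All names here are illustrative.
function loadScriptAsync(src, doc) {
  var node = doc.createElement('script');
  node.type = 'text/javascript';
  node.async = true; // hint that the script may execute out of order
  node.src = src;

  // Insert before the first existing script tag; any page running this
  // code necessarily has at least one, so it is a safe insertion point.
  var first = doc.getElementsByTagName('script')[0];
  first.parentNode.insertBefore(node, first);
  return node;
}

// In a real page you would call something like:
// loadScriptAsync('http://www.green-watch.org/javascript/protoaculous.cfm', document);
```

Because the browser fetches and evaluates the file itself, same-origin restrictions that apply to XMLHttpRequest do not apply here, which is part of why this approach is preferred.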

(04-29-2010 12:30 AM)green-watch.org Wrote:  #5 - Do CSS style sheets block other resources from being downloaded like synchronous JavaScript does?

Not directly, but usually everything in the head will block loading of anything in the body (and you're still constrained by the number of simultaneous connections per domain).

(04-29-2010 12:30 AM)green-watch.org Wrote:  #6 - Can CSS style sheets be loaded asynchronously or would that cause the rendering to look very strange? I do realize I can GZip my style sheet but I am wondering about other possible tweaks with the CSS sheet.

You probably don't want to do that because the user experience would be pretty bad. The CSS is already loaded asynchronously (along with other content in the head), so all you would really be doing is delaying the display of the styled version of the page.

If you want to get REALLY fancy what you could do is inline the CSS directly into the HTML for the initial visit and reference external cached files for repeat visits. In practice you would implement it something like this:

- When the page is visited, if the "css cached" cookie is set just reference the css files normally. If not, put it inline and add the delayed loader code to the page

- The delayed loader would dynamically create a hidden (or 1x1) iFrame a few seconds after the page has loaded that references a special CSS caching page.

- The CSS caching page would be a blank HTML page that externally references your CSS files and sets the "css cached" cookie.

It's a fair bit of work so you'd have to REALLY want those few extra milliseconds though.
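As a rough sketch of that flow (the cookie name, caching page URL, and delay are made-up placeholders, not anything from this thread):

```javascript
// Decide which variant of the page to serve, based on a "css_cached" cookie.
// This mirrors the first step described above; all names are hypothetical.
function cssStrategy(cookieString) {
  var cached = /(^|;\s*)css_cached=1(;|$)/.test(cookieString || "");
  return cached ? "link"     // repeat visit: reference the cached external CSS
                : "inline";  // first visit: inline the CSS, then prime the cache
}

// On a first visit, a delayed loader creates a hidden iframe a few seconds
// after onload; the frame's page is a blank HTML document that references
// the CSS files externally and sets the css_cached cookie for next time.
function scheduleCssCaching(doc, delayMs) {
  setTimeout(function () {
    var frame = doc.createElement('iframe');
    frame.style.display = 'none';
    frame.src = '/css-cache.html'; // hypothetical blank caching page
    doc.body.appendChild(frame);
  }, delayMs || 2000);
}
```

The server-side template would call the equivalent of `cssStrategy` with the request's cookie header to pick between the inline and external variants.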
04-29-2010, 02:22 AM
Post: #17
RE: Green-Watch.org Site Optimization
Hey there,

Thanks for the response.

(04-29-2010 01:08 AM)pmeenan Wrote:  Probably not. What it will give you is the ability to enable pieces of functionality separately but the overall time will be longer. If they are all being served from the same domain as the rest of the resources more requests will also consume the limited connections to the server.

From the test I ran, I had two JavaScript files loading asynchronously. The initial connection and the time to first byte were about the same for both. One script took 0.4 seconds to load and the other took 0.7 seconds. Since both scripts were loading at the same time, the total time was 0.7 seconds. If both JavaScript files were merged, the single download would have taken roughly 1.1 seconds, or 0.4 seconds longer. So it appears beneficial to load multiple asynchronous files, as opposed to asynchronously loading one merged file, when the files are quite large.

With my website, not all of the features need to be loaded on every single webpage. I save 155 KB of uncompressed (41 KB compressed) material just by leaving out the protoaculous JavaScript library.

(04-29-2010 01:08 AM)pmeenan Wrote:  Possibly, but it depends on how you inject the javascript. If you use XMLHttpRequest then it will not guarantee delivery before doc complete (and you'll want to use a callback handler to execute code when it finishes).

I am currently injecting the JavaScript this way:

Code:
function importProtoaculousJS()
{
   // GetXmlHttpObject() is a helper defined elsewhere that returns an
   // XMLHttpRequest (or the ActiveX equivalent in older IE).
   var xmlProtoaculousJS = GetXmlHttpObject();

   if (xmlProtoaculousJS == null) { return; }

   var url = "http://www.green-watch.org/javascript/protoaculous.cfm";
   xmlProtoaculousJS.open("GET", url, true);

   xmlProtoaculousJS.onreadystatechange = function()
   {
      if (xmlProtoaculousJS.readyState == 4 && xmlProtoaculousJS.responseText != '')
      {
         var headID = document.getElementsByTagName("head")[0];
         var scriptNode = document.createElement('script');
         scriptNode.type = 'text/javascript';
         // Injecting the response via innerHTML is not reliable cross-browser.
         scriptNode.innerHTML = xmlProtoaculousJS.responseText;
         headID.appendChild(scriptNode);
      }
   };

   xmlProtoaculousJS.send(null);
}

It looks extremely close to the Analytics code you sent, except the Analytics snippet just sets src on the node instead of using innerHTML.

Does loading the JavaScript via the Analytics method guarantee the JavaScript libraries will be loaded by the time the body onload event triggers?

(04-29-2010 01:08 AM)pmeenan Wrote:  Not directly but usually everything in the head will block loading of anything in the body (and you're still constrained by the number of simultaneous connections per domain).

I did not know that elements in the head block the body content from loading before they finish. So it may be beneficial to start loading the asynchronous JavaScript in the body rather than the head, so images can download at the same time. I will have to play around with this a bit.

(04-29-2010 01:08 AM)pmeenan Wrote:  If you want to get REALLY fancy what you could do is inline the CSS directly into the HTML for the initial visit and reference external cached files for repeat visits.

I agree about the user experience being a top priority. Inlining the CSS would make style changes a bit more difficult. I am not sure a few milliseconds would be worth all that hassle - maybe for some people though (:

Thanks again for all the suggestions and advice. I love learning about this stuff and implementing ways to make my site better.

Sincerely,
Travis Walters
04-29-2010, 08:49 PM
Post: #18
RE: Green-Watch.org Site Optimization
Hey There,

I changed my JavaScript importation method to the one you described.

Code:
function importZapatecMenuJS()
{
   var headID = document.getElementsByTagName("head")[0];
   var zpmenuNode = document.createElement('script');
   zpmenuNode.type = 'text/javascript';

   // Shared initialization, guarded so it only runs once even if both
   // onload and onreadystatechange fire for the same script element.
   var menuInitialized = false;
   function initZapatecMenu()
   {
      if (menuInitialized) { return; }
      menuInitialized = true;

      var myMenu = new Zapatec.Menu
      ({
         theme: "/zapatec/zpmenu/themes/barblue.css",
         source: "menu-items"
      });

      var myMenuBar = document.getElementById("menu");
      myMenuBar.style.display = "inline";
   }

   // Most browsers fire onload; older IE fires onreadystatechange instead.
   zpmenuNode.onload = initZapatecMenu;
   zpmenuNode.onreadystatechange = function()
   {
      if (zpmenuNode.readyState == 'complete' || zpmenuNode.readyState == 'loaded')
      {
         initZapatecMenu();
      }
   };

   zpmenuNode.src = 'http://www.activegreen.net/javascript/zpmenu.cfm';
   headID.appendChild(zpmenuNode);
}

From the tests I have run, neither this method nor the XMLHttpRequest method guarantees that the JavaScript library will be loaded when the body onload event triggers. This is not an issue since I can use onload events, but it is something to be aware of for future reference. Also, when using the XMLHttpRequest method, I tried inserting the JavaScript via innerHTML. This only appeared to work in Firefox.

I am happy to say that under Google Webmaster Tools, there has already been a 3.5 second decrease in average loading time. My crawl stats still show a high time to download a page. I think that is because of my robots.txt file blocking directories. I am going to adjust that now.

There is still a lot of inline JavaScript causing issues. I am working to resolve those issues now as well.

More updates to come.

Sincerely,
Travis Walters
04-29-2010, 09:08 PM (This post was last modified: 04-29-2010 09:12 PM by pmeenan.)
Post: #19
RE: Green-Watch.org Site Optimization
Great.

btw, another way to avoid the race condition would be to put a function call at the bottom of each of the imported javascript files (potentially variable-based). For example, at the bottom of the prototype file put a call in for "prototypeloaded()" if you need to execute some code as soon as it loads. Functionally equivalent to the onLoad method you're using.
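Sketched in code, the idea looks something like this (`prototypeloaded` is the hook name from the post above; the flag and the initialization body are illustrative):

```javascript
// Defined on the page before the async load is kicked off:
var libraryReady = false;
function prototypeloaded() {
  libraryReady = true;
  // ...run any code here that needs the prototype library...
}

// Appended as the very last line of the asynchronously loaded prototype
// file, so it fires only once the whole library has been evaluated:
if (typeof prototypeloaded === 'function') {
  prototypeloaded();
}
```

Guarding the call with a `typeof` check lets the same modified library file be reused on pages that do not define the hook.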
04-30-2010, 03:54 AM (This post was last modified: 04-30-2010 03:54 AM by green-watch.org.)
Post: #20
RE: Green-Watch.org Site Optimization
Hey there,

For those using asynchronous JavaScript, I modified a function I found on the internet that lets you add events to the body onload event. It takes whatever onload handler you had initially and chains another event onto it, so both the new event and the existing body events run. It also handles the case where you add an event after onload has already been triggered: in that case the function you pass in is executed immediately.

Code:
var bodyHasLoaded = 0;

function addLoadEvent(func)
{
   if (bodyHasLoaded==1)
   {
      func();
   }
   else
   {
      var oldonload = window.onload;
  
      if (typeof window.onload != 'function')
      {
         window.onload = func;
      }
      else
      {
         window.onload = function()
         {
            if (oldonload)
            {
               oldonload();
            }
        
            func();
         }
      }
   }
}

If anyone uses this, make sure you set bodyHasLoaded=1 when the body onload event triggers.
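To illustrate the flow outside a browser, here is a condensed version of the same pattern with a stub window object (the stub and the order array are mine, added purely for demonstration):

```javascript
// Stand-in for the browser's window; only the onload slot matters here.
var window = { onload: null };
var bodyHasLoaded = 0;

// Condensed version of the addLoadEvent pattern from the post above.
function addLoadEvent(func) {
  if (bodyHasLoaded == 1) {
    func(); // onload already fired, so run immediately
    return;
  }
  var oldonload = window.onload;
  if (typeof window.onload != 'function') {
    window.onload = func;
  } else {
    window.onload = function () {
      oldonload(); // preserve whatever was registered before
      func();
    };
  }
}

var order = [];
addLoadEvent(function () { order.push("menu"); });
addLoadEvent(function () { order.push("forms"); });

// The browser fires body onload:
window.onload();
bodyHasLoaded = 1;

// Registered after onload, so it runs immediately:
addLoadEvent(function () { order.push("late"); });
// order is now ["menu", "forms", "late"]
```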

On another note, I do have a question. How do you enable keep-alives for coldfusion .cfm pages? The test results on this website always show I have keep-alives enabled for everything but .cfm pages.

Sincerely,
Travis Walters