Green-Watch.org Site Optimization
04-25-2010, 08:34 PM
Post: #1
Green-Watch.org Site Optimization
Greetings,

My name is Travis Walters. I am the owner of Green-Watch.org. It is in the process of becoming an environmental website. People will be able to connect with green businesses, purchase green products, and learn how to enrich their life with new ideas.

First off, I would like to say that I ABSOLUTELY LOVE this page speed testing website. I have been a programmer for about 5 years now, and until recently I had not really taken page speed into consideration when building my websites.

The first time I ran the page speed tool against my website, I found there was a long time to first byte on any page (for the first run). I did not understand what this first byte concept meant. The website always loaded fast for me on subsequent runs, so I did not know the problem existed. I ended up running a blank ColdFusion file through the page speed tester, and that first byte time was still there. That is when I realized this must be the time it takes for the Application.cfm file to process its queries. On the first run, my website captures the IP address of a user to find geographic location and give better search results. Instead of returning the one result it needed right away, it was reading the entire database table. That was very inefficient, so I fixed the query, and the time to first byte is now much lower.
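In rough JavaScript terms (the real code is ColdFusion/SQL, so this is purely an illustrative sketch with made-up names), the difference between the two lookup strategies was:

```javascript
// Full scan: what my old code was effectively doing -- walking the
// whole geo-IP table on every first request. O(n) per request.
function lookupByScan(rows, ip) {
  for (var i = 0; i < rows.length; i++) {
    if (rows[i].ip === ip) return rows[i].location;
  }
  return null;
}

// Indexed lookup: build the index once, then each request is O(1) --
// the same effect as a WHERE clause against an indexed column.
function buildIndex(rows) {
  var index = {};
  for (var i = 0; i < rows.length; i++) {
    index[rows[i].ip] = rows[i].location;
  }
  return index;
}

var rows = [
  { ip: "203.0.113.7", location: "Raleigh, NC" },
  { ip: "198.51.100.2", location: "Denver, CO" }
];
var geoIndex = buildIndex(rows);
console.log(geoIndex["203.0.113.7"]); // "Raleigh, NC"
```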

* One suggestion for webpagetest.org: an easy-to-find description of what the first byte is. If the first byte takes longer than 2 or 3 seconds, a hint could pop up suggesting that you check your database queries to bring the first byte time down.

I also combined about 5 JavaScript files that were in the header section of my webpages. I noticed over an entire second came off the load time. I guess this is because browsers wait until JavaScript files are loaded before processing the rest of the page? The merged JavaScript file is not minified or compressed yet; I will see what impact that has on page speed shortly.

I also plan to come up with a large list of questions I have about page speed that may enlighten other programmers who are just learning about this page speed concept.

Sincerely,
Travis Walters
04-26-2010, 12:01 AM
Post: #2
RE: Green-Watch.org Site Optimization
Thanks. I've been debating adding a "back end performance" checklist item to catch egregiously slow first byte times. The main hangup is figuring out what a reasonable threshold would be. I'm probably going to use a multiple of the socket connect time, which should be close to the bare round-trip time, and then flag any request whose first byte takes longer than 2 or 3x that.
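As a rough sketch of that heuristic (the 3x multiplier is just the candidate value mentioned above, not a final threshold):

```javascript
// Treat the socket connect time as roughly one network round trip, and
// flag the first byte time if it exceeds some multiple of that -- the
// extra time is presumably server-side (back end) work.
function slowFirstByte(connectMs, timeToFirstByteMs, multiplier) {
  multiplier = multiplier || 3; // candidate default, subject to tuning
  return timeToFirstByteMs > connectMs * multiplier;
}

console.log(slowFirstByte(50, 400)); // true  -- 400 ms TTFB on a ~50 ms RTT
console.log(slowFirstByte(50, 120)); // false -- within 3x of the RTT
```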

Even though it is more common that the front-end components are causing performance issues, I've seen enough cases (particularly with WordPress installs) where the first byte time is ridiculously long that it should be brought to people's attention.

The part that REALLY needs work is the documentation to tell people how to fix various things. I have a wiki set up that I am starting to populate with content. Once it has enough to be useful I'll be exposing that as well.

Thanks,

-Pat
04-26-2010, 12:25 AM
Post: #3
RE: Green-Watch.org Site Optimization
Hey there,

So I have most of my JavaScript merged into one file:

http://www.green-watch.org/javascript/mySiteScript.cfm

Neat little trick for ColdFusion developers:
Code:
<cfset dExpiryDate = CreateODBCDateTime(DateAdd("d",7,Now()))>
<cfheader name="Expires" value="#GetHttpTimeString(dExpiryDate)#">

<cfif cgi.HTTP_ACCEPT_ENCODING contains "gzip">
  <cfheader name="Content-Encoding" value="gzip">
  <cfcontent type="application/x-javascript" deletefile="no" file="#expandpath('compressed/mySiteScript.js.gz')#">
<cfelse>
  <cfinclude template="uncompressed/mySiteScript.js">
</cfif>

The GZip file is 88 KB, and I would assume most people use gzip-enabled browsers. The uncompressed file is 366 KB. If you look at the source code for the file, is there any way to condense this size even more? I tried using the Dean Edwards Packer on http://jscompress.com/ but it seemed to give me errors on the website.

I would gladly PayPal $10 to the first person who replies with a much more compressed (yet functional) version of this JavaScript file. If you take me up on that offer, please include the JavaScript file, your PayPal email address, and directions on how to turn the uncompressed version into the compressed one. I need to be able to duplicate the method for future code changes.

Here are a few questions for the community:

#1 - YSlow gives me a grade E on "Add Expires headers". When using external images from another domain, is there any way to set an Expires header, or does the image have to be downloaded to my server first?

#2 - Do you recommend using CSS sprites to reduce the # of HTTP Requests? If anyone is interested, I have a project that involves creating one. I have about 7 images that are used as background images. These probably could be combined into a sprite. I would need the sprite created and the css modifications necessary for the background to appear identical to the way it is now. Let me know if you are interested.

#3 - What are ETags and do you recommend using them? I see YSlow docks you for not having them.

#4 - YSlow also docks me for having too many DOM elements. Is this a common problem? Should I be concerned with reducing the number of elements on my webpages?

#5 - Google Page Speed says I can remove a lot of unused CSS on my home page. I thought having all CSS in one location is a good thing? The cacheable file would only have to be downloaded once and it is only a single HTTP request?

#6 - How do you fix the following error?

Quote:Due to a bug in some proxy caching servers, the following publicly cacheable, compressible resources should use "Cache-Control: private" or "Vary: Accept-Encoding"

* http://www.activegreen.net/inc/style.css
* http://www.green-watch.org/javascript/mySiteScript.cfm
* http://www.green-watch.org/javascript/styles.js

Consider adding a "Cache-Control: public" header to the following resources:

* http://content.officemax.com/catalog/ima...28p_01.jpg
* http://images.barnesandnoble.com/images/...269700.JPG
* http://images.barnesandnoble.com/images/...568557.JPG

#7 - Using the webpagetest.org speed test, why is there a gap after the page completes? It looks like there is a 2-second interval where nothing is downloaded at all here:

http://www.webpagetest.org/result/100425...1/details/

#8 - In Google Webmaster Tools, are the page speed estimates based on document complete or fully loaded?

Thanks in advance for any information.

Sincerely,
Travis Walters
twalters84@hotmail.com
admin@green-watch.org
04-27-2010, 02:43 AM
Post: #4
RE: Green-Watch.org Site Optimization
It looks like your JS file is already minified, so you probably aren't going to be able to compress it further. Your best bet on that front may be moving to some sort of JavaScript library where people have already optimized code to a high degree. YUI is a great one, and it allows you to load JS on demand which can dramatically boost load time.

As for your questions:

#1 - You would need to download the images to your server in order to configure the expires header.

#2 - Yes, you should most likely sprite your background images. I'm happy to help with that.

#3 - There is a great description of ETags here. In most cases you do not want them.

#4 - Reducing the number of DOM elements will make your JS execute faster and make the browser paint the page faster. You are barely over 1000 so you aren't in terrible shape but reducing the number would definitely help.

#5 - It's a balancing act. You have a very small amount of CSS, so I think you will be better served by keeping it in one file - you would definitely see an improvement by minifying and gzipping that file, however.

#6 - You would need to configure your webserver to send out different HTTP headers with that content. You cannot configure headers for the external images (see #1).

#7 - Gaps like that are usually due to JavaScript executing. You have quite a lot of JavaScript, so that could easily be the problem.

#8 - They are looking for the Document Complete event - all of their data comes from people using the Google Toolbar who have opted in to the advanced features.
04-27-2010, 12:05 PM
Post: #5
RE: Green-Watch.org Site Optimization
Hey Jklein,

(04-27-2010 02:43 AM)jklein Wrote:  Your best bet on that front may be moving to some sort of JavaScript library where people have already optimized code to a high degree.

The JavaScript file on my website currently includes:

- JavaScript that I wrote myself to handle common things on the website
- SwfObject Code
- Zapatec JavaScript Menu
- Scriptaculous JavaScript Library + Prototype.js
- Tooltip Library Based on Prototype.js

The Zapatec JavaScript Library is about 180 KB uncompressed and I think the Prototype+Scriptaculous Library is around the same size.

Depending on the price, I would not mind replacing the Zapatec menu with a CSS based menu if the functionality can be the same.

Scriptaculous+Prototype JavaScript Library could be replaced if a tooltip system, autocompleter, and a draggable element system were created.

I am not sure how good you are at JavaScript, but if you are interested, please feel free to give me a quote on all or some parts.

(04-27-2010 02:43 AM)jklein Wrote:  Yes, you should most likely sprite your background images. I'm happy to help with that.

That is something I would most definitely like to do. How much would you charge for this? I believe there are about 8 background images.

However, if we can get the menu CSS merged with the main style sheet, the new sprite could be merged with the sprite for the menu.

(04-27-2010 02:43 AM)jklein Wrote:  It's a balancing act, you have a very small amount of CSS so I think you will be better served by keeping it in one file - you would definitely see an improvement by minifying and gzipping this file however.

Gzipping the style sheet is something I am definitely going to do. I have been making a few minor style sheet adjustments recently.

(04-27-2010 02:43 AM)jklein Wrote:  They are looking for the Document Complete event - all of their data comes from people using the Google Toolbar who have opted in to the advanced features.

In the webmaster tools, it is saying my page speed spiked to about 14 seconds so I am really concerned with page speed right now. I have adjusted the robots.txt file so only the main directory is exposed to crawlers. I figured I could find the script causing the page speed to increase drastically by doing that. However, I suppose it would not do that if all their data comes from the toolbar.

Thanks again for the reply and I look forward to hearing from you again.

Sincerely,
Travis Walters
04-27-2010, 05:58 PM
Post: #6
RE: Green-Watch.org Site Optimization
Hello Again,

Few more questions for the community...

(04-27-2010 02:43 AM)jklein Wrote:  They are looking for the Document Complete event - all of their data comes from people using the Google Toolbar who have opted in to the advanced features.

If Google is looking for a document complete event, what would stop a website from doing something like:

TestPage.cfm:

Code:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">

<head>

  <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
  <title>Page Title</title>
  
  <script type="text/javascript">
  var myDirectory = "http://" + document.domain;

  function GetXmlHttpObject()
  {
     var objXMLHttp = null;

     if (window.XMLHttpRequest)
     {
         objXMLHttp = new XMLHttpRequest();
     }
     else if (window.ActiveXObject)
     {
         objXMLHttp = new ActiveXObject("Microsoft.XMLHTTP");
     }

     return objXMLHttp;
  }

  function importPage()
  {
     var xmlCheckHTML = GetXmlHttpObject();

     if (xmlCheckHTML == null) { return; }

     var myElement = document.getElementById("myImportedPage");
     var url = myDirectory + "/testPage2.cfm";

     xmlCheckHTML.open("GET", url, true);

     xmlCheckHTML.onreadystatechange = function()
     {
        if (xmlCheckHTML.readyState == 4 && xmlCheckHTML.responseText != '')
        {
           myElement.innerHTML = xmlCheckHTML.responseText;
           myElement.style.display = "block";
        }
     };

     xmlCheckHTML.send(null);
  }
  </script>
  
</head>

<body onload="importPage();">

  <div id="myImportedPage"></div>

</body>

</html>

TestPage2.cfm

Code:
This is the page contents that should be loaded after the document complete signal is sent. Anything at all could go here...

The document complete event would fire almost instantaneously, after only the initial connection and DNS lookup, and then the entire contents of the page would be loaded after the complete.

Unless I am misinterpreting what you mean, from Google's perspective I would consider this a black hat trick for page speed.

On my home page, I do have images loading after the document complete message gets sent. This improves the user experience in my opinion. Users do not have to wait for these images to download and can still view the rest of the page.

Will Google set up some sort of guidelines to say what can and cannot be loaded after document complete? Where do they draw the line? How do they enforce something like that? People always use black hat tricks to get ahead in the SERPs - I am not one of them.

Thought this would be an interesting discussion especially since Google is starting to factor in page speed in its algorithm, even if it is just 1 percent of the calculation.

Sincerely,
Travis Walters
04-27-2010, 09:15 PM
Post: #7
RE: Green-Watch.org Site Optimization
Hey there,

So I am looking around to see what is available to replace the huge JavaScript libraries on my website.

Autocompleter:

http://www.dhtmlgoodies.com/index.html?w...namic-list

I could probably get that working by myself.

If anybody wants to work on a CSS-based tooltip and/or menu, please let me know. A job opportunity is available here.

Sincerely,
Travis Walters
04-27-2010, 10:16 PM
Post: #8
RE: Green-Watch.org Site Optimization
Two quick suggestions:

1 - Not sure what your skillset is, but you may be able to do the sprites yourself. It doesn't sound like you have a complicated implementation, and SpriteMe may be able to do all of the heavy lifting for you. It is a Firefox bookmarklet that can automatically generate CSS sprites and will even give you the CSS changes to get it working.

2 - You may not need to completely replace your javascript libraries. You could probably get a lot of the benefit just by moving them out of the head, loading the libraries asynchronously and using progressive enhancement to add the functionality to the page.

Here are a few blog articles that may help:

http://www.artzstudio.com/2008/07/beatin...ronous-js/

http://www.stevesouders.com/blog/2009/04...-blocking/
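As a sketch of the asynchronous pattern those articles describe: instead of a blocking script tag in the head, a script element is created and appended at runtime, so the download doesn't block parsing. (The helper takes the document as a parameter only so it can be exercised outside a browser; in a real page it would just be the global document.)

```javascript
function loadScriptAsync(doc, src, onload) {
  var s = doc.createElement("script");
  s.type = "text/javascript";
  s.async = true; // hint to the browser: don't block the parser
  s.src = src;
  if (onload) s.onload = onload; // progressive enhancement hook
  doc.getElementsByTagName("head")[0].appendChild(s);
  return s;
}

// In the page, behavior is then attached once the library arrives, e.g.:
// loadScriptAsync(document, "/javascript/mySiteScript.js", initMenus);
```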
04-27-2010, 11:17 PM (This post was last modified: 04-27-2010 11:18 PM by jklein.)
Post: #9
RE: Green-Watch.org Site Optimization
(04-27-2010 10:16 PM)pmeenan Wrote:  Two quick suggestions:

1 - Not sure what your skillset is, but you may be able to do the sprites yourself. It doesn't sound like you have a complicated implementation, and SpriteMe may be able to do all of the heavy lifting for you. It is a Firefox bookmarklet that can automatically generate CSS sprites and will even give you the CSS changes to get it working.

2 - You may not need to completely replace your javascript libraries. You could probably get a lot of the benefit just by moving them out of the head, loading the libraries asynchronously and using progressive enhancement to add the functionality to the page.

I agree 100% with both of Pat's points - I had no intention of charging you for making sprites. There are good online tools like SpriteMe, and it's fairly easy to just make them yourself with a free tool like Paint.NET. You have few enough background images that this wouldn't take long at all.

The main reason I was advocating a library like YUI is that you can probably get all of the same functionality you have now with much less overall JS. Loading scripts asynchronously would help, but reducing the overall amount of JS will get the page to a fully usable state faster (and YUI has asynchronous loading built in).

As for the Google Page Speed bit: as you say, the robots.txt change has no impact on the load time shown in Webmaster Tools. You are also correct that in theory you could load your entire page asynchronously and thus force the document complete event to fire early. There are two reasons why you likely wouldn't want to do this:

#1 - This would likely increase the overall load time of the page (in real life), since you are creating an artificial HTTP request to fetch the page content, and it would be faster to just load it all on the first page load.

#2 - While Google has said that they are using page speed for search ranking they also say that they "use a variety of sources to determine the speed of a site relative to other sites", so improving your numbers in the webmaster tools report doesn't necessarily help your search ranking, it just gives you junk data.
04-28-2010, 02:02 AM
Post: #10
RE: Green-Watch.org Site Optimization
Hey there,

Thanks for the responses. I have a few more questions.

(04-27-2010 10:16 PM)pmeenan Wrote:  Not sure what your skillset is, but you may be able to do the sprites yourself. It doesn't sound like you have a complicated implementation, and SpriteMe may be able to do all of the heavy lifting for you. It is a Firefox bookmarklet that can automatically generate CSS sprites and will even give you the CSS changes to get it working.

I did create a menu icon sprite for all the 16x16 icons in the JavaScript menu. My skill set is quite broad, but CSS is probably my least favorite thing to work with. I did look at the SpriteMe application. It did not recommend making a sprite out of my background images because some are JPEGs and others repeat themselves. JPEGs can easily be converted, but can a sprite handle repeat-x and repeat-y images?

(04-27-2010 10:16 PM)pmeenan Wrote:  You may not need to completely replace your javascript libraries. You could probably get a lot of the benefit just by moving them out of the head, loading the libraries asynchronously and using progressive enhancement to add the functionality to the page.

When people say JavaScript files should be merged together into one file, do they mean all synchronous JavaScript?

Let me float an idea; tell me what you think.

- Currently all JavaScript is contained in one huge file. It is minified, gzip compressed, and takes about one second to load on your FIOS test. It is contained in the head section, so it does block the page.

- This JavaScript file contains form functions, prototype.js, scriptaculous.js library, prototip library, swfobject, and zapatec.js menu.

- I could setup five functions: importFormJS(), importPrototypeJS(), importScriptaculousJS(), importPrototipJS(), and importZapatecMenuJS()

- In each function I could have an XMLHttpRequest that would request gzipped files of the libraries. The XMLHttpRequest would be asynchronous, so it should not block other resources from being downloaded, correct?

Would this method work to stop (or reduce) blocking? Would creating these requests be a bad thing to do? Maybe I am thinking of the wrong way to do this.

Any information will be greatly appreciated. Thanks again!

Sincerely,
Travis Walters