WebPagetest Forums

Is it possible to run ENTIRE site (not just images, etc.) via CDN?
I know some CDN providers offer expensive Dynamic Site Acceleration products that serve entire sites, not just static files. But I wonder if a CDN could be configured so that even the base HTML file is served via the CDN rather than from the origin server. A DNS lookup would be saved, and the connect and download times could potentially also be faster, especially in geographic locations far from the origin server.

Would it not be as simple as specifying my dedicated IP as the origin server, and then setting up a www CNAME (instead of the typical cdn CNAME) as an alias to my CDN account?
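As a sketch, the DNS setup you describe might look something like this in zone-file form (all hostnames and the CDN endpoint here are placeholders, not a real provider's values):

```
; origin stays reachable directly so the CDN can pull content from it
origin.example.com.  IN  A      203.0.113.10

; www is aliased to the CDN endpoint, so even the base HTML
; is requested from the CDN edge rather than the origin
www.example.com.     IN  CNAME  example-account.cdn-provider.net.
```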

I don't know much about these things, and that's why I'm most likely overlooking some obvious reason why this wouldn't work. But I've always meant to ask...
It depends on the cacheability of your HTML. If you generate unique content for every user (or even detect logged-in state by cookie) then you can't, but if the content is the same for all users, even for short periods, then absolutely (and just like you planned).

You would need to add caching headers to your HTML responses for it to be useful, but even short lifetimes (1-5 minutes) can have a big impact on a busy site.
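For example, a short-lived cache header on the HTML might look like this (an nginx sketch; the 5-minute lifetime is just illustrative):

```
location / {
    # "public" lets shared caches (the CDN edge) keep the HTML;
    # max-age=300 allows it to be reused for up to 5 minutes
    add_header Cache-Control "public, max-age=300";
}
```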
I've been talking with Akamai recently and they have a service that is different from whole-site delivery. From my understanding, you do some DNS manipulation that causes a user's request for a page on your site to be directed to Akamai; they then grab the data from your site (the initial page code, for example) and serve it through an optimized channel on the Akamai network. That may be something worth checking into. Though it does not cache the page code, it just optimizes the delivery path.

p.s. On a related note, you can expect a comparison between MaxCDN and Akamai very soon. We're working on integrating Akamai with our site, and when we do it should be pretty easy to switch back and forth between the two for testing. Should be interesting to see how they compare.
Sounds like DSA. Joshua Bixby (from Strangeloop) had a pretty good writeup on it here: http://www.webperformancetoday.com/2010/...eleration/

The main "benefit" is that they run the traffic back over tweaked network connections (that have been warmed up) and the front-ends generally have large initial TCP congestion windows (much like Linux 2.6.39). It's usually pretty freakishly expensive for what it is actually doing but for situations where the content is dynamic there really isn't much that you can do.

If you don't mind building vendor-specific functionality, they also have ESI support (Edge Side Includes), where you can basically cache a "template" at the edge and only the dynamic pieces are fetched at load time (Varnish also provides a similar capability).
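As a rough illustration of the ESI idea, a template cached at the edge embeds only the dynamic fragment, which is fetched separately (the fragment URL here is a placeholder):

```
<!-- template cached at the CDN edge -->
<html>
  <body>
    <h1>Product page</h1>
    <!-- only this fragment is fetched at load time;
         the rest of the page is served from the edge cache -->
    <esi:include src="/fragments/cart-status" />
  </body>
</html>
```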
Jarrod, unless there is a very inexpensive reseller of the Akamai service, I can only envy... :-)

Pat, my site does serve different content to logged-in users, so caching the HTML would likely not be possible, but if everything else works, I think (I hope) it could still improve the site's performance. Perhaps this is something I should test one of these days.
Pat, don't worry, I was only mentioning it to make others aware of its existence. Personally, it is not something I see as too useful for our business. Akamai initially tried to sell it to me for a "mere" $3,000 a month (i.e., $36,000 a year). We serve customers worldwide, but the benefit isn't worth the price. For now we're just sticking with their object delivery.
Marvin, don't envy us yet! I had them put a clause in the contract so that we can cancel the service within 90 days without issue. I plan on using that time to do plenty of testing to see if it really is worth the higher price. So, the envy will have to wait until the testing ;-)
If the content can't be cached then you might benefit a bit by avoiding the additional DNS lookups but there will also be a penalty for routing the base page through their edge servers (so the first byte times will be worse).
@jarrod, when you do the eval, make sure to use real user performance data (analytics) and not a backbone synthetic test (Keynote, Gomez, etc).

I'm sure you're probably aware, but the CDN providers are known to co-locate their edge nodes in the same facilities as the test agents for the testing services so things will look artificially fast unless you are using a last-mile product to do the testing (or using real data).
Yeah, I was going to use various webpagetest.org locations, as well as on-site analytics. The main figure I'll watch is the time to first byte, along with consistency across locations. Our current CDN is actually a bit inconsistent, one of the things that prompted us to look elsewhere.
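For anyone scripting their own spot checks alongside WebPagetest, here is a minimal sketch of timing the first response byte over a raw socket (`measure_ttfb` is a hypothetical helper written for illustration; host, path, and port are placeholders):

```python
import socket
import time

def measure_ttfb(host, path="/", port=80, timeout=10):
    """Connect, send a minimal GET, and time the first response byte.

    Returns (connect_time, ttfb) in seconds. Illustrative only; real
    tools like WebPagetest also account for DNS and SSL negotiation.
    """
    start = time.time()
    sock = socket.create_connection((host, port), timeout=timeout)
    connect_time = time.time() - start

    request = (
        "GET {path} HTTP/1.1\r\n"
        "Host: {host}\r\n"
        "Connection: close\r\n\r\n"
    ).format(path=path, host=host)

    send_start = time.time()
    sock.sendall(request.encode("ascii"))
    sock.recv(1)  # blocks until the first byte of the response arrives
    ttfb = time.time() - send_start
    sock.close()
    return connect_time, ttfb
```

Running this from several vantage points against the CDN-fronted hostname versus the origin would show whether the edge actually helps first-byte times for the base page.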
The change has finally been implemented. Now the entire www sub-domain, not just static files, is on my CDN. I see a small but measurable improvement in both start render and load times, even in geographically close locations. In some far-away locations, where my DNS lookup times were pathetically slow, the improvement is nothing short of dramatic. For example, in Australia, the best load times for my home page used to be around 1.4 to 1.5 seconds. Now I ran 10 tests, and they all came in at around 875 milliseconds.