I hope I’m not re-inventing someone else’s wheel…

While coding a new page design, I finally got fed up with calling external JavaScript code that uses any form of document.write(), since the page won’t “finish” loading from the client’s perspective until the JavaScript has finished downloading & executing.  Placing such scripts in the <head> of the page is even worse, as the page won’t even begin to render until every piece within the <head> has been retrieved.

The new methodology designers are using is to embed AJAX (JavaScript + the XMLHttpRequest object) functions that inject additional page content (by updating the content of tags via .innerHTML or other parts of the DOM), since the page finishes rendering without “lagging” while that code executes.  However, this still doesn’t address the browser having to wait for the library itself to be fetched & loaded in the first place.
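For reference, that pattern in its simplest form looks something like this; “fragment.html” and the “content” div are placeholder names for this sketch, not anything standard:

var xhr = new XMLHttpRequest();  // older IE would need an ActiveXObject fallback
xhr.open('GET', 'fragment.html', true);  // asynchronous request
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    // swap the fetched markup into the page after it has already rendered
    document.getElementById('content').innerHTML = xhr.responseText;
  }
};
xhr.send(null);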

I’m not certain if anyone else is doing this, but I decided to try a new AJAX-based method to further speed up page loading.  The caveat is that a page might finish loading before some JavaScript features have fully rendered, but my feeling is that a faster-loading page is much more acceptable to a visitor than a slower one, even if the enhanced functionality doesn’t appear immediately.

Here is what I’m doing:

Let’s say you have a bit of AJAX functionality inside an external JS file.  You still have to wait for that file to be fetched before the browser starts executing it.  By embedding a simple AJAX function into the HTML itself, the JS file can be fetched via AJAX (at essentially the same speed the browser would fetch it anyway), and multiple JS files can be combined into a single call.  Then I wrap the JS inside <script> tags, embed that inside an empty <div> with an id of “ajax”, and loop through it with this code:

var div = document.getElementById('ajax');
var scripts = div.getElementsByTagName('script');
for (var i = 0; i < scripts.length; i++) {
  // scripts injected via innerHTML don't run on their own, so eval each one
  eval(scripts[i].text);
}
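For completeness, here’s a rough sketch of the fetching side as described above; “combined.js” and the runAjaxScripts() wrapper are placeholders for whatever your page actually uses, and the closing script tag is escaped so the HTML parser doesn’t cut the inline script short:

var xhr = new XMLHttpRequest();
xhr.open('GET', 'combined.js', true);  // one call can return several concatenated files
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    // wrap the fetched JS in <script> tags inside the empty div;
    // innerHTML won't execute it, which is exactly why the eval loop exists
    document.getElementById('ajax').innerHTML =
      '<script type="text/javascript">' + xhr.responseText + '<\/script>';
    runAjaxScripts();  // placeholder for a function wrapping the loop above
  }
};
xhr.send(null);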

What this effectively does is let the browser consider the page finished loading without waiting on external JS files.  The gain may be bigger on older browsers, but even on Firefox 3.5 and Internet Explorer 8 I’ve noticed an improvement.  It also dodges an issue where Firefox pages lag when the Avast virus scanner is running on the client computer.

The main thing still to test is caching, as this method would be moot if the browser re-fetched the JS file on every page load; however, I believe all modern browsers treat an XMLHttpRequest like any other request and will cache the result.  If you don’t want caching (say, your JavaScript is generated dynamically on the server by a CGI), just append a random string as a URL parameter to force a fresh request every time.
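That cache-buster is a one-liner; the “nocache” parameter name here is arbitrary, and a timestamp works as well as a random string:

// every request gets a unique URL, so the browser can't reuse a cached copy
xhr.open('GET', 'dynamic.js?nocache=' + new Date().getTime(), true);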
