Making your web pages fast (part three)

Now that we’ve seen that it’s perception that defines how your users grade the speed of your webpages (although spending a good deal of time on speeding things up in an absolute sense won’t go amiss either), and now that we know how to analyze the network traffic that goes into displaying your pages (one, two), it’s time to look for solutions to the performance issues we saw.

Finally, the fruits of all this work!

Number of files

A simple one, this; it even belongs in the duh! category. Reduce the number of files that your markup requires. As I said, simple.

OK, a bit more information. My classic car website required 5 CSS files to be downloaded, all from my domain. All of them were marked as media="all", so they can all be concatenated, one onto another in the same order, with no ill effects. By doing so, I am (A) reducing the work that my web server needs to do (each visitor downloads just one file instead of issuing five requests at the same time), and (B) reducing the work that the browser has to do, especially if it batches requests. A win all round. The only issue is that I have to incorporate a concatenation process into my web deployment script, and that is, after all, very simple.

Next, my webpage requires 14 JavaScript files to be downloaded. So, instead of 14 batched requests, I could have a single request for one concatenated file.

There are a couple of issues to bear in mind here. The first is that by concatenating you are giving up some caching that would otherwise happen for free. In my example from last time, the jQuery script file was being downloaded from a well-known CDN URL. Obviously, if I visit a lot of sites that also download that same file from the same URL, the file will be cached in the browser; it’ll only be downloaded once. If, however, I concatenate it into a big ol’ JavaScript file for my site, I will only benefit from caching once the reader has visited my webpage at least once. This is one of those things you will have to experiment with and measure, to be honest.

The next one is a little more obscure. We can call it the Strict Mode problem. Strict mode? These days we are encouraged to use the "use strict"; pragma in our code. This tells the browser’s JavaScript interpreter to use Strict Mode when parsing and executing the code. Strict mode helps developers out in a few ways: it turns certain silent failures into thrown errors, it stops you accidentally creating global variables by mistyping a name, and it disallows some of the language’s more error-prone features.

There are two scopes to strict mode: the whole file – what you might call Global Strict Mode – and per function. The issue with creating concatenated JavaScript is that some script file of the set may invoke Strict Mode globally, meaning that every other script file after it will need to obey Strict Mode or face the consequences. You may find that the unconcatenated JavaScript works fine, but the concatenated JavaScript does not. So, test. Mind you, these days, the recommendation is to only apply strict mode at a function level, not at the file level, but please make sure.
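Here’s a contrived sketch of the problem (the file names and code are invented purely for illustration): a file-level "use strict"; at the top of the first file in a bundle ends up governing everything concatenated after it.

// a.js – first file in the bundle; opts into strict mode for the whole
// file and, once concatenated, for everything that follows it.
"use strict";
var widgets = [];

// b.js – second file; worked fine on its own because the undeclared
// assignment quietly created a global...
counter = 0;   // ...but under the inherited strict mode this throws a ReferenceError

// Safer: scope strict mode to a function (or an IIFE), so concatenation
// order can't change its reach.
(function () {
  "use strict";
  var counter = 0;   // declared, so fine either way
}());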

Again, all this solution requires is that you have a special concatenation task in your deployment script.
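As a rough illustration, here’s a minimal Node.js sketch of such a task; the file names are placeholders, and a real build step would add error handling (and probably minification too).

// concat.js – a minimal sketch of a build-time concatenation step (Node.js).
// The file names are placeholders; list yours in the order the pages expect them.
var fs = require("fs");

function concat(inputs, output, separator) {
  var combined = inputs
    .map(function (name) { return fs.readFileSync(name, "utf8"); })
    .join(separator);
  fs.writeFileSync(output, combined);
}

// CSS files can simply follow one another...
concat(["reset.css", "layout.css", "theme.css"], "site.css", "\n");

// ...whereas a stray ";" between scripts guards against a file that
// ends without a terminating semicolon.
concat(["jquery.js", "plugins.js", "app.js"], "site.js", "\n;\n");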

Size of files

Another simple solution: reduce the size of the files that have to be downloaded. Reduced file sizes mean reduced download times. For JavaScript and CSS files, the easiest way to reduce the file sizes is to minify them. Or, equivalently, use the minified version of any library files that you are using.

What minification does is to remove unnecessary whitespace (and comments) and to rename private identifiers to smaller names, usually one character long. By doing this, some serious reductions in file sizes can be achieved. As an example, unminified jQuery 1.11 is 277KB, whereas the official minified version is 95KB, a third the size. Obviously if you are going to be minifying your JavaScript, you should do your debugging with the unminified code, because doing it the other way round is an exercise in complete frustration.
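To make that concrete, here’s a contrived before-and-after (the function is invented for illustration, and the exact output depends on the minifier you use):

// Before: readable source with comments and descriptive names.
function calculateTotalPrice(unitPrice, quantity, taxRate) {
  // taxRate is expressed as a fraction, e.g. 0.2 for 20%
  var subtotal = unitPrice * quantity;
  return subtotal * (1 + taxRate);
}

// After (roughly what a minifier emits): whitespace and comments gone,
// local identifiers shortened to single characters.
function calculateTotalPrice(a,b,c){var d=a*b;return d*(1+c)}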

You can also minify your HTML, but, to be honest, I don’t know of many sites that do. After all, a lot of CMSs are going to be generating the page’s HTML on the fly from the content and a template; it’s not static HTML by any means. You’d therefore have to add a minification step to your web server’s codebase so that every page is minified just before being sent to the browser. The small saving in transit time to the browser is probably not worth the extra processing time needed on the server for each page.

There are many open-source minifiers out there. Personally I use the YUI Compressor, but the big problem with it is that it’s written in Java, so you need to have a Java runtime installed as well. .NET developers would probably go for the Microsoft Ajax Minifier (AjaxMin).

Obviously, if you are going to be minifying your code and CSS, it makes sense to concatenate them afterwards as well to produce a single minified CSS file and a single minified script file.

Optimizing images

Once you have minified and concatenated your code, don’t stop there. One extra trick that not many people bother with is optimizing your images.

The very first optimization is to add width and height attributes to the IMG tags in your page markup. This simple change has one huge benefit: the browser can kick off the request to download an image while already knowing how much space the image will occupy on the page. It can reserve that space in the rendered display for when the image finally arrives. The big bonus here is that the browser doesn’t have to wait for the image before continuing to render the page, nor does it have to “guess” the image size (say, by using the space occupied by the default “cannot load image” icon) and then re-render the page once the real size is known. To the user, the page just loads more smoothly (and hence is perceived as quicker). Luckily, most blog post editors will calculate the image sizes for you and insert them into the markup (Windows Live Writer, which I use for all my blogs, does this).
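In the markup it’s as simple as this (the file name and dimensions are made up for the example):

    <!-- width and height (in pixels) let the browser reserve space
         for the image before it has actually downloaded -->
    <img src="/images/classic-car.jpg" width="640" height="480"
         alt="A classic car at a show">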

Optimizing JPG photos? Not really. The JPEG format is lossy, so, in theory you’d be accepting a “fuzzier” image for a reduction in file size.

Optimizing PNG images? Now there’s a real possibility for improvement: PNGs use lossless compression rather than JPEG’s lossy compression, so all you have to do is tweak the DEFLATE compression “knobs” to find a better compression of a particular image. Yes, the DEFLATE algorithm is tunable to a certain extent. One of the better algorithms for compressing PNG files is PNGOUT by Ken Silverman. I use a commercial version of it called PNGOUTWin. There are others; TinyPNG, for example, has a plug-in for Photoshop.

Scripts in the markup

As it happens, last time we discussed one of those performance gains: putting your script elements at the end of the markup, just before the </body> tag.

The reason for doing this is that, unless we explicitly mark the script with the async attribute in the markup, the browser will stop parsing the markup, request and download the script (if it’s external), then compile and execute the code. For an external script, that means issuing a DNS request to get the IP address, constructing a request packet and sending it to that address, waiting for and downloading the response (which presumably contains the script file), compiling the script, and executing any code that runs immediately, all before parsing of the markup can continue. These types of script elements are known as blocking scripts because, well, they block the browser from doing what the reader wants: displaying content. And note that even if your script looks like this:

// jQuery shorthand: the callback only runs once the DOM is ready
$(function ($) {
  "use strict";
  // code code code
});

…it’s still going to have to execute that call to the jQuery $(document).ready() function.

(As an example of what I’m talking about here, Steve Souders has created a simple webpage where it takes a full 10 seconds to download the script file (select “Rule 6” then “Example 2” to see it in action). You can view the page with the script element in the head, and with it at the bottom of the markup. You can observe directly what I’m talking about when I discuss perceived performance.)

How to avoid these blocking scripts? By far the easiest option is to put them at the end of the markup as I discussed. That way they continue to block, sure, but only after the content has been displayed to the reader. This is one of those perceived performance improvements: overall the time taken to fully display the functioning page has not changed, it’s just that the reader is unaware of it because they can start scanning/reading/interacting with the page earlier.
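In other words, the skeleton of the page ends up looking something like this (the content and file name are invented for the example):

    <body>
      <h1>My classic car</h1>
      <p>Lots of lovely content that renders before any script runs.</p>

      <!-- scripts last: they still block, but only after the content is visible -->
      <script src="http://example.com/js/site.js"></script>
    </body>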

Another option for current browsers (and by that I mean anything reasonably modern, desktop or mobile, although for IE it means IE10 or later) is to mark the script elements as async. This is an HTML5 attribute, so you won’t find it in older markup (but then again, you are using HTML5, right? Right?)

    <script async src="http://example.com/js/utility.js"></script>

What this does is to instruct the browser to initiate a download, but that it’s not required to block until the script has downloaded. Sounds ideal but it comes with its own caveats. Say, for example, I have script A and script B, with B building on something in A. If they’re blocking scripts, the browser will execute them in the order you specify:

    <script src="http://example.com/js/A.js"></script>
    <script src="http://example.com/js/B.js"></script>

If, however, I mark both as async, then I no longer know in which order the scripts will be downloaded and executed; B could be downloaded and executed before A, for example. So I have to alter B in some way to ensure that A is present and correct (by using RequireJS, for example).
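To show what I mean, here’s a crude sketch of the kind of guard B would need (the global name is invented for illustration; a loader like RequireJS does this job properly):

// B.js – don't touch A's (hypothetical) global until A.js has actually run
(function waitForA() {
  if (window.MyLibraryFromA) {
    window.MyLibraryFromA.init();   // safe: A.js has executed
  } else {
    setTimeout(waitForA, 50);       // A.js hasn't arrived yet; try again shortly
  }
}());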

Another option, say for smaller scripts, is to inline the script into the markup. Yes, I just told you to copy and paste. Horror! As software developers we know in our bones how this can go wrong: you want to make a change to the script and now you have to track down … every … single … webpage into which it was inserted. Forget one and boom! your site is dead. Let’s just say, I’ve never reached this particular point in my web development career.
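For completeness, inlining just means the code sits in the page itself, something like this (the snippet is invented for illustration):

    <script>
      // no extra request, but every page carrying a copy must be edited
      // whenever this code changes
      document.getElementById("copyright-year").textContent =
        new Date().getFullYear();
    </script>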

Domain sharding

If you recall, in the previous installment we observed batching in the file download requests, at least in Firefox. The browser was issuing requests in batches of five or six at a time and would only release the next batch once the previous one had completed. This is done for two reasons: to limit the resources needed in the browser and to limit the load on the server. After all, for a popular website the number of requests coming in at any time could be large, so batching helps keep the server efficient, perhaps at the cost of making things slightly slower at the browser.

One way to make batching work for you at the browser is to separate the files that need to be downloaded onto different domains. We saw that earlier with the webpage downloading jQuery from a CDN rather than the website’s domain. At first glance, such a strategy – storing your files on several hostnames – might speed things up, especially for webpages that have a lot of files to download (we assume that you haven’t concatenated your scripts and CSS). This strategy is known as domain sharding.
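Sharded markup looks something like this (the hostnames are made up purely for illustration):

    <link rel="stylesheet" href="http://static1.example.com/css/site.css">
    <script src="http://static2.example.com/js/site.js"></script>
    <img src="http://images.example.com/photos/classic-car.jpg"
         width="640" height="480" alt="A classic car">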

In reality, it seems that having more than, say, three hostnames doesn’t really speed anything up at all. For each new domain, the browser must do a DNS lookup to find the server’s IP address, and that takes time too. Better to put as much as you can on the main hostname (only one DNS lookup required!), with images stored on some other service (say, AWS), and possibly getting your JavaScript libraries from their respective CDNs. To be brutally honest, though, I don’t see much benefit these days from domain sharding; maybe in earlier years it was more important than it is now. By all means try it out, but I think you’ll find the other suggestions here far more important and effective.

Summary

To improve the perceived performance of your webpages, try these recommendations: reduce the number of files your pages need by concatenating your CSS and your JavaScript; reduce the size of those files by minifying them (or by using the minified versions of the libraries you rely on); optimize your images, both by adding width and height attributes to your IMG tags and by recompressing your PNGs; move your script elements to the end of the markup, or mark them as async; and be circumspect about domain sharding.

With those easy-to-make changes, you’ll find your readers will thank you for the speedier interactions with your site.


Now playing:
Incognito - Pieces Of A Dream
(from Best Of/20th Century)
