In the previous episode of this series I discussed why you might want to speed up your web pages and how it is more about perceived performance than absolute performance. However, this optimization, as with anything, comes with a cost. If you have a site that receives only occasional use, then maybe you don't want to overdo the time and effort that these performance optimizations might entail. Or maybe what I'll be describing doesn't go far enough, in which case I hope the analysis side of things helps you more.
Anyway, the first thing that needs to be done is to analyze the web page with the developer tools that come with your browser. Me, I prefer Firebug in Firefox for this work, but you can use Chrome's tools or even the new ones in Microsoft Edge. Whichever you choose, you need to open up the tool that displays and times the requests and responses initiated by the web page.
Once the developer tools are open, load (or reload) the page you wish to analyze. Unfortunately there's another catch here: ideally you want to clear the cache for that particular page. (For Firefox: load up the page so it’s first in your history list, then go to the History list, click Show All History, right-click the page you just loaded and select Forget About This Site.) Different browsers do this in different ways, but I've been known in the past to clear the entire cache for the browser as a whole. Nuke it from orbit, in other words; just to make sure it's all gone.
Then, it's the moment of truth: refresh the page. Let the refresh complete and you will have a display that looks something like this (click to enlarge, although it might be better to right-click and open in its own window):
This network analysis is for the blog I have for my Volvo 1800S. The content is served from an instance of GraffitiCMS on shared GoDaddy hosting. I purchased a theme for the look and feel. I'll note that the theme's template is not particularly optimized, which is great for this series: we get to see how it could be improved.
Let's look over these results. The very first item is the request for the main page itself; every other request is going to be triggered by something in that page's markup: CSS files, script files, images, and so on. So the very first possible optimization you could make is to improve that initial response. If your pages are served up from some CMS (Content Management System) or blog engine (like my pages here are), then they are probably generated on the fly by inserting content into one or more templates. Can that process be improved? Maybe it's a case of needing a faster web server? No matter what, if that initial response takes five or more seconds to arrive, you are going to have real issues producing a fast web page, perceived or not.
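As a rough sketch of how you might put a number on that initial response, modern browsers expose the Navigation Timing API, and the waiting time is essentially the gap between sending the request and receiving the first byte. The helper below is my own illustration (not part of any library), fed with an invented sample entry; in a real browser console you could pass it `performance.getEntriesByType('navigation')[0]` instead.

```javascript
// Sketch: estimating time-to-first-byte (TTFB) for the main document,
// using Navigation Timing-style field names. The sample entry is invented.
function ttfb(entry) {
  // requestStart: when the request was sent; responseStart: first byte back
  return entry.responseStart - entry.requestStart;
}

const sample = { requestStart: 120, responseStart: 217 }; // milliseconds
console.log(ttfb(sample) + "ms"); // prints "97ms"
```

A TTFB measured this way is dominated by server think-time (template processing, database queries) plus one network round trip, which is why a slow CMS shows up here first.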
For me, on my Volvo blog, the page itself is being returned really quickly: waiting 97ms (purple bar) with an almost instantaneous receive time (too thin a green bar to see) isn’t too shabby at all. Nothing much to optimize there.
Then notice that nothing happens for a few milliseconds. What is actually happening in the background is that the browser is starting to parse the HTML returned. It’s going to be building up a DOM (Document Object Model) in memory representing the elements and their relationships in that HTML. Along the way it’s going to identify other files that have to be downloaded for the webpage and it will start those downloads.
The next thing to notice about this network display is that it shows a waterfall: one file is downloaded, then another, then another, each starting a little later than the one before, with the timeline progressing towards the right. In Firebug, a purple color in a file download timeline indicates the time spent waiting for a response, and green indicates the receiving time. The grey at the beginning of a file's timeline is the "blocking time": the time during which the file has been identified as needing to be downloaded, but the browser is too busy doing "other stuff" and so delays initiating the download. Hence, for each file download in Firebug, you'll see a grey segment (which may be so thin as to be invisible), a purple segment, and a green segment (which for small files may again be very thin).
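Those three segments map neatly onto fields that today's Resource Timing API exposes for every download. The function below is a sketch of that mapping, using my own invented sample data; the field names (`startTime`, `requestStart`, `responseStart`, `responseEnd`) are the real API's, everything else is illustrative.

```javascript
// Sketch: deriving Firebug's grey/purple/green waterfall segments from
// Resource Timing-style fields. All times are in milliseconds.
function segments(entry) {
  return {
    blocking:  entry.requestStart - entry.startTime,     // grey: queued in the browser
    waiting:   entry.responseStart - entry.requestStart, // purple: server think time
    receiving: entry.responseEnd - entry.responseStart   // green: download itself
  };
}

// Hypothetical CSS file: identified at 150ms, requested at 160ms,
// first byte at 240ms, fully received at 250ms.
const css = { startTime: 150, requestStart: 160, responseStart: 240, responseEnd: 250 };
console.log(segments(css)); // { blocking: 10, waiting: 80, receiving: 10 }
```

In a browser console, `performance.getEntriesByType('resource')` would give you real entries to feed through this, one per downloaded file.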
The thin blue vertical line marks the moment the DOM content has been loaded (essentially, the markup minus images, etc); the red one marks when the page Load event is fired. In this case, the onload event fires at 2.42s, which again isn't too shabby. Room for improvement, certainly, but it's not the end of the world if nothing is done (the number of readers of this site is on the order of tens a day).
The next thing to observe is a little subtle. Look at the start of the purple "waiting" bars, ignoring the grey "blocking" segments. The second to seventh file downloads on the timeline all start waiting at the same time. The first is a webfont from Google; the other five are CSS files from the same domain as the webpage. If you look carefully, this same batching occurs elsewhere as well: files seem to be downloaded in batches of five or six. This batch size is browser-dependent, by the way; in my experience Firefox certainly seems to use six-at-a-time batches.
Another small point to notice here: jQuery is downloaded from the ajax.microsoft.com CDN (Content Delivery Network) and although it is a smaller file than shortcode.css, it’s taking longer to download (the length of the green bar). That’s possibly a hint I should host my own copy of jQuery rather than using the CDN.
Notice also that when the browser's parser reaches a script element, it stops: the script has to be downloaded and executed before parsing continues. (Unless the script element is marked async so that it doesn't happen immediately. I've yet to see that in the websites I look at.) The browser has to pause: after all, the script may want to change the DOM – add some elements, change some styles – before the rest of the markup is parsed.
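For reference, the difference is just one attribute in the markup. A sketch (the script paths here are hypothetical):

```html
<!-- Parser-blocking: the browser stops parsing until this has
     downloaded and executed. -->
<script src="/scripts/site.js"></script>

<!-- Marked async: downloaded in parallel with parsing, executed
     as soon as it arrives. Only safe for scripts that don't care
     when they run relative to the rest of the page. -->
<script async src="/scripts/analytics.js"></script>
```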
But that can wait until the next post in this series. Until then have fun analyzing your webpages’ network traffic.