When building a site, I try to make pages as small as practicable. The aim is to reduce load times for visitors and so provide a better experience as they move around the site.
There are several reasons for this:
1. There is a correlation between the speed of the site and the perceived credibility of the company and the content.
2. Many New Zealanders are still on dial-up.
3. Bandwidth costs money, which is always scarce (bandwidth is sometimes scarce too).
4. Serving smaller pages puts less load on servers.
I have just made some changes to www.radionz.co.nz to improve page load times and reduce bandwidth.
The first of these changes was to stop using sIFR. Scalable Inman Flash Replacement (to quote the site), “…is meant to replace short passages of plain browser text with text rendered in your typeface of choice, regardless of whether or not your users have that font installed on their systems.”
The advantage of using sIFR over the traditional method of using images for headings is that if any text is changed, or new pages are added, the text is automatically replaced. We add dozens of new pages every day, so this is a big time-saver.
As the site has grown in size and the number of visitors increased, the 42 KB download and the slower rendering time started to annoy me. Even when content on a page didn’t change and was cached in the user’s browser, there was still a delay while the headings were replaced.
Lastly, the typeface did not have any macronised vowels, so it was not possible to correctly set headings in Māori.
So last week I removed sIFR from the site. It was a very tough call, as the sIFR-replaced headings looked really good and added a certain polish to the site. But with any change you have to weigh all the pros and cons, and at this time the benefits to end-users were overwhelming. (There are also some other changes that I’m making in the near future that’ll be simpler without sIFR, but more about that later.)
Upon removal, the page rendering improvement was immediately obvious on broadband, and I suspect that on dial-up it will be even more marked.
The other side-effects of this change are slightly reduced server loading (from fewer connections) and a reduction in the amount of bandwidth used by around 800 megabytes per day. (We shift about 8 gigabytes of page traffic a day. The audio is many, many times this figure).
The second change was to serve the mootools JavaScript library from Google’s servers rather than our own. There are a number of advantages in doing this, summarised nicely by Steve Souders. In a nutshell, the Google servers are optimised to minimise the size of the content, and the response headers are set to encourage caching of the content at the ISP and in the browser. It works even better if other sites use the same library, because it increases the likelihood that the file is already cached somewhere, spreading the benefits more widely.
I could have made these changes on our own server, but it doesn’t cost anything to support the initiative, so why not? I don’t know how many other NZ sites use mootools, but a lot of the bigger sites use Prototype, and they could benefit from better site performance, lower bandwidth use, and improved user experience by adopting this approach.
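For reference, the switch is a one-line change to the page template. The version number and filename below are illustrative of the Google-hosted library URLs, not necessarily the exact ones in use on radionz.co.nz:

```html
<!-- Before: mootools served from our own server -->
<!-- <script type="text/javascript" src="/js/mootools.js"></script> -->

<!-- After: the same library fetched from Google's servers, which serve it
     compressed and with headers that encourage long-lived caching -->
<script type="text/javascript"
        src="http://ajax.googleapis.com/ajax/libs/mootools/1.2.1/mootools-yui-compressed.js"></script>
```

If a visitor has already fetched that exact URL on any other site using it, their browser never requests it from us at all.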
The difference in rendering is quite noticeable, and on slower connections you can see the scripts continuing to download after the current page is showing.
In the case of the Radio New Zealand site we’ve reduced the rendering time for pages by 3–4 seconds and trimmed bandwidth consumption by about 10%. The changes took 3 hours to plan, test and implement.
At the rate we consume traffic, the payback period is pretty short. Add to that the future savings from not having to replace the servers quite so soon, and the unmeasurable benefit of delivering pages faster to visitors (who don’t use as much of their data caps), and I’d say it was time well spent.
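As a quick back-of-the-envelope check of those figures, using only the numbers quoted above (the monthly total assumes a 30-day month):

```javascript
// Figures from the post: ~8 GB of page traffic per day,
// ~800 MB per day saved after the changes.
const dailyPageTrafficMB = 8 * 1024; // ~8 GB/day, expressed in MB
const dailySavingMB = 800;           // ~800 MB/day saved

// Fraction of daily page traffic saved
const savedFraction = dailySavingMB / dailyPageTrafficMB;
console.log(`Saving: ${(savedFraction * 100).toFixed(1)}% of page traffic`);
// ...which lines up with the "about 10%" quoted above.

// Over an assumed 30-day month, that adds up to:
const monthlySavingGB = (dailySavingMB * 30) / 1024;
console.log(`~${monthlySavingGB.toFixed(1)} GB saved per month`);
```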
One thought on “Improving page load speeds”
And another reason: Google’s AdWords (and apparently PageRank) algorithms take page load time into account. See the article at http://www.seroundtable.com/archives/017093.html. In other words, slow sites are penalised!