We’re all seeking great performance on our websites. Fewer requests and quicker load times are a must in the mobile era. I just gave my site a refresh, and through some simple cache and gzip tweaks I was able to cut its load time in half.
I decided to use two separate speed tests for this: WebPagetest and Chrome’s Audit panel. Let’s check them out:
Here are the recommendations from WebPagetest. If you check out the link, you can see that I’m getting a pretty good waterfall (aside from my huge JS file; we’ll get to that later). My main problems seem to be gzipping during file transfer, and caching.
Ok, not too bad. Let’s try the Chrome audit:
Looks to be mostly the same callouts, with a few small additions. It’s asking that I leverage proxy caching, which the cache fixes below will take care of. Now that we can see the problems, let’s get on with the solutions.
HTML5 Boilerplate and .htaccess
The two main fixes we are going to do here are Apache configurations. Well, I’m not exactly an infrastructure guru, but luckily the great folks working on HTML5 Boilerplate are! We are going to grab a few pieces out of their .htaccess file, and fix our low ratings from earlier.
At its simplest, the .htaccess file is a hidden file at the root of your website that houses web server configuration. You can read more about it here. One other note: if you have access and are comfortable with SSH, you can also make these changes in the main server config file (usually called httpd.conf). That has the added benefit of being a bit faster for the web server, since Apache reads httpd.conf once at startup rather than checking .htaccess on every request. For this exercise, however, we’ll be using .htaccess.
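First up is compression. The snippet isn’t reproduced here in full, but a trimmed sketch of the relevant portion of HTML5 Boilerplate’s .htaccess (the exact MIME-type list varies by version) looks something like this:

```apache
<IfModule mod_deflate.c>

    # Force compression for mangled `Accept-Encoding` request headers
    # (the proxy fix mentioned below)
    <IfModule mod_setenvif.c>
        <IfModule mod_headers.c>
            SetEnvIfNoCase ^(Accept-EncodXng|X-cept-Encoding|X{15}|~{15}|-{15})$ ^((gzip|deflate)\s*,?\s*)+|[X~-]{4,13}$ HAVE_Accept-Encoding
            RequestHeader append Accept-Encoding "gzip,deflate" env=HAVE_Accept-Encoding
        </IfModule>
    </IfModule>

    # Compress all output labeled with one of the following MIME-types
    # (on Apache 2.3.7+, Boilerplate wraps this in a mod_filter.c check)
    AddOutputFilterByType DEFLATE application/javascript \
                                  application/json \
                                  application/rss+xml \
                                  application/xhtml+xml \
                                  text/css \
                                  text/html \
                                  text/plain \
                                  text/xml

</IfModule>
```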
As noted in the Chrome audit, we can reduce file transfer size by almost two-thirds by compressing them with Gzip. Essentially, my understanding of the code above is that we are checking for gzip support, and if we have it, we will run it during the transfer of any files with the MIME-types specified (hence the long list).
A couple of callouts: I’d check your version of Apache first. Mine is below 2.3.7, so I was able to remove the check for mod_filter.c. Secondly, as noted in the code above, there is a pretty sweet fix for proxies and the like that mangle the Accept-Encoding header and block gzipping. You can read more here.
This code will also live in .htaccess. It tells the browser when to ask the server for a fresh copy of a file, and when to serve it from the browser cache, which is dramatically faster.
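Again, a trimmed sketch of the expires section from HTML5 Boilerplate’s .htaccess (the specific cache times here are illustrative; tune them to your own site):

```apache
<IfModule mod_expires.c>

    ExpiresActive on
    ExpiresDefault                          "access plus 1 month"

    # HTML: always revalidate, so content changes show up immediately
    ExpiresByType text/html                 "access plus 0 seconds"

    # CSS and JavaScript
    ExpiresByType text/css                  "access plus 1 year"
    ExpiresByType application/javascript    "access plus 1 year"

    # Media
    ExpiresByType image/gif                 "access plus 1 month"
    ExpiresByType image/jpeg                "access plus 1 month"
    ExpiresByType image/png                 "access plus 1 month"

</IfModule>
```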
In the code above, you’ll notice that like the Gzipping, we are setting these rules based on MIME-type. Feel free to add/remove as you need to. The headers are set based on access plus cache time. This means cache time will kick off once the user hits that file. If cache is set to an hour, cache will work like so:
- 10am: User hits the file: latest file is pulled from the server.
- 10:12am: User hits the file again: served from the cached version.
- 11:01am: User hits the file again: latest file is again pulled from the server.
In the instance above, the request served from cache has benefits and downsides. The benefit is that grabbing from cache is crazy fast, as the browser has the data stored locally and does not need to ping the remote server. The downside, however, is that if any changes are made to that file, the user will not see them until 11am, when the cache expires and they again ping the server.
There are ways to bust cache when you make file changes, which gives you both the caching benefits, and the ability to make quick updates to your site. I am doing this with a combination of a Grunt task, and filename-based cache busting, which I will cover in my next article.
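I’ll save the Grunt side for that article, but to sketch the server half of filename-based cache busting: you reference a versioned filename in your markup (something like main.20140101.js; the version number here is a hypothetical example), then rewrite it back to the real file on disk. Changing the version in the filename forces browsers to fetch the new file, since it’s a brand-new URL:

```apache
<IfModule mod_rewrite.c>
    RewriteEngine on

    # If the versioned file doesn't literally exist on disk...
    RewriteCond %{REQUEST_FILENAME} !-f

    # ...strip the version segment: /js/main.20140101.js -> /js/main.js
    RewriteRule ^(.+)\.(\d+)\.(js|css)$ $1.$3 [L]
</IfModule>
```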
Now for the fun part: the fruits of our labor! As you can see, we are getting much better results from our WebPagetest run. The difference between first view and repeat view is astonishing:
- First: Document complete time: 1.498s, Requests: 17
- Repeat: Document complete time: 0.556s, Requests: 2
On repeat views, we save a ton of time thanks to the 15 requests we are now serving from cache. This test was also done on a DSL connection using Chrome; where you really see a speed win is over 3G.
If you check out the full WebPagetest, you may notice we are still getting a “B” on compressing transfers. This is because I am not including images in my gzip config. This is a personal preference of mine, based on Yahoo’s Best Practices for Speeding Up Your Website:
Image and PDF files should not be gzipped because they are already compressed. Trying to gzip them not only wastes CPU but can potentially increase file sizes.
If you have any questions on this stuff, add them in the comments or tweet me!
Next up: Busting Cache with Grunt.