Well, I took a quick look at my Page Speed results today, and it looks like Google is angry at me for my inability to leverage browser caching. Essentially, the browser reaches out to the server for every image on my site on every visit, even if the user has already loaded it. Seems like a ton of unnecessary work, eh?

Well, .htaccess to the rescue! I added the following to my .htaccess file, which sped up my site considerably (and also upped my Page Speed score). I opted for a long max-age on images, CSS, and JS for now, since I don’t really update the look of my site that often. When I do decide to, I’ll most likely rename my JS and CSS files and remove the old ones, so returning browsers are forced to fetch the fresh copies (there’s a sketch of a way to automate that further down).


# Set Cache-Control headers (requires mod_headers)

# Static assets: 480 weeks (290,304,000 seconds)
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
    Header set Cache-Control "max-age=290304000, public"
</FilesMatch>

# Feeds and plain text: 2 days
<FilesMatch "\.(xml|txt)$">
    Header set Cache-Control "max-age=172800, public, must-revalidate"
</FilesMatch>

# HTML: 2 hours
<FilesMatch "\.(html|htm)$">
    Header set Cache-Control "max-age=7200, must-revalidate"
</FilesMatch>
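
To double-check that the headers are actually going out, hitting an asset with curl should show the Cache-Control line in the response (the URL here is just a placeholder):

curl -I http://example.com/images/logo.png

(If requests start erroring out entirely after adding the snippet, mod_headers probably isn’t enabled on the server, since the Header directive needs it.)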

Anyone out there have any suggestions on ways to set cache control? This was my first attempt at it, and I’d love some feedback!
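
One related idea for the rename-on-update plan above: rather than renaming files by hand every time, mod_rewrite can fake the rename. This is just a sketch, assuming mod_rewrite is enabled; the style.1234.css naming scheme is a made-up convention of mine, not anything standard:

# Filename-based cache busting (assumes mod_rewrite is enabled):
# reference e.g. style.1234.css in your markup, serve the real style.css
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteRule ^(.+)\.(\d+)\.(js|css)$ $1.$3 [L]
</IfModule>

Bumping the number in the markup then forces a fresh download without touching the file on disk.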

Edits

While this is a decent way of setting things up out of the box, I’ve since switched to the W3 Total Cache plugin, which handles browser cache control along with many other speed improvements.