Pages So Fast Your Customers Will Freak Part 1: The Server

The average website's total size has doubled in the last 2 years:

"Up and to the right graph of pagesize

All this 'stuff' takes time to download, and users have been shown to dislike slow load times for webpages. Additionally, Google has started using page load times as a signal in its search rankings, meaning a slow site has more objective drawbacks now than ever before.

Luckily Google has provided the internet with their excellent PageSpeed Insights tool. It shows you all of the current networking and application best practices that can be used to shrink your site's download time.

Over a series of posts, I'll cover each category of recommendations made by the Big G, starting here with the server components that Google has visibility into. (This only partially covers application code, since Google has no idea how many bad N + 1 queries you're running through ActiveRecord.)

Part 1 – The Server

Avoid landing page redirects

Every request -> response cycle a browser makes is an order of magnitude slower than almost anything else it is responsible for. Every 3xx response from your server adds appreciable time before you even get to the business of delivering a real page.

This issue mostly arises when you bifurcate your traffic into mobile and desktop segments, and it is usually within your control. Google only measures your homepage, since it typically accounts for an outsized share of your traffic.

Your best-case scenario is a responsive website, so every user can consume the same URL. A close second is a single 302 redirect for mobile user agents.

An example of a 'too slow for Google' scenario would be a chain of two or three redirects, with each hop adding a full round trip before any content arrives.

Not ideal, but usually easy to fix. In this case, you'd put some logic into your first redirect so it skips straight to the end of the chain.
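That "skip to the end of the chain" logic can be sketched as a small Rack-style middleware. This is a minimal, illustrative sketch, not code from the original post: the class name, user-agent pattern, and target URL are all assumptions.

```ruby
# Sketch: collapse a multi-hop mobile redirect chain into a single 302.
# The user-agent pattern and target URL below are illustrative assumptions.
MOBILE_UA = /Mobile|Android|iPhone/

class MobileRedirect
  def initialize(app, target: "https://m.example.com/")
    @app = app
    @target = target
  end

  # Standard Rack interface: one redirect straight to the final
  # destination instead of bouncing through intermediate hops.
  def call(env)
    if env["HTTP_USER_AGENT"].to_s.match?(MOBILE_UA)
      [302, { "Location" => @target }, []]
    else
      @app.call(env)
    end
  end
end
```

Mobile visitors get exactly one 302; everyone else falls through to the app untouched.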

Enable compression

Every major server and every major browser supports gzip as a compression format.

Text files like CSS and HTML are ideal candidates for gzip's dictionary-based encoding due to their repetitive nature. You can generally expect about an 80% reduction in over-the-wire size, and decompression time is minimal.
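You can see the savings on repetitive markup with nothing but Ruby's standard library. This is a standalone sketch (the sample HTML is made up); in a real Rails app you'd typically enable compression at the web server, or with `Rack::Deflater`.

```ruby
require "zlib"

# Repetitive HTML compresses very well under gzip's sliding-window
# ("dictionary") scheme. Measure the over-the-wire savings directly.
html = %(<li class="item">Hello, world</li>\n) * 500

compressed = Zlib.gzip(html)

ratio = 1.0 - compressed.bytesize.to_f / html.bytesize
puts format("original: %d bytes, gzipped: %d bytes (%.0f%% smaller)",
            html.bytesize, compressed.bytesize, ratio * 100)
```

Real pages aren't 500 copies of one line, so expect closer to the ~80% figure above than the extreme ratio this toy input produces.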

Improve server response time

This is a black box to everyone but you. Google will trigger an alert if it takes more than 200 ms to get a response from the infinite void of your application code. "Application code" means everything that happens inside your app: receiving a request, routing it, database lookups (hint: often the slow culprit), HTML rendering, and sending the HTTP response back.

Google can't help you here beyond raising a red flag, but most frameworks have profiling tools available. For Rails work we usually end up using bullet and/or New Relic.

(Hint again: It's always the database.)
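If you just want to know which requests blow past the 200 ms threshold before reaching for a full profiler, a tiny timing wrapper will do. This is a hedged sketch in the Rack middleware style; the class name and log format are my own, not from any library.

```ruby
# Sketch: flag any response slower than Google's 200 ms threshold.
# Class name, threshold default, and log format are illustrative.
class ResponseTimer
  def initialize(app, threshold_ms: 200, logger: $stderr)
    @app = app
    @threshold_ms = threshold_ms
    @logger = logger
  end

  def call(env)
    start = Process.clock_gettime(Process::CLOCK_MONOTONIC)
    response = @app.call(env)
    elapsed_ms = (Process.clock_gettime(Process::CLOCK_MONOTONIC) - start) * 1000
    if elapsed_ms > @threshold_ms
      @logger.puts format("SLOW %s took %.1f ms", env["PATH_INFO"], elapsed_ms)
    end
    response
  end
end
```

Once you know *which* endpoints are slow, tools like bullet and New Relic tell you *why*.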

Leverage browser caching

A browser can keep around copies of assets it has previously downloaded if a server tells it to. There are some complex implementation details, but the basic premise is simple: the server sets response headers describing what the browser may reuse, and for how long.

Luckily, Rails and most other frameworks have excellent, granular cache control built in, letting you directly manipulate expiration dates, ETags, and the like.

Between Cache-Control headers ("Browser, keep this around") and ETags ("Server, do I really need a new copy?") your application can avoid wasted round trips for files that it already has.
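The ETag handshake is easy to demonstrate outside any framework. This is an illustrative sketch (the `respond` helper and its return shape are my own, not a Rails API); in Rails itself you'd reach for `fresh_when` or `stale?` instead.

```ruby
require "digest"

# Sketch of the ETag handshake: the server fingerprints the body; if the
# client's If-None-Match matches, answer 304 with no body instead of
# re-sending the file. The `respond` helper here is hypothetical.
def respond(body, if_none_match: nil)
  etag = %("#{Digest::MD5.hexdigest(body)}")
  if if_none_match == etag
    [304, { "ETag" => etag }, ""]   # client's cached copy is still good
  else
    [200, { "ETag" => etag, "Cache-Control" => "max-age=3600" }, body]
  end
end

# First request: full 200 with body. Second request presents the ETag
# back via If-None-Match and gets an empty 304 instead.
status, headers, = respond("<html>hello</html>")
status2, = respond("<html>hello</html>", if_none_match: headers["ETag"])
```

The 304 costs one small round trip; the 200 costs a round trip plus the whole body. Cache-Control can eliminate even that round trip for the max-age window.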


Before you even write a line of code you can take steps towards keeping your app snappy and your users happy. Try out PageSpeed Insights on your own site and see how you rank.

This is part 1 of 3 of the "Pages So Fast Your Customers Will Freak" series

Part 1: The Server
Part 2: The Assets
Part 3: The Rendering