How we made Portent.com really freaking fast
Please note: I am not a server expert. This is stuff I figured out by bumbling around, crashing our server and generally wreaking havoc. Be careful, unless you want to see steam coming out of your IT team’s ears.
I don’t want to shock anyone, but I’m a little obsessive. And competitive.
No, really, it’s true. I hide it well, I know.
I’ve always obsessed about page performance. Faster pages convert better. They can rank a little bit better. They reduce bounce rates, which can help with PPC quality score. Plus, it just gives folks a much better user experience. Which sounds all squishy and liberal, like me, but separates the decent sites from the oh-my-god-this-site-rocks sites.
So I’d set a goal for Portent.com’s home page: Make it load in 1 second or less. While we aren’t quite there, we’re close. Here’s the Pingdom report, using all of their check locations (including Europe):
Niiiiice. Even at its slowest, we’re under 2 seconds. That makes me smile. Here’s what my team did, with me micro-managing and driving them basically insane the entire way:
The basics: Images, compression, and code optimization
Site performance is like cycling performance: If you get the fancy stuff and shave your legs, but are in crappy shape, you’ll still ride like, well, me. If you put your site on a content distribution network and invest in speedy servers, but your images are bloated and your code sucks, your site will still drag. So, here’s what we did first:
- Compressed images. I nearly blew a gasket when I saw we had three-color images that were JPGs (gasp). But, I was calm. I pointed my team to their own image compression post. This reduced bytes transferred by 20%. I walked away, muttering.
- Specified image dimensions for all images. This makes perceived page load time a bit faster, since the browser doesn't have to work out image sizes and reflow the page as each one downloads.
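In practice, that's just the width and height attributes on every img tag. A quick illustration (the file name and sizes here are made up):

```html
<!-- With explicit dimensions, the browser can reserve space for the image
     before it downloads, so the page doesn't jump around while loading -->
<img src="/images/logo.png" alt="Portent logo" width="200" height="60">
```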
A secret: I used Google Page Speed to come up with all of the changes above. You should, too.
All of this got us down to a 3-4 second page load time.
Was I satisfied? Hell no.
Getting fancier: Expires headers
Getting from 6 seconds down to 3-4 was all we could squeeze out of the basics. Time to get fancier:
We set the ‘expires’ header for some files to a year from now. When you visit a web site, your browser stores or caches lots of files on your hard drive: Stylesheets, images, scripts, etc. That speeds page load time the next time you visit. The server sends those files to your browser with an expires header, which tells your browser when it should re-load them. Our server defaulted to 24 hours for most files and file types.
Not good. Silly, in fact. How often do we change our logo? Or our page background? Once every 1-3 years.
We reconfigured our server to deliver most ‘static’ files with an expires header 1 year in the future.
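On Apache, that's a few lines of mod_expires configuration. This is a minimal sketch, not our exact config — adjust the types and windows for your own site:

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # Static assets that almost never change: cache for a year
    ExpiresByType image/jpeg "access plus 1 year"
    ExpiresByType image/png  "access plus 1 year"
    ExpiresByType image/gif  "access plus 1 year"
    ExpiresByType text/css   "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
    # Everything else falls back to a shorter window
    ExpiresDefault "access plus 1 day"
</IfModule>
```

One caveat: once a browser caches a file for a year, the only way to push out a new version is to rename the file (logo-v2.png), so plan for that.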
That shaved another .5 seconds or so off our page loads. We think. We were making a lot of changes at once, so it was hard to tell for certain. But it definitely helped.
Turn on Keep-Alive

No, this isn’t about killing people who visit our site. It’s about how our web site connects to your browser.
In non-geek, the ‘Keep-Alive’ setting tells the server to hold the connection to your web browser open while you’re browsing our site. That reduces the number of times the server has to open new connections, which saves processor, memory and network overhead.
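In Apache, it's one directive plus two knobs worth tuning. A sketch, with values you should adjust for your own traffic:

```apache
# Hold connections open so one browser session reuses one connection
KeepAlive On
# How many requests a single connection may serve before closing
MaxKeepAliveRequests 100
# Seconds to wait for the next request before giving up on the connection
KeepAliveTimeout 5
```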
Time saved? .2 seconds. Your results may vary. When I first turned this on, I set the directive in the configuration file to “Conneption: Keep Alive.” Apache began laughing hysterically at my typing skills, and promptly crashed. Lesson: Computers are literal, but they do have a sense of humor. Check your configuration files carefully.
Dump annoying scripts
We use WordPress. Which is great, 90% of the time. But there are some annoying scripts, like wp-cron.php, that fire every time the server loads a page.
Firing wp-cron every time you load a page is a bit like giving every airline passenger the safety speech as they get on the plane, instead of giving it to everyone all at once. Inefficient. And dumb. We changed it.
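If you want to do the same, the standard WordPress fix is a one-line constant in wp-config.php:

```php
// In wp-config.php: stop WordPress from firing wp-cron.php on every page view
define('DISABLE_WP_CRON', true);
```

Then set up a system cron job that requests wp-cron.php every 15 minutes or so. Scheduled tasks still run — just not on every visitor's dime.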
Even better, Googlebot somehow found admin-ajax.php and had a grand time crawling it every 30 seconds. That gave our server a huge headache.
We added that to robots.txt. Google stopped crawling it. Life is good again.
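The rule itself is one line in robots.txt (this assumes the default WordPress path):

```
# Keep crawlers away from WordPress's AJAX endpoint
User-agent: *
Disallow: /wp-admin/admin-ajax.php
```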
Setting up a CDN
We were still seeing slowdowns because of images and other files that transferred sloowwwwlllly for us, and even more slowly for folks around the world. The answer: get our site onto a Content Distribution Network, or CDN.
A CDN spreads your files around the internet on special servers. When folks visit your site, they access those files from servers, locations and networks optimized to best deliver them. The result is twofold:
- Files arrive faster, because they’re cached and delivered by zippier servers that are (sometimes) closer to the user.
- Transfer size shrinks, because content delivered via the CDN is ‘cookieless’. There’s no extra information attached: Just the bits, ma’am.
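How the URLs get rewritten depends on your CDN and plugins, but conceptually it's just swapping hostnames on static assets. A hypothetical WordPress sketch — the cdn.portent.com hostname is illustrative, and most CDN plugins handle this for you:

```php
// Hypothetical: serve uploaded files from a CDN hostname instead of our own
add_filter('wp_get_attachment_url', function ($url) {
    return str_replace('www.portent.com', 'cdn.portent.com', $url);
});
```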
We tried a few networks. After I nearly hurled my desk to New Jersey in a fit of customer ‘support’ inspired rage, we switched to MaxCDN. They’ve been fantastic for us ever since.
Now our site was getting speedy: Average load time was a hair over 2 seconds. The team probably figured I was done, and they could go back to doing real work. Nooooooo, I had more.
Latency and APC
Our site uses PHP, and we consistently saw high latency. If the ideal server-to-browser conversation is like this:
BROWSER: Hi server.
SERVER: YO! What’s up?
BROWSER: Can I get the home page for Portent.com?
SERVER: Yep, no problemo, here it is.
Then high latency makes it seem more like this:
BROWSER: Hi server.
SERVER: zzzzz SNORT cough hey, sorry. What can I do for you?
BROWSER: Can I get the home page for Portent.com?
SERVER: Sure, just a sec, I have it around here somewhere…
SERVER: OK, here you go.
Lots of things cause latency. In our case, the biggest bottleneck was PHP. Every browser visit required the server to execute a bunch of PHP scripts. Which was silly, since most of those scripts did the exact same thing 9 times out of 10.
So, we set up something called Alternative PHP Cache, or APC. APC keeps the compiled version of PHP scripts in memory (and can cache application data, too), so the server doesn’t re-parse and re-compile the same code on every request.
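If you're running your own box, enabling APC is a php.ini change. A sketch with starting-point values, not our exact settings:

```ini
; Load and enable APC
extension=apc.so
apc.enabled=1
; Shared memory for the cache; size it to fit your codebase
; (older APC versions want a bare number of megabytes here)
apc.shm_size=64M
```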
I let a real developer handle APC setup, thereby avoiding the whole crash-the-server thing. It went far smoother. The result was like that rocket afterburner thingy on the back of the Batmobile.
Page load times fell to .8-1.5 seconds. I did a happy dance.
I’m on a quest, though, and we’re not there yet: I want 80% of the pages on Portent.com loading in 2 seconds or less. And we’re nowhere close:
That report is from Google Analytics, by the way. So, here’s what’s next on the list:
- Try optimizing our initial congestion window. By boosting it, we can increase the amount of data delivered to browsers when they first connect to our server.
- Find a better way to deliver web fonts. Web fonts are our biggest caching issue right now. Our CDN caused all sorts of problems with them. There are CDNs set up just to deliver web fonts, and we tried them, but they just slowed us down even more. We’re still pondering this one.
- Sweep through 1000+ blog posts and re-optimize images. Any volunteers?…
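On the congestion window item: on Linux, the usual way to raise the initial congestion window is a routing-table tweak. A sketch only — the gateway and interface below are placeholders, so check `ip route show` for your own defaults, and test carefully:

```
# Raise the initial congestion window to 10 segments on the default route
# (192.168.1.1 and eth0 are placeholders for your gateway and interface)
ip route change default via 192.168.1.1 dev eth0 initcwnd 10
```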
I hope this helps some folks speed up their sites. If you see ideas I missed, let me know, too.