Research: Site Speed Is Hurting Everyone's Revenue

Ian Lurie

Update! Learn the ins and outs of a faster site in our Ultimate Guide to Site Speed

Site speed, site speed, site speed. Everyone around me is sick of hearing me rant about it, probably because I’ve pushed it on every client Portent’s had since, oh, 2008.

Well, tough poo, ’cause I’m doing it again. Portent’s analytics genius, Michael Weigand, and his faithful right-hand man, Timothy Gillman, did some math based on e-commerce data from 16 sites. I then tested 500 e-commerce sites using YSlow for basic performance data.

The 16-site study included conversion data and is what we used to show e-commerce impact. The 500-site study was performed to show the most common problems.

What We Found

Here’s the breakdown of our results, plus how you can capitalize:

Still slow after all these years

In spite of a small riot of experts who agree that faster is better, most sites are still slow. In our study, 50% of sites averaged load times of 5+ seconds. That’s with a standard deviation of 8 seconds (yikes), so we trimmed outliers and still got 5 seconds, with a deviation of 0.5s.

At least get from 8 seconds to 5

The biggest and easiest value/page view jump is from an 8-second to a 5-second load time. While the biggest revenue jump is between 2 seconds and 1, going from 8 to 5 seconds is easier and generates an 18% value/page view increase.

Page load time: Go from 8 to 5

But every second you shave off your site’s average page load time (without shedding page views) means an 8% improvement in page value, so keep going.
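As a back-of-the-envelope sketch of that claim (assuming the 8% lift applies per second saved; the function name and the compounding assumption are mine, not from the study):

```python
def projected_page_value(current_value, seconds_saved, lift_per_second=0.08):
    """Project value/page view after shaving load time.

    Assumes the study's ~8% lift per second of load time saved
    compounds; the article doesn't say whether the relationship
    is linear or compounding.
    """
    return current_value * (1 + lift_per_second) ** seconds_saved

# Shaving 1 second off a page worth $1.00 per view:
print(round(projected_page_value(1.00, 1), 2))  # prints 1.08
```

Even at a modest value per page view, that lift adds up quickly across millions of views.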

Some pages matter more than others

Hardly news, but faster checkout, login, and home pages matter most. After those, load speed on product category pages has the biggest impact on sales. All of these pages carry high-consumer-intent traffic. Make them fast.

Page weight is not the only load time factor

Page ‘weight’ (that’s the total kilobytes transferred, images and all) is no longer the biggest factor in site load time. In our test, pages of over 4 megabytes recorded some of the fastest load times.

The reason: many sites have streamlined their code, learned to ‘minify’ it, and adopted GZIP compression. So the bigger factor, in many cases, is the server and page configuration.
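To see why GZIP takes so much sting out of page weight, here’s a minimal sketch using Python’s standard library (real servers compress on the fly via modules like mod_deflate; the sample markup is a made-up stand-in for a typical repetitive HTML page):

```python
import gzip

# A crude stand-in for typical HTML: markup is highly repetitive,
# so it compresses extremely well.
page = (b"<div class='product'><h2>Item</h2><p>Description text</p></div>\n"
        * 500)

compressed = gzip.compress(page)
ratio = len(compressed) / len(page)

print(f"original: {len(page)} bytes")
print(f"gzipped:  {len(compressed)} bytes ({ratio:.1%} of original)")
```

The exact ratio depends on the page, but HTML, CSS, and JavaScript routinely shrink to a small fraction of their uncompressed size.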

If you want to speed up your site, look at:

JavaScript timing

Where possible, put external JavaScript includes (‘.js’ files referenced via a src attribute) at the very end of your page, defer them, or load them asynchronously. For example, if you use Google Analytics or jQuery, you’re including these scripts in your page from external files. By default these are ‘blocking’ calls, and your page may not appear to the user until the scripts fully load. If something goes wrong, or a script just takes a few seconds, the perceived delay can kill conversion rates.

Use deferred execution if you need to fire the JavaScript after the entire page loads.

Use asynchronous execution if you don’t care when the script fires.
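To make the three options concrete, here’s a minimal sketch (the file paths are placeholders, not real URLs):

```html
<!-- Option 1: put the include at the very end, just before </body>,
     so the page renders before the script downloads -->
<script src="/js/analytics.js"></script>

<!-- Option 2: defer — downloads in parallel, executes after the
     document is fully parsed, in document order -->
<script defer src="/js/widgets.js"></script>

<!-- Option 3: async — downloads in parallel, executes as soon as it
     arrives, in no guaranteed order -->
<script async src="/js/tracking.js"></script>
```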

For a great overview of deferred and asynchronous loading, see Peter Beverloo’s explanation.

ETags and Expires headers

The top page speed issues we found involved ETags and Expires headers. These may sound scary, but Expires headers in particular are easy to set. Your IT person or web host will know what to do.

These two settings reduce the number of requests a browser makes to your server. They tell visiting web browsers which files need updating. Set correctly, they prevent browsers from re-downloading files that rarely change, such as your logo.

Browsers will cache files with far-future Expires headers and keep them cached, so the apparent load time is far shorter. And they’ll use ETags to cheaply check whether a file has changed before re-downloading it.
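For example, on an Apache server with mod_expires enabled (a sketch your IT person would adapt; the specific lifetimes are illustrative, and other servers have equivalents), far-future Expires headers look like this:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Rarely-changing assets: let browsers cache them for a year
  ExpiresByType image/png  "access plus 1 year"
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/gif  "access plus 1 year"
  # CSS and JavaScript change more often: a month is safer
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```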

But images are still important

Page weight may not matter as much. But image size is still a major drag on load times. Compress them, for the love of all that’s good and right in the universe. OK? Here’s one way to do it:

Run Google PageSpeed for Chrome on your site. If ‘Optimize images’ has an “M” or an “H”, view the Optimize Images report:

Compress Images = Happiness


If the original content is a ‘png’ image, you can replace it in one step. Click ‘See Optimized Content.’

Newly compressed image: Easy-peasy


Download the compressed images and then upload them to your server. You’ll overwrite the old, overweight image files with the new, svelte ones.

For a primer on image types and compression, see RJ’s excellent article.

Then What?

After that, things get a little more complicated. But not much. Any competent web developer or IT person can get you set up with GZIP compression, for example, or troubleshoot slow database load times. If you want to go to plaid, you can get fancier.

Research Method

For value/page view data, the team looked at just over 94 million page views across 16 e-commerce sites. The critical statistics:

  • We took a 90-day snapshot
  • Sites ranged from 61,000 to 38 million page views over the 90-day period
  • 3 of the sites were B2B e-commerce. The rest were B2C
  • The sites ranged from major national fashion brands to small, niche manufacturers

For site performance data, we ran YSlow and Google PageSpeed (using the API) on 500 sites. We tested the home page and one category page on each site. Then, we did a correlation analysis, first removing outliers.

Our data sampling is strongest between 8 and 5

Potential Problems

Biggest, scariest issue with our data: there’s a weird drop in page value between 3 and 2 seconds. It’s about 9 cents. We’re looking into that, and we’ll try to get a bigger sample to smooth the outliers.

And then there’s the fact that Ian analyzed the data. I’m very aware of the Texas Sharpshooter Fallacy, and I went into this with some, uh, opinions, plus a statistics background that starts with a History degree and ends in law school. Keep that in mind. However, we had the best sampling in the time increments where we’re staking our claims.

That could be good, or bad. I leave it to you to judge.

Easy Wins, Big Advantage

Compared to other digital marketing challenges, page speed is easy. And it has measurable results. But very few companies do the simple things that make a site fast. Bad news for them. Good news for you — you can gain a big competitive advantage if you just do the basics.

Ian Lurie
CEO & Founder

Ian Lurie is CEO and founder of Portent and the EVP of Marketing Services at Clearlink. He's been a digital marketer since the days of AOL and Compuserve (25 years, if you're counting). He's recorded training for, writes regularly for the Portent Blog and has been published on AllThingsD, Smashing Magazine, and TechCrunch. Ian speaks at conferences around the world, including SearchLove, MozCon, Seattle Interactive Conference and ad:Tech. He has published several books about business and marketing: One Trick Ponies Get Shot, available on Kindle, The Web Marketing All-In-One Desk Reference for Dummies, and Conversation Marketing. Follow him on Twitter at portentint, and on LinkedIn at


See how Portent can help you own your piece of the web.



  1. Ian,
    Amen! I changed hosts, spend a bit more each month, and had my developer speed up my site. Night and day difference, both in the experience visiting my blog and in the traffic spike I’ve experienced since. People dig quick sites and rarely wait around for slow ones. Most of us are this way but are blind when it comes to our own blogs. I was for years 😉

  2. Hello Ian,
    First, a technical tip: I see “Your comment is awaiting moderation.” for Ryan’s comment above which is, to say the least, weird.
    Now – to the study.
    I am confused: you say you analyzed data from 16 e-commerce sites but then examined 500 sites with Yslow. Then you present page value data. I don’t quite get how you measured page value for those 500 sites based on the 16 sites. Or maybe I’m not getting the setup at all?
    Another question – what is the statistical significance and the power for the difference between the <=1 second bucket and the 1-2 second bucket? You have not published any sample sizes so I can't calculate those.
    Also – have you checked for confounding variables between the buckets? Like maybe the niche of the site plays a significant role, or some other unaccounted-for but important segmentation line?

    1. Hi Georgi-
      The 500-site study was a separate review. The 16-site study included conversion data and is what we used to show e-commerce impact. The 500-site study was performed to show the most common problems. I’m going to add that to the post.
      As far as niche and other potential confounding variables: there’s nothing obvious, but it is why I mention wanting to do this study again with many more sites. That requires access to data from many companies, though, and it’s unlikely most will cooperate.

  3. Thanks Ian! I don’t blame you for pushing site speed so much. We do the same thing. If nothing else, it is the one thing that we work on during our time with a client. Keep up all the solid work my friend!

  4. Hi Ian,
    great post.
    I am 100% with you on this and spend, in my peers’ opinion, way too much time focusing on hosting reliability and speed. I am the only person I know who really, truly loves DNS.
    I also think that people in search should care a lot more about their clients’ web speed, and in my opinion the best way to monitor that is using RUM (real user measurement), which you can actually use on very low-traffic sites for free via Pingdom.
    That way you can fix issues you’d miss by simply using a tool that collects speed from one data center (obviously there is huge value in those tools as well, especially in the beginning). I do not understand why everyone in search does not move their customers to a specialized hosting environment for their site, along with using RUM data to make sure the client’s customers are really happy, no matter where they are.
    We do not allow clients to stay on GoDaddy, and we pay for the correct hosting for their site ourselves because it is so important. We also think it is important for SEOs to be able to work with the hosting provider, with the client maintaining ownership of their account if they wish, and to always give the client a daily backup of their website regardless.
    It has gotten to the point where we now have a complete setup that is redundant and really fast, after trying literally over 60 hosting providers. I would be happy to share all my testing data with you if you like.
    I have narrowed it down to a handful of quality hosts that, depending on my clients’ needs, I will pay for myself if the client has an issue with the bill.
    I have also found that the EdgeCast ADN (not a typo: application delivery network) is an incredible tool.
    By adding Google PageSpeed to every site via the ADN, then using DynECT DNS mixed with EdgeCast’s new route DNS, we can actually run Nginx + Varnish on a site that still benefits from Google PageSpeed. As I am sure you know, most people have to choose between Nginx and Apache in order to use Google PageSpeed.
    I am really happy to see companies like DigitalOcean sprouting up, because it gives people who do not have the budget the ability to run a much faster site than they normally could, for five dollars. However, while I endorse DigitalOcean for many uses, rarely, and I mean very rarely, do I use them for my customers.
    I think we have got to think about how the hosting and the coding affect the end-user; that is the name of the game.
    End-users hate downtime and love sites that are really fast on mobile and desktop.
    Simply by removing clients’ websites from hosting companies like GoDaddy, moving them onto our FireHost setup with Google PageSpeed and Varnish, and then adding the correct DNS (DynECT or EdgeCast DNS including geo-IP; DNS Made Easy is cheap and fast as well), we have brought many slow websites from 10+ seconds to an average RUM speed of under 2-3 seconds before making any code changes, which is just as important.
    I agree with you it is in no way cheap to do this, but it is worth it every time.
    I am really looking forward to more and more hosting companies becoming more aware and networks getting faster.
    Sorry for the very long message,

    1. No problem re: the long message. What a great comment!
      I agree: I think clients are also getting more savvy about this and seeking out faster providers. With virtualization, having your own dedicated box is easier.

    I would love to read an article on how these issues may or may not conflict with having a more templated website service, such as a WordPress site.
    I’m curious what challenges one might have trying to speed up a WordPress site, and the pros and cons of a WordPress site when it comes to page speed.

    1. Hi Norm,
      WordPress sites are just like any other. They can be fast or slow. I do find WordPress a bit easier to optimize, simply because there are plugins that’ll do it for me, like W3TotalCache. Our site runs on WordPress.
      Hope this helps,

  6. Hi Ian,
    I love the post. I too feel like I beat the same drum every day. I specialize in making Magento eCommerce sites faster by providing my customers with a very optimized hosting platform. I’ve written several case studies on site speed, and I actively seek data to increase site speed and performance on a daily basis. I guess you could call me a speed geek. I don’t understand why some site owners/developers feel the same way about their site speed and how it affects their bottom line, yet are hesitant about taking action to address their concern.
    An interesting observation with the case studies I’ve performed:
    1. Most developers / site owners realize that they can increase revenue by simply speeding up their page load times.
    2. They run a series of speed tests through Google Insights or Webpagespeedtest and gather data.
    3. They obtain detailed reports that show them how they can improve their page load times.
    4. They focus their attention very heavily on coding / design while sometimes overlooking the performance of their server.
    5. Those that understand their server plays a significant role in site speed start researching hosting platforms.
    6. But they are hesitant to make a change with their hosting platform because they believe it is a monumental task, or they just don’t know who to trust.
    I took all of these variables into consideration while searching for an easy solution to take away the pain, so I created a live Magento hosting comparison tool. I have given these concerned individuals the opportunity to compare their current Magento site to a live mirror copy of itself on an optimized platform for free, just to take the fear factor away and show them what can happen when they get a tailored hosting plan that complements their specific site.
    What I don’t understand is that people are still hesitant to try the free test, but those who do, get rewarded with the benefit of having a faster site. Ultimately, that means they make more money.

    1. Great points, Tom. I run into this all the time, with all changes: Hosting, database configuration, image compression, etc. But, that just means a bigger advantage for everyone else, I guess.

  7. Great article Ian!
    Harping on page processing speed is a common aspect of most of my audits as well.
    In addition to the areas you covered, there’s 3rd party domain processing.
    GoogleAdServices code embedded on a site to help track AdWords visit data is notorious – I’ve had clients deactivate that code alone to gain upwards of three seconds processing speed. Disqus is ridiculously flawed code. Social sharing live-count scripts, useless bell and whistle widgets, multiple ad-network conflicts…
    Another common culprit is First Byte Time – issues with a site communicating with its own server or its data server…
    So many ways to shave off precious seconds. And while I encourage clients to shoot for an ideal speed, I also let them know that (as you describe in the article), progress is often enough to make serious gains.
    They typically need to balance limited resources in the implementation phase so I don’t demand that they achieve the ideal speeds (at least in the initial phase after an audit)…

      All excellent points. JavaScript drives me nuts when you could just scoot it to the bottom of the code and reduce load time by quite a bit.

  8. Site speed is something that I definitely feel is overlooked by a lot of sites.
    One service that I feel isn’t mentioned enough for this is Cloudflare. Especially for smaller sites, it offers a free solution that, in addition to caching your resources on their servers for faster response times, will also auto-minify your CSS and ensure that your JavaScript files are loaded asynchronously, among a few other things.
    I managed to get my company’s site from 4 seconds to ~1.3 seconds using mostly just cloudflare, some image optimization and setting expires headers in my .htaccess file.

  9. Interesting article. I actually had the chance to spend some time playing around with Google’s mod_pagespeed over the weekend to automatically perform a lot of the same optimisations you covered in this article.
    I wasn’t entirely sure what to expect since the site was already well optimised and fairly lean but I ended up making massive gains in less than 10 mins of work. I ended up writing about how I did it in case you or anyone else is interested.

  10. Hi Ian,
    I too am a big fan of fast loading pages.
    To demonstrate: our website loads the home page in 466ms, and that is from the server with the caches empty. Of course, this depends upon network speed.
    I mention the time not to show off, but to show that it is easily possible, if (and this is really important) the whole idea of reducing load time is considered from the beginning of the process.
    Once the low-hanging fruit has been taken care of, such as the amount of data transferred, as identified in your article, it comes down to efficient use of server resources:
    * Server side caching of code
    * Database connection time
    * Database access time
    Etc. This is where the large systems fail: in my experience, the back end is designed for flexibility, but at the cost of performance.
    Like so much in life, there isn’t one simple answer, there are several answers to different bits of the puzzle.

  11. Wonderful article Ian!
    I must agree with Spencer about CloudFlare as well. It takes very little time to configure, and we gained about a 30% increase in revenue on our demo sites when I took a week and went over our server configuration, installed W3 Total Cache and mod_pagespeed, optimized images, etc.
    We sell WP themes, and I am just now running a Google Experiments A/B test of how page load speed affects sales. I will write a blog post about it on my blog when the test is finished.

  12. I wish I could say “tough poo” on the blog I contribute to, oh well…
    Excellent article. I found that the specs of the server hardware also have a very big influence on website speed. Find a more powerful solution and your website will speed up noticeably.

Comments are closed.
