Ever since Google announced that every mobile web site had to achieve less-than-one-second loading times, I've been meaning to do a fun, pseudo-scientific study to help start the discussion of putting this well-intentioned goal into a perspective that everyday developers and businesses can understand. This topic has been heavily promoted, with most of the blog posts simply trumpeting agreement and either implying or coming right out and saying that any page that loads in over one second is pretty much worthless. In case you've never read the actual article, here's a quote from Google:
“we must deliver and render the above the fold (ATF) content in under one second, which allows the user to begin interacting with the page as soon as possible.”
I propose that most consumers will put up with a majority of sites that don't deliver in exactly one second as long as the following general criteria are met: the user finds what they want or need the majority of the time, the user interface is fairly easy to use and intuitive, there are limited or no overbearing advertisement banners (whoops!), and the page loads within a reasonable amount of time. I really haven't done enough research to know what a "reasonable amount of time" is, but I know from testing that very few sites today deliver true sub-second performance. So let's tear into this a bit.
First, don't get me wrong: I think sub-second performance is a very worthy goal, and I absolutely think every web developer worth his or her salt should strive as hard as they can to make their web page performance as fast as possible. The reality is that budgets can be limited, time frames for deliverables can be short, and not everyone is a website performance expert. However, I think we need to start asking some hard questions to make sense of the one-second rule and understand how it can be applied in our everyday development work rather than simply taking it at face value, for example:
- To which industries in particular does the rule apply? I suspect it's aimed mostly at the online retail industry; I think web site visitors cut other industries a reasonable amount of slack.
- Does the rule only apply to first time visitors?
- Do repeat visitors abide by other page speed rules? Repeat visitors can take advantage of cached browser pages to speed up their viewing experience.
- How do lousy mobile internet connections factor into this equation? For example, if someone knows they have a lousy internet connection most of the time, do they factor that into their online buying decisions?
- Does the rule apply to all countries or just the U.S.?
- What is the ideal internet connection speed that this rule is based on? It seems unlikely that a page would have to load under one second regardless of the connection speed.
- Does this apply only to self-hosted websites? What if your website is hosted on Amazon Webstore or Etsy? With those services you don't really have any control over how the web servers, DNS, cloud, or internet pipe are configured.
I then went about verifying who the largest online retailers in the U.S. are by sheer sales volume, and I came up with Amazon, Staples, Apple and Walmart as good candidates for the top four. However you verify this list, we can all agree that these four sites generate a massive amount of internet traffic, billions of dollars in revenue per year, and perhaps even a majority share of internet sales. Given that these stores are where tens of millions of people successfully shop every day, I wanted to use the seemingly indisputable shopping experience of these retailers as a basis for comparison.
It seems like a fair assumption that these retailers must be doing something right, and therefore whatever they are doing could be a potential guideline for others. I theorize that people's online shopping and surfing expectations are formed by the websites on which they shop the most. You tend to do in-store shopping at places where you are comfortable, and the same can be said for online shopping. Therefore, we need to understand these leading retailers' performance baselines to get some basic numbers that we can compare against our own websites' performance.
For my device, I used my middle-of-the-pack Nexus 4 on DSL WiFi to ensure the best possible consistent connection. Where I live, 4G speeds can fluctuate quite a bit during the day, so in order to normalize those issues out of the tests I simply went with WiFi:
Android Nexus 4, Android v4.4.2
Native Chrome browser
12 Mb/sec DSL/WiFi via G band (verified between 10 and 12 Mb/sec). Your own WiFi experience will vary significantly.
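As a rough sanity check on what that connection can physically deliver, the theoretical best-case transfer time for a page payload is just its size in bits divided by the link speed. Here's a small sketch of that arithmetic (the 819KB figure plugged in below is the Amazon first-load payload from the tests later in this article); it ignores latency, DNS lookups, and TCP slow start, so real loads will always be slower:

```python
def best_case_seconds(kilobytes, mbps):
    """Theoretical minimum transfer time: payload bits / link bits-per-second.
    Ignores latency, DNS, and TCP slow start, so real pages always take longer."""
    bits = kilobytes * 1024 * 8
    return bits / (mbps * 1_000_000)

# Amazon's 819KB first-load payload over the 12 Mb/sec DSL line:
print(round(best_case_seconds(819, 12), 2))  # → 0.56
```

In other words, even a perfect network would spend over half a second on that payload alone, before rendering starts.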
To measure performance, I used the latest desktop version of Chrome Canary and its new mobile inspection tools, hooked up to my phone via USB cable. This works really, really well, by the way.
Here is what I was looking for. Your test results will vary based on your device, other applications running, and internet connection speeds. I didn't test iTunes because, well, I don't use iTunes on my Android and it's not a website. Believe it or not, when I went to Apple.com on my Android I got a desktop website and not a mobile website.
I chose the following criteria to put context around the very first page load, since that's what Google seems to focus on the most. My goal was to load each page twice: the first time with an empty browser cache, and the second time with the website cached in the browser. Then I repeated the tests multiple times to help account for any anomalies.
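To give a flavor of that repeat-and-average protocol, here is a minimal Python sketch. The `fetch_ms` helper is a hypothetical stand-in for one timed load; the real numbers in this article came from Chrome DevTools, which also accounts for rendering, not just raw transfer:

```python
import time
from urllib.request import urlopen

def fetch_ms(url):
    """Time one full download of `url` in milliseconds.
    This measures raw transfer only, not rendering or 'usable' time."""
    start = time.perf_counter()
    with urlopen(url) as resp:
        resp.read()
    return (time.perf_counter() - start) * 1000

def averaged(samples):
    """Average several runs, discarding the single slowest as an anomaly."""
    if len(samples) > 2:
        samples = sorted(samples)[:-1]  # drop the worst outlier
    return sum(samples) / len(samples)
```

For example, `averaged([100.0, 110.0, 900.0])` throws out the 900ms fluke and reports 105ms.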
Here are the criteria I looked at:
- Page lag with the browser cache empty. This represents a first-time page visitor. By the way, I'm making a distinction between the technical time at which Chrome DevTools reports the page has loaded and when the various parts and pieces within the page finish spinning up, which may cause a slight delay before you can actually start navigating around. This is a very subjective number and it's really hard to eyeball accurately, but we've all experienced it: a web page can 'appear' to have loaded, yet when you go to scroll down, nothing happens for a short period of time. I fully acknowledge that some of my perceived delays are due to the lag of looking back and forth between a timer and the web page, which were right next to each other.
- Page lag with web page cached. I report both the technical page load time and the perceived page load time. This represents a repeat visitor.
- Total download time with cache. Lazily loaded content can also drag down page performance for repeat visitors.
- And last but not least, Google's PageSpeed Insights online tool gives a few guidelines for examining how well a website page stacks up against specific criteria. My only sticking point is that it's not 100% clear what criteria are being used. But I will point out that not a single top-four website got excellent ratings in the "Speed" category. In fact, if we were giving out grades, two of them were in the "C" category and the other two were in the "F" category.
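For the curious, those letter grades come from mapping the 0-100 Speed scores onto a standard US grading scale. The cutoffs below are my own convention, not anything Google publishes, but they reproduce the two C's and two F's:

```python
def letter_grade(score):
    """Map a 0-100 PageSpeed score onto a standard US letter-grade scale.
    These cutoffs are my own convention, not something Google defines."""
    for cutoff, grade in ((90, "A"), (80, "B"), (70, "C"), (60, "D")):
        if score >= cutoff:
            return grade
    return "F"

# The four retailers' mobile Speed scores from the tests below:
for site, score in (("Amazon", 71), ("Walmart", 77), ("Staples", 50), ("Apple", 58)):
    print(site, letter_grade(score))
```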
Amazon.com Tests (Averaged)
- Page lag no cache – 1.18 seconds reported, however based on my perception it looked more like between 2 and 3 seconds as the page finished visually loading.
- Page lag cached – 1.53 seconds according to Chrome DevTools. Strangely, the cached page tests seemed just a hair slower than the non-cached ones. I've noticed that browsers can sometimes be a bit slow when grabbing cached files. It would take more research to dig into how Amazon constructs its page and what cache settings are used.
- Total download time (no cache) 36.52 seconds, 819KB, 103 requests (Yes, that’s right…around 35 – 36 seconds for a full and complete page load)
- Total download time (cached) 4.59 seconds, 428KB, 78 requests
- PageSpeed Insights
- Speed – 71/100
- User Experience – 99/100
Walmart.com Tests (Averaged)
- Page lag no cache – 415ms reported, based on my perception it looked more like between 1 and 3 seconds as the page finished visually loading. There was a somewhat brief spinner icon that displayed as the page loaded. Fast!
- Page lag cached – 378ms actual; based on my perception it looked more like between 1 and 2 seconds. I was able to start scrolling immediately.
- Total download time (no cache) 10.32 seconds, 358KB, 41 requests
- Total download time (cached) 9.61 seconds, 47KB, 30 requests
- PageSpeed Insights
- Speed – 77/100 (This number surprised me, but again, we don't know how it was calculated)
- User Experience – 99/100
Staples.com Tests (Averaged)
- Page lag no cache – 5.52 seconds reported, and that approximately matched what I could see.
- Page lag cached – 4.45 actual and that also matched what I could see.
- Total download time (no cache) 8.25 seconds, 572KB, 44 requests
- Total download time (cached) 7.04 seconds, 25KB, 39 requests. Wow, 7 seconds for 25KB?
- PageSpeed Insights
- Speed – 50/100 (Yikes!)
- User Experience â 96/100
Apple.com Tests (Averaged)
Apple gets the worst grade of the group because when I surfed to www.apple.com I got a full-blown desktop website instead of a mobile-enabled website. PageSpeed Insights apparently agreed with me.
- Page lag no cache – 2.41 seconds reported, and my eyeballing it said between 2 and 4 seconds.
- Page lag cached – 1.69 seconds according to Chrome dev tools. My eyeballing it tended to look like around 2 seconds.
- Total download time (no cache) 3.66 seconds, 2.8MBs, 72 requests.
- Total download time (cached) 2.46 seconds, 905KB, 71 requests.
- PageSpeed Insights
- Speed – 58/100 (Yikes!)
- User Experience – 60/100 (Double Yikes!)
Since the vast majority of internet users buy products from these major retailers, I believe their overall perceptions of how a web site should perform are in great part established by their experiences while buying products online from them. None of the top sites were perfect, and there is always room for continued improvement.
Only one website out of the top four internet retailers delivered a technical page load speed that was under one second: Walmart.com. Amazon came really, really close. Staples had mediocre mobile performance. Apple didn't offer my Android phone a mobile-enabled website.
There is a difference between the time when the page is reported loaded by the developer tools, the time when all page components become completely visible, and a short time later, ranging from several hundred to several thousand milliseconds, when the page becomes fully usable. As a mobile web developer I can tell you it takes a bit of time for a mobile application to be 100% ready. Many (most?) of us have experienced the herky-jerky surfing experience as a web page bounces up and down while content is still loading in the background; iPads have this nasty habit if you aren't patient enough to wait until the page "appears" to have finished loading. Because of this, defining technically at what point a page becomes fully usable can be fairly subjective. This is especially true because some retailers treat tablets like desktop machines and deliver a full-blown version of the website. Testing for when a page becomes visible and usable is very dependent on the phone's capabilities, any other applications that might be using the phone's hardware and bandwidth, the internet connection at that point in time, and the user's perceptions!
Repeat page visits almost always load faster. Web developers already know this, but it's important to keep in mind when making sense of page performance discussions: first-time visitors will get a different experience than repeat visitors who come back frequently. All sorts of magic can be done to control and tweak page caching.
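Much of that magic boils down to the caching headers a server sends, which tell the browser whether a repeat visitor can skip the download entirely. As a simple illustration (a rough sketch, not a full implementation of the HTTP caching rules), here's one way to classify a response's cacheability from its headers:

```python
def cache_summary(headers):
    """Roughly classify how cacheable a response is from its headers.
    Simplified: real HTTP caching has many more rules than this."""
    cc = headers.get("Cache-Control", "")
    if "no-store" in cc:
        return "never cached"
    if "max-age" in cc:
        # e.g. "public, max-age=31536000" -> 31536000
        seconds = int(cc.split("max-age=")[1].split(",")[0])
        return f"cached for {seconds} seconds"
    return "no explicit policy (browser uses heuristics)"

# A far-future max-age like this is typical for static assets:
print(cache_summary({"Cache-Control": "public, max-age=31536000"}))
# → cached for 31536000 seconds
```

A retailer serving its images and scripts with a long `max-age` is a big part of why the cached runs above downloaded so many fewer kilobytes.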
Mobile Analysis in PageSpeed Insights
Mobile Path to Purchase: Five Key Findings (interesting info on how people use mobile for retail)
Amazonâs sales versus others (WSJ)
Top 5 largest online retailers (netonomy.net)