According to Tom Anthony for Moz, Google takes data from Google Chrome users (who have agreed to it) to evaluate site speed through user experience.
In other words, managing websites for load speed from a Google perspective can now be tied in with site speed from a user perspective.
This may seem trivial – even irrelevant – if you’re not used to the industry, but for professionals like our SEO and UX teams at DMJ, that’s a pretty big announcement.
The way Google was always believed to track site speed was not the way you and I would experience a site's real-world load speed. The assumption was that Google based its site-speed measurements solely on its bot crawling websites (and it probably did!).
That’s obviously going to produce very different numbers than when a real user loads a page, which meant web developers, SEO specialists, website designers, server experts and so on needed to account for how Google’s bots viewed a website as well as how a site responds to actual user requests.
Tom says, “Essentially, certain Chrome users allow their browser to report back load time metrics to Google. The report currently has a public dataset for the top 1 million+ origins, though I imagine they have data for many more domains than are included in the public data set.”
Highlighting how far along this data already is in Google’s pipeline, the PageSpeed Insights tool already includes data from the Chrome User Experience Report.
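You can see this field data for yourself via the PageSpeed Insights API. The sketch below builds a request to the real v5 endpoint and pulls out the `loadingExperience` object, which is where the API surfaces Chrome User Experience Report metrics when Google has enough data for the origin; the helper names (`psi_url`, `fetch_field_data`) and defaults are illustrative, not an official client.

```python
import json
import urllib.parse
import urllib.request

# Real endpoint for the PageSpeed Insights API, version 5.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_url(page_url, strategy="mobile", api_key=None):
    """Hypothetical helper: build a PageSpeed Insights request URL.

    An API key is optional for light, ad-hoc use but recommended
    by Google for anything regular.
    """
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return PSI_ENDPOINT + "?" + urllib.parse.urlencode(params)

def fetch_field_data(page_url):
    """Fetch a report and return its Chrome UX Report (field) section.

    The "loadingExperience" object holds real-user metrics such as
    first contentful paint, when Google has collected enough of them.
    """
    with urllib.request.urlopen(psi_url(page_url)) as resp:
        report = json.load(resp)
    return report.get("loadingExperience", {})

if __name__ == "__main__":
    # Print the request URL rather than calling the API by default.
    print(psi_url("https://www.example.com"))
```

If the origin isn’t in the dataset, `loadingExperience` simply comes back empty, which is itself a useful signal about how much real-user data Google holds on a site.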
One of the most obvious implications of all this is, as mentioned above, that user-experience changes Googlebot cannot see can still affect Google’s data on your website’s speed. Tom puts it like this:
“Importantly, this means that there are changes you can make to your site that Googlebot is not capable of detecting, which are still detected by Google and used as a ranking signal. For example, we know that Googlebot does not support HTTP/2 crawling, but now we know that Google will be able to detect the speed improvements you would get from deploying HTTP/2 for your users.”
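Whether a server actually delivers HTTP/2 to users comes down to ALPN negotiation during the TLS handshake. As a quick way to check your own site, here is a minimal sketch using only Python’s standard library; the function name `negotiated_protocol` is made up for illustration, and the check assumes the site serves TLS on port 443.

```python
import socket
import ssl

def negotiated_protocol(host, port=443, timeout=5):
    """Return the ALPN protocol the server negotiates, or None.

    'h2' means the server will speak HTTP/2 to capable browsers;
    'http/1.1' means it fell back. None means the connection failed
    or the server negotiated no ALPN protocol at all.
    """
    ctx = ssl.create_default_context()
    # Offer HTTP/2 first, HTTP/1.1 as the fallback, like a browser would.
    ctx.set_alpn_protocols(["h2", "http/1.1"])
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.selected_alpn_protocol()
    except OSError:
        return None

if __name__ == "__main__":
    print(negotiated_protocol("www.google.com"))
```

Googlebot crawling over HTTP/1.1 will never observe the difference, but Chrome users on an `h2`-enabled site will, and per Tom’s point, their load times now feed back to Google.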