Most websites that rank in Google's top 20 search results do not currently pass the minimum requirements for a good page experience set out in the search engine's new Core Web Vitals update, which officially rolls out from mid-June, according to new research from Searchmetrics.
96% of sites tested in US desktop searches, and more than 90% of those in mobile searches, fail to meet Google's three Core Web Vitals thresholds for good website performance and usability, and risk their rankings being negatively impacted from June.
Currently, only the top two or three ranking websites in search results achieve the required "good" score in most of the Core Web Vitals metrics, according to the analysis by Searchmetrics.
The study, which analyzed over 2 million web pages appearing in the top 20 Google results in the US, UK and Germany, reveals that there is already a positive relationship between pages that rank higher and those that perform well on Core Web Vitals metrics (suggesting Google already rewards sites that offer better usability). Once the update officially rolls out, the Core Web Vitals are likely to have even more influence on page rankings, according to Searchmetrics.
One notable exception to the general trend of the early findings was Google-owned YouTube, which currently ranks high in searches despite performing poorly on the Core Web Vitals related to loading speed and responsiveness. This may change when the update officially rolls out in June, but YouTube may be gaining an advantage from its strong brand recognition, which helps it overcome individual negative usability issues.
On the other hand, the online encyclopedia Wikipedia currently performs well on all the Core Web Vitals metrics, and it might be a good example of the type of user experience others should be aiming for. Its good scores stem from its lightweight approach to web design (mainly text and optimized images) and from the fact that it avoids heavy dynamic content such as ads, which can jump around on the page and create a negative experience.
Google is introducing Core Web Vitals to assess real-world web user experience in three areas:
■ Loading speed: how quickly the main content on a page loads
■ Responsiveness: the time taken to respond to a visitor's first interaction, such as clicking on a button or a link
■ Visual stability: does the layout or content jump around?
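As a rough illustration, a page's measured values for these three areas can be checked against the thresholds cited later in this article (LCP "good" within 2.5 seconds; CLS "good" below 0.1 and "needs improvement" below 0.25; TBT "good" within 300 milliseconds). This is a minimal sketch, and the function names and rating labels are illustrative, not part of any Google API:

```python
# Illustrative sketch: classify measured Core Web Vitals values against the
# thresholds cited in this article. Function names are hypothetical.

def rate_lcp(seconds: float) -> str:
    # Largest Contentful Paint: "good" within the first 2.5 seconds
    return "good" if seconds <= 2.5 else "needs work"

def rate_cls(score: float) -> str:
    # Cumulative Layout Shift: below 0.1 is "good", below 0.25 "needs improvement"
    if score < 0.1:
        return "good"
    if score < 0.25:
        return "needs improvement"
    return "poor"

def rate_tbt(milliseconds: float) -> str:
    # Total Blocking Time, the proxy for responsiveness: "good" within 300 ms
    return "good" if milliseconds <= 300 else "needs work"

# The study's averages for the top 20 US results, per the findings below:
print(rate_lcp(3.0))   # average LCP of 3 seconds
print(rate_cls(0.38))  # average CLS of 0.38
```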
These signals will be included in Google's search algorithm, with the search engine aiming to deliver a ranking boost to web pages that deliver a good experience.
"The Google Core Web Vitals update is in many ways a response to websites not really living up to user expectations. It's a clear message to website owners that not putting users first may have a negative effect on rankings," said Marcus Tober, Founder and Chief Evangelist at Searchmetrics. "Our initial findings suggest that there's a lot of work to do for most websites to get their usability up to par. And, of course, ecommerce and other enterprises need to be aware that a good user experience will not just influence their Google rankings but have a positive business impact, it can help to drive conversions and encourage visitors to stay onsite longer, engage and keep returning."
According to Searchmetrics, among the reasons behind the poor user experience highlighted by the study is the rise of "code bloat": unnecessary code on web pages built using templates included within website builders such as WordPress and Wix, as well as additional code in web plugins, all of which slows pages down and creates optimization challenges.
Another issue is dynamic content, such as ads and newsletter opt-in boxes, which can cause the layout of pages to shift if it is not implemented properly.
Five key findings based on the US findings of the research include:
Only sites ranked 1-3 deliver a good user experience for loading important page content quickly
Largest Contentful Paint measures the time it takes for the largest image or block of text to become visible after a user lands on a page. For a good user experience, Google suggests this should happen within the first 2.5 seconds. But of the top 20 ranking websites in search results, only the first three positions are below this threshold. The average time for pages listed in the top 20 positions is 3 seconds (21.3% slower than Google's benchmark).
Most sites can't pass the test for controlling shifting elements on their pages
Cumulative Layout Shift tracks how much the elements on a page jump around or shift, creating a negative user experience, and the analysis indicates most sites perform poorly on this. Google rates a score below 0.1 as "good" and a score below 0.25 as "needs improvement"; everything above that is "poor". The main causes of layout shifts include dynamic content and media pop-outs such as "subscribe now" boxes or ads. According to the data, only the position zero results (featured snippets), which Google places above the traditional organic results to provide quick answers to factual queries, achieve a "good" score. Position one is close, but all other search results fall into the "poor" bracket (above 0.25). The average score of the top 20 results is 0.38 (275.6% worse than Google's required "good" rating).
Majority of sites fall short of Google's benchmark for good responsiveness
First Input Delay measures the time it takes for a page to respond to a visitor's interaction, such as someone clicking on a button or a link. Because this can only be measured if a user actually interacts with a page, Google suggests Total Blocking Time (TBT) as a good proxy measurement. TBT assesses the total time taken up by tasks that block the user from interacting with a page (such as loading images, videos and scripts).
The research suggests that the top 5 ranking results have an average Total Blocking Time of 554 milliseconds, 84.6% slower than Google's "good" benchmark of 300 milliseconds (the average for the top 20 search results is 136.7% slower than the benchmark). Only sites that appear in the top 2 rankings are consistently below Google's "good" performance threshold, implying that there are a lot of asset-heavy websites with long-running tasks that delay user responsiveness.
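The percentage figures quoted above follow from a simple relative comparison against the benchmark; a quick sketch of the arithmetic:

```python
def pct_over_benchmark(measured: float, benchmark: float) -> float:
    """How much slower, in percent, a measured value is than a benchmark."""
    return (measured - benchmark) / benchmark * 100

# Top-5 average TBT of 554 ms against the 300 ms "good" benchmark:
print(round(pct_over_benchmark(554, 300), 1))  # ~84.7, the roughly 84.6% in the text

# The top-20 average TBT implied by "136.7% slower" than 300 ms:
print(round(300 * (1 + 136.7 / 100)))  # about 710 ms
```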
YouTube ranks high despite currently showing poor Core Web Vitals
YouTube was found to currently have poor Core Web Vitals scores for loading speed (LCP) and responsiveness (FID). If other websites performed as badly, they would be judged to be offering a low-quality user experience. But YouTube ranks high despite this, most likely because the platform's overwhelming popularity helps it deliver positive user signals. Most other sites could not afford to score so low on individual user experience metrics because they do not have the luxury of YouTube's extreme brand recognition, according to Searchmetrics.
Wikipedia could be the "Poster Boy" of the new Core Web Vitals update
Methodology: Searchmetrics crawled over 2 million URLs and performed a correlation analysis across the top 20 organic Google search positions in three countries: the USA, UK and Germany. The keywords were filtered based on relevance, and the regional keyword sets were kept as distinct as possible. Core Web Vitals and other performance metrics for the search results were measured by accessing the PageSpeed and Lighthouse APIs. Once the data had been gathered, Searchmetrics calculated a correlation coefficient for each performance metric; this value gives an impression of whether a good score in the selected metric is associated with good page rankings. Simple averages for the top 20 and top 5 positions were calculated per metric, as well as how close these values are to Google's benchmarks.
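The correlation step can be pictured with a small sketch. The data below is invented for illustration and is not from the study; with rank position 1 as the best, a strong positive coefficient between position number and LCP would mean that slower pages tend to sit lower in the results:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: Google rank positions 1-10 and each page's LCP in seconds
positions = list(range(1, 11))
lcp_seconds = [2.1, 2.3, 2.4, 2.9, 3.0, 3.2, 3.1, 3.4, 3.6, 3.5]

r = pearson(positions, lcp_seconds)
print(round(r, 2))  # 0.96: in this invented data, lower positions pair with slower LCP
```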