Learning Lessons from Presidential Candidate Website Performance
March 30, 2016

Clay Smith
New Relic


As the political season continues to heat up, it is the perfect time to check out some of the most-watched domain names in the US — the websites of the major Democratic and Republican candidates for President of the United States. In the wake of the South Carolina Republican primary on February 20, we set up an automated Web browser to visit each campaign site every 30 minutes and to track the splash screens soliciting campaign contributions, which urge visitors to join the team, family, movement, or revolution.
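For readers who want to reproduce this kind of check, here is a minimal sketch using the selenium-webdriver package for Node.js. The placeholder URL is hypothetical and the script is illustrative rather than the exact monitor we ran; only the 30-minute interval matches the setup described above.

```javascript
// Minimal sketch: load a page in Chrome every 30 minutes and record
// how long the full page load took (loadEventEnd - navigationStart).
// Requires chromedriver on the PATH.
const { Builder } = require('selenium-webdriver');

const SITE = 'https://example-campaign.com'; // hypothetical placeholder

async function checkOnce(url) {
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.get(url); // resolves after the page's load event fires
    const loadMs = await driver.executeScript(
      'var t = window.performance.timing;' +
      'return t.loadEventEnd - t.navigationStart;'
    );
    console.log(new Date().toISOString(), url, loadMs + ' ms');
  } finally {
    await driver.quit();
  }
}

checkOnce(SITE);
setInterval(() => checkOnce(SITE), 30 * 60 * 1000); // every 30 minutes
```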


Political ideology does not seem to have any connection to overall page load time. Average page load time, measured as the time it takes a Google Chrome browser to completely load the landing page of each major campaign site, varies widely.


Not surprisingly, there’s evidence that the total size of the Web pages — the sum of all of the responses for images, fonts, HTML, CSS, and JavaScript — affects overall performance. In general, all the campaign sites were image-heavy. All those smiling-supporter photos come at a measurable cost.
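Page weight can be measured from inside the browser itself. The sketch below, which assumes a browser with Resource Timing Level 2 support (for transferSize), sums the bytes of every response the page fetched; it can be pasted into the DevTools console or run through Selenium's executeScript. Note that cross-origin resources served without a Timing-Allow-Origin header report a size of zero.

```javascript
// Sum the on-the-wire size of every resource the page loaded.
var resources = window.performance.getEntriesByType('resource');
var totalBytes = resources.reduce(function (sum, r) {
  // Fall back to encodedBodySize where transferSize is unavailable.
  return sum + (r.transferSize || r.encodedBodySize || 0);
}, 0);
console.log(resources.length + ' requests, ' +
            (totalBytes / 1024).toFixed(1) + ' KB total');
```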


Looking at Ted Cruz’s site, for example, segmenting requests by content type shows that average load time appears most affected by the volume of JavaScript, CSS, and images served from a single host.
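The same Resource Timing data can be segmented roughly this way, grouped by content type and host. A sketch, using initiatorType as a stand-in for content type:

```javascript
// Group resource sizes by initiator type (img, script, css, ...) and host
// to see which buckets dominate page weight.
var byTypeAndHost = {};
window.performance.getEntriesByType('resource').forEach(function (r) {
  var key = r.initiatorType + ' @ ' + new URL(r.name).hostname;
  byTypeAndHost[key] = (byTypeAndHost[key] || 0) +
    (r.transferSize || r.encodedBodySize || 0);
});
console.table(byTypeAndHost); // e.g. "script @ cdn.example.com: 412300"
```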


With page weight coming under increasing scrutiny, reducing response sizes and the number of requests is critical to improving overall load time.

Data suggests that the Clinton, Kasich and Trump campaigns are making less frequent changes to their sites than the Cruz and Sanders campaigns. Fewer changes can also reduce overall response time, in part because unchanged assets remain cacheable in visitors’ browsers.
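Change frequency can be estimated without any inside knowledge: periodically fetch each landing page and hash the HTML, and a new hash suggests a new deploy or content change. A minimal Node.js sketch follows, with a hypothetical placeholder URL; in practice, dynamic markup such as cache-busting tokens would need to be stripped before hashing to avoid false positives.

```javascript
const https = require('https');
const crypto = require('crypto');

let lastHash = null;

function checkForChange(url) {
  https.get(url, (res) => {
    const hash = crypto.createHash('sha256');
    res.on('data', (chunk) => hash.update(chunk));
    res.on('end', () => {
      const digest = hash.digest('hex');
      if (lastHash && digest !== lastHash) {
        console.log(new Date().toISOString(), 'page changed:', url);
      }
      lastHash = digest;
    });
  }).on('error', console.error);
}

// Poll on the same 30-minute cadence as the load-time checks.
setInterval(() => checkForChange('https://example-campaign.com'),
            30 * 60 * 1000);
```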

According to NPR, some $4.4 billion is expected to be spent on television advertising alone in this election cycle, and the Web is a crucial part of campaign financing. Even a few hundred milliseconds of added page load time can translate into abandoned pages and lost donations. Given the vast amounts of money being raised through political websites, it is surprising that site performance is as varied as the political positions of the candidates themselves.

For any website, including the ones for the next president of the United States, here are some best practices informed by the performance data that we’ve collected:

■ Understand how your Web pages are working in the real world. Data from simple Selenium scripts reveals a large amount of actionable information to improve the experience of visitors (and donors).

■ When in doubt, reduce Web page bloat. HTTP/2 helps with parallel requests and multiplexing over the same connection, but massive Web pages are slow no matter what.

■ Synthetic testing is a part of a much bigger idea: the power of having visibility into how an entire software stack is actually performing. Truly understanding all the reasons behind slow load times requires end-to-end visibility from the frontend to the backend.

Regardless of whether your websites are hosted in the cloud, at home, or in a state-of-the-art data center, professionals of all political persuasions should be using monitoring tools to build better applications and user experiences.

Methodology: Candidate website performance data was generated using the New Relic Synthetics API, a simple Node.js script and the open-source requests library.
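As an illustration of the collection side, a script along these lines can list the configured monitors over the Synthetics REST API. The endpoint path and X-Api-Key header below reflect the v3 API as we understand it; treat them as assumptions and consult the current New Relic documentation before relying on them.

```javascript
const https = require('https');

// Assumed v3 Synthetics REST API endpoint and auth header; verify against
// New Relic's current documentation before use.
const options = {
  hostname: 'synthetics.newrelic.com',
  path: '/synthetics/api/v3/monitors',
  headers: { 'X-Api-Key': process.env.NEW_RELIC_ADMIN_API_KEY },
};

https.get(options, (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => {
    JSON.parse(body).monitors.forEach((m) => {
      console.log(m.name, '-', m.frequency + ' min', '-', m.uri);
    });
  });
}).on('error', console.error);
```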

Disclosure: This blog post does not represent the political views of New Relic and should not be taken as an endorsement of any candidate.

Clay Smith is a Developer Advocate at New Relic.
