We Test the Presidential Candidates’ Websites: Can You Guess Which One Is Fastest?

By Clay Smith
Published on March 30, 2016

As the political season continues to heat up, we thought it was the perfect time to check out some of the most watched domain names in the United States—the websites of major Democratic and Republican candidates for President of the United States. In the wake of the South Carolina Republican primary on February 20, we used New Relic Synthetics to set up an automated Web browser to visit campaign sites every 30 minutes and to track the splash screens for campaign contributions, which urge visitors to join the team, family, movement, or revolution.

Although Synthetics can monitor from many different global locations, we used a single monitor near the U.S. capital to collect performance metrics. We wanted to understand how the candidate technology teams built their sites and especially how those sites perform in the real world.

Disclosure: This blog post does not represent the political views of New Relic and should not be taken as an endorsement of any candidate. It is strictly about New Relic Synthetics and New Relic Insights, and the response times these products uncovered in this monitoring test.

Candidate (Web) platforms: some faster than others

For supporters who hope their candidate’s site causes others to feel the (Web performance) burn, the results of the New Relic Synthetics monitors are clear: political ideology does not seem to have any connection to overall page load time.

We used New Relic Insights and a small NRQL query to quickly create dashboards from the Synthetics data. Average page load time, the time it takes for a Google Chrome browser to completely load each campaign site's landing page, varied widely.
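A query along these lines, for example, charts average load time per site. It's a sketch rather than the exact query we used: SyntheticCheck is the event type Synthetics reports to Insights, and its duration attribute (total check time in milliseconds) serves as a reasonable proxy for page load time:

    SELECT average(duration) FROM SyntheticCheck
    FACET monitorName SINCE 1 week ago TIMESERIES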

Not surprisingly, there’s evidence that the total size of the Web pages—the sum of all of the responses for images, fonts, HTML, CSS, and JavaScript—affects overall performance. In general, all the campaign sites were image-heavy. All those smiling-supporter photos come at a measurable cost.
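One rough way to quantify that from the same data is to total responseSize across every request. This sketch assumes the per-request SyntheticRequest event with its responseSize (bytes) and checkId attributes:

    SELECT sum(responseSize) / uniqueCount(checkId) / 1000000 AS 'Average page weight (MB)'
    FROM SyntheticRequest FACET monitorName SINCE 1 day ago

Dividing by the number of distinct checks turns a day's worth of traffic into an average weight per page load.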

It’s also possible to connect changes observed in synthetic monitors to current events: there was a significant change on Sunday, February 21, the day after the South Carolina Republican primary, and again during the evening of the Republican primary debate on Thursday, February 25.

Looking at Ted Cruz’s site, for example, segmenting requests by content type shows that the average load time seems to be driven mostly by an increase in JavaScript, CSS, and images served from a single host.
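A breakdown along these lines surfaces that pattern; the monitor name in the WHERE clause is a placeholder for whatever the Cruz monitor is named in your account:

    SELECT count(*), sum(responseSize) FROM SyntheticRequest
    WHERE monitorName = 'Ted Cruz' FACET contentType SINCE 1 week ago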

With increasing focus on page weight, reducing response size and the number of requests is critical to improving overall load-time performance.

Some campaigns iterate on their sites more than others

Observing HTTP responses over time revealed some surprising and not-so-surprising results. In the case of Jeb Bush’s site, it’s possible to see third-party payment providers being turned off the moment he ceased campaign operations.

Similarly, after the suspension of Marco Rubio’s campaign, average page load time spiked.

Synthetics data also suggest that some campaigns change their sites more often than others. The Donald Trump, Hillary Clinton, Ben Carson, and John Kasich sites didn’t vary much in size over time. In contrast, the Bernie Sanders and Ted Cruz sites changed frequently in ways that affected the overall response size. Not surprisingly, Marco Rubio’s website response size stabilized after he ceased campaign operations.

Drilling down, the Hillary Clinton site is especially interesting: the total Web page response size steadied several days before the South Carolina Democratic primary, after a period of more frequent activity.

Donald Trump’s website changed slightly more often than Hillary Clinton’s did, and significant changes on Wednesday evening, March 9, and on Sunday, March 20, increased overall page size.

Something all the candidates agree on: HTTPS is on by default

Encryption, privacy, and security have been at the center of several campaign debates. However, the campaigns have unanimously embraced strong encryption to secure their sites. Using a free tool to analyze the configuration of campaign Web servers on February 29, we learned that every candidate domain scanned received an “A”: the second-highest grade possible.

New Relic Synthetics captures the time requests spend waiting for an SSL connection to be established. Some sites make this connection slightly faster than do others.
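Charting the handshake directly makes the comparison easy. This sketch assumes durationSSL is the per-request SSL negotiation timing that Synthetics records from the browser's HAR data; filtering out zero values skips requests that reused an existing connection:

    SELECT average(durationSSL) FROM SyntheticRequest
    WHERE durationSSL > 0 FACET monitorName SINCE 1 week ago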

Because all the sites are delivered using Transport Layer Security (TLS), they can also take advantage of the new HTTP/2 protocol, which browsers support only over encrypted connections. The sites for Ted Cruz, Marco Rubio, Bernie Sanders, Donald Trump, and Jeb Bush use the new protocol for some requests, likely via a CDN provider.

No voter I.D. required for these election site monitors

In the name of transparency and open governance, we are sharing the code we used to generate this data via the New Relic Synthetics API. A simple Node.js script and the popular open-source request library were all that was required.

This script creates a single monitor in the AWS U.S. East Region for eight campaign sites identified by the candidates’ names. Errors or successes are logged—if the response status code is 201, the new monitor that visits the page every half-hour has been successfully created.
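The gist of the script looks like the following condensed sketch (not the verbatim code): the request library, the v3 Synthetics REST endpoint, and a SYNTHETICS_API_KEY environment variable are the moving parts, and the two sites shown stand in for the full list of eight:

    // create-monitors.js: condensed sketch of the monitor-creation script.
    // Assumes an admin API key in SYNTHETICS_API_KEY; URLs are illustrative.
    var request = require('request');

    var sites = {
      'Hillary Clinton': 'https://www.hillaryclinton.com/',
      'Bernie Sanders': 'https://berniesanders.com/'
      // ...one entry per campaign site, eight in all
    };

    Object.keys(sites).forEach(function (name) {
      request({
        method: 'POST',
        url: 'https://synthetics.newrelic.com/synthetics/api/v3/monitors',
        headers: { 'X-Api-Key': process.env.SYNTHETICS_API_KEY },
        json: {
          name: name,                    // monitor named after the candidate
          type: 'BROWSER',               // full Chrome page load, not a ping
          frequency: 30,                 // minutes between checks
          uri: sites[name],
          locations: ['AWS_US_EAST_1'],  // single location near the U.S. capital
          status: 'ENABLED'
        }
      }, function (err, response) {
        if (err) {
          console.error('Failed to create monitor for ' + name + ':', err);
        } else if (response.statusCode === 201) {
          console.log('Created monitor for ' + name);
        } else {
          console.error('Unexpected status for ' + name + ':', response.statusCode);
        }
      });
    });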

Final thoughts and a vote of confidence in building for the Web

According to NPR, some $4.4 billion is expected to be spent on television advertising alone in this election cycle, and the Web is a crucial part of campaign financing. Our experience monitoring hundreds of millions of application metrics has shown that milliseconds of page-load slowdown can translate directly into page abandonment and lost donations.

Given the vast amounts of money being raised through political websites, it is surprising that site performance is as varied as the political positions of the candidates themselves. For any website, including the ones for the next president of the United States, here are some best practices informed by the performance data that we’ve collected:

  • Understand how your Web pages are working in the real world. Data from simple Selenium scripts reveals a large amount of actionable information to improve the experience of visitors (and donors).
  • When in doubt, reduce Web page bloat. HTTP/2 helps with parallel requests and multiplexing over the same connection, but massive Web pages are slow no matter what.
  • Synthetic testing is a part of a much bigger idea: the power of having visibility into how an entire software stack is actually performing. Truly understanding all the reasons behind slow load times requires end-to-end visibility from the frontend to the backend.

Regardless of whether your websites are hosted in the cloud, at home, or in a state-of-the-art data center, professionals of all political persuasions should be using monitoring tools to build better applications and user experiences.

Thanks to New Relic Director of Engineering Rafael Ferreira and Senior Product Marketing Manager James Nguyen for their invaluable feedback and assistance with this post.

About the Author

Clay Smith is a Developer Advocate at New Relic in San Francisco. He previously worked as a senior software engineer at early-stage software companies, where he founded the mobile engineering team at PagerDuty and shipped one of the first iOS apps written in Swift.
