Users Lose Trust in Brands When Websites Take Too Long to Load
September 11, 2015

Pete Goldin


The three main reasons to distrust an online experience are inaccurate content (91 percent), website downtime (88 percent), and overly simple identity and authentication procedures (75 percent), according to a new report by Neustar in collaboration with the Ponemon Institute. The report affirms a truth long predating the Internet: trust is earned as a direct result of experience.

Overwhelmingly, respondents were dissatisfied with digital storefronts they found to be flawed in their marketing content, overly simple in authentication, or altogether unavailable. These all reflect the converging role that a brand’s Marketing, IT and Security groups play in delivering a seamless customer experience.

Additional report highlights by group include:

Marketing

■ More than 90 percent of consumers distrust brands with inaccurate online content

■ 55 percent find ads that interfere with the page’s content unacceptable

■ 52 percent find ads that redirect them to other sites unacceptable

Information Technology

■ 78 percent worry about a company’s security when website performance is sluggish

■ 69 percent of respondents have left a website due to security concerns

■ 67 percent of consumers lose trust in a brand when the website takes too long to load


Security

■ 63 percent distrust brands whose data has been breached

■ 50 percent continue to have a negative view of a brand even one year after its breach

■ 55 percent distrust websites that do not have security safeguards such as two-factor authentication

“In our always-on, always-connected world, a brand’s digital storefront may be the first and only touchpoint a customer has with a company,” said Lisa Joy Rosner, CMO at Neustar. “Discerning customers expect a brand’s website to offer them accurate, real-time information, whenever and wherever they want it. On top of that, customers demand that all of the information they share with a brand is retained in a safe and secure manner.”

As consumers spend vastly more time and money online, a trusted digital brand becomes indispensable. As Neustar’s report confirms, brands that do not integrate their Marketing, IT, and Security programs will face an uphill battle, devoting countless resources to building seamless customer experiences without meaningful results. In the words of Mark Tonnesen, Chief Information and Security Officer at Neustar, “delivering an exceptional brand experience not only depends on a company’s marketing plans; every employee who touches the digital property becomes a steward of the brand.”

Pete Goldin is Editor and Publisher of APMdigest
