The total number of datacenters (of all types) in the United States declined for the first time in 2009, falling by 0.7%, a drop triggered by the economic crisis of 2008 and the resultant closing of thousands of remote locations with server closets and rooms. At the same time, total datacenter capacity grew by slightly more than 1% as larger datacenter environments continued to expand despite the economic slowdown. According to new research from International Data Corporation (IDC), these trends have continued in the years since 2009 and reflect a major change in datacenter and IT asset deployment that will accelerate further in coming years.
The dynamic driving these changes in the US datacenter market centers on the fast-growing array of applications and devices used to communicate and conduct business, the rapid digitization of vast amounts of unstructured data, and the desire to collect, store, and analyze this information in ever-greater volume and detail. This dynamic has had a significant impact on how businesses build, organize, and invest in datacenter facilities and assets.
"CIOs are increasingly being asked to improve business agility while reducing the cost of doing business through aggressive use of technologies in the datacenter," said Rick Villars, vice president, Datacenter and Cloud Research at IDC. "At the same time, they have to ensure the integrity of the business and its information assets in the face of natural disasters, datacenter disruptions, or local system failures. To achieve both sets of objectives, IT decision makers had to rethink their approach to the datacenter."
The most notable factor reshaping datacenter dynamics has been the dramatic increase in the use of server virtualization to consolidate server assets. Virtualization and server consolidation have driven significant declines in physical datacenter size and eliminated the need for many smaller datacenters as applications were moved to larger central datacenters. They have also made investments in power and energy management that much more critical for datacenter managers.
The second factor is the content explosion. While the aggressive use of virtualization has reduced the rate of growth in server deployments in datacenters, the creation, organization, and distribution of files and rich content are driving a rapid and sustained increase in storage deployments. One of the key characteristics of the content explosion is data centralization, driven by performance, compliance, and scale requirements. As a result, midsize and large datacenters are the main segments where the content explosion is having a major impact.
A third factor shaping the datacenter dynamic has been the shift toward a cloud model for application, platform, and infrastructure delivery. Here the focus is on extending the value and scale of virtualization by boosting operational efficiency and improving IT agility. Along with the content explosion, the buildout of public cloud offerings is driving major growth in the number and size of larger datacenters.
Combined, these factors will continue to drive a slow but steady decline in the number and size of smaller internal datacenters. For similar reasons, large internal datacenters will not grow at anywhere near the same rate as very large datacenters operated by service providers.
IDC expects the total number of datacenters in the US to decline from 2.94 million in 2012 to 2.89 million by 2016, a drop of roughly 2%. This decline will be concentrated in internal server rooms and closets, with a very small decline in mid-sized local datacenters.
Despite the slight decline in the number of datacenters, total datacenter space will increase significantly, growing from 611.4 million square feet in 2012 to more than 700 million square feet in 2016, an increase of roughly 15%. By the end of the forecast period, IDC expects service providers will account for more than a quarter of all large datacenter capacity in place in the United States.
The IDC report, U.S. Datacenter 2012-2016 Forecast (Doc #237070), provides a census of U.S. datacenters by size, sophistication, and ownership. It forecasts datacenter investment plans through 2016 and assesses the impact of changing industry business models, as well as IT and network developments, on datacenter design, build, and management. The report also introduces a new datacenter taxonomy based on a multitude of factors, including scope of IT personnel control, physical location, types of applications supported, power and cooling, downtime, floor area, and staff skill sets.