Balancing the Rising Costs of Public Cloud
January 23, 2023

Ahsan Siddiqui
Arcserve


The spiraling cost of energy is forcing public cloud providers to raise their prices significantly. A recent report by Canalys predicted that public cloud prices will jump by around 20% in the US and more than 30% in Europe in 2023. These steep price increases will test the conventional wisdom that moving to the cloud is a cheap computing alternative.

Indeed, many organizations are already looking at their higher cloud bills and assessing whether it still makes sense to keep moving their infrastructure to the cloud. They do have alternatives.

For instance, for solutions used regularly and persistently, it might make financial sense to bring those in-house rather than host them in the cloud. Owning the infrastructure and managing it yourself could be more cost-effective in the long run.
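To make that trade-off concrete, a rough break-even calculation, using entirely hypothetical figures, shows when owning the hardware starts to pay off against a steady monthly cloud bill:

# Rough break-even sketch for a steadily used workload.
# All figures are hypothetical placeholders; substitute your own quotes.

cloud_monthly = 12_000        # current monthly cloud bill ($)
hardware_capex = 180_000      # servers, storage, networking ($, up front)
onprem_monthly_opex = 4_000   # power, space, support contracts ($/month)

savings_per_month = cloud_monthly - onprem_monthly_opex
break_even_months = hardware_capex / savings_per_month

print(f"Break-even after ~{break_even_months:.1f} months")  # ~22.5 months here

Past that point, every additional month of steady usage favors the owned infrastructure; a workload that may shrink or disappear before then does not.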

On the other hand, more complex technologies and solutions with a high entry cost, such as artificial intelligence, remain good candidates for cloud hosting because they require substantial infrastructure and staffing to run in-house. The cloud also remains an excellent option for specific services and solutions where more elasticity is required. This includes technologies that need to be scaled up quickly for a defined period, such as the last few days of each month or quarter when closing the books, then scaled back down.
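As a minimal sketch of that calendar-driven elasticity, with purely illustrative instance counts, a scaling policy might look something like this:

from datetime import date
import calendar

# Scale out for the last few days of each month (closing the books),
# otherwise run at baseline capacity. Counts are illustrative only.
BASELINE_INSTANCES = 4
CLOSE_INSTANCES = 16
CLOSE_WINDOW_DAYS = 3

def desired_capacity(today: date) -> int:
    last_day = calendar.monthrange(today.year, today.month)[1]
    in_close_window = today.day > last_day - CLOSE_WINDOW_DAYS
    return CLOSE_INSTANCES if in_close_window else BASELINE_INSTANCES

print(desired_capacity(date(2023, 1, 30)))  # 16 -> scaled up for month-end close
print(desired_capacity(date(2023, 1, 15)))  # 4  -> baseline

Running that extra capacity in-house would mean owning servers that sit idle most of the month, which is exactly the case where cloud elasticity earns its premium.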

These are just some of the issues organizations should weigh when deciding whether to keep their data and infrastructure in the cloud, move them back on-premises, or transition to a hybrid infrastructure, which keeps some data and applications in the cloud while returning others to an on-premises environment. From now on, all organizations must take a step back and assess what will work best for them to find the right balance.

The Benefits of Hybrid Cloud

A hybrid cloud has a lot of advantages. Organizations adopting a hybrid cloud approach can more easily control costs and manage their data wherever it resides: on-premises, in a private cloud, or in a public cloud. Many organizations now face a range of emerging trends and threats that impact how they run their business, and they find the flexibility of a hybrid cloud essential.

A hybrid data center is adaptable. It's a viable and practical system that enables companies to meet the growing threat of ransomware attacks while taking on today's evolving business demands — all in real time. A hybrid data center provides strong security, efficient performance, reliability, scalability, agility, and cost-efficiency.

But a hybrid data center requires work. Implementing and operating one presents several IT-management challenges. Yes, a hybrid data center allows a business to efficiently store and shift workloads according to need and better protect its sensitive data. But a hybrid data center brings more complexity to managing servers, networks, storage, and software across the IT landscape.

For instance, organizations running a hybrid cloud must secure their data and applications both on-premises and in the cloud. They also must be able to recover data and applications on-premises or in the cloud, wherever the company initially hosted the data and applications. And they must handle backup and recovery across a hybrid environment. To do all this, they must have a data management and storage solution that meets the needs of a hybrid data center.
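What that looks like in practice varies by vendor; the following is only an illustrative sketch, with invented dataset names and backup targets, of routing protection based on where each workload lives:

# Illustrative only: keep one copy close to where the data is hosted for fast
# restores, plus an offsite copy in the other environment, and recover each
# dataset wherever it was originally hosted.

DATASETS = {
    "erp-database":  {"hosted": "on-prem", "critical": True},
    "web-frontend":  {"hosted": "cloud",   "critical": False},
    "finance-close": {"hosted": "cloud",   "critical": True},
}

def backup_plan(name: str, info: dict) -> dict:
    primary = "local-appliance" if info["hosted"] == "on-prem" else "cloud-snapshot"
    offsite = "cloud-object-store" if info["hosted"] == "on-prem" else "on-prem-vault"
    return {
        "dataset": name,
        "primary_copy": primary,
        "offsite_copy": offsite,
        "frequency": "hourly" if info["critical"] else "daily",
        "restore_target": info["hosted"],
    }

for name, info in DATASETS.items():
    print(backup_plan(name, info))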

The Rise of Data Repatriation

As the cost of the cloud continues to balloon, many companies will take the dramatic step of "repatriating" workloads to preserve precious IT budgets. Already, rising energy prices are forcing organizations to rethink their cloud strategy and start repatriating their data from the cloud to on-premises.

Indeed, research from market intelligence firm IDC shows that most organizations expect to shift workloads from the cloud back to on-premises data centers. In the IDC survey, 71% of respondents said they plan to move some or all of the workloads they're now running in public clouds back to on-premises environments within the next two years. A mere 13% said they plan to run all their workloads in the cloud.

There are many reasons why companies are repatriating their workloads from the cloud to on-premises. These include security, performance, regulatory compliance, and a desire for better control of the IT infrastructure. Another reason is cost, which can rise quickly and unexpectedly. Workloads often start small and demand a manageable expenditure, but when workloads jump — which they frequently do — so does the spending, which a company may not have anticipated.
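A back-of-the-envelope example, with hypothetical numbers, shows how fast the bill moves when usage jumps and unit prices rise at the same time:

# Hypothetical: a workload that triples while unit prices rise ~20%.
baseline_usage = 10_000   # compute-hours per month
unit_price = 0.25         # $ per compute-hour (placeholder)

baseline_bill = baseline_usage * unit_price              # $2,500 / month
grown_bill = (baseline_usage * 3) * (unit_price * 1.2)   # $9,000 / month

print(f"Baseline: ${baseline_bill:,.0f}/mo  After growth: ${grown_bill:,.0f}/mo")

A budget set against the baseline figure is off by more than a factor of three in this scenario.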

Data volumes in the cloud have grown to the point where they are often difficult to manage. Moving some of this data back on-premises can bring benefits beyond lower costs, such as better security and enhanced performance.

But as companies move their data back on-premises, they face several challenges. They need a data-storage solution that can protect their data wherever it resides — on-premises, offsite, or in the cloud. They also need a storage solution that ensures their data is available 24/7/365, even in unforeseen circumstances.

Ideally, they also need a storage solution with analytics that can rapidly determine which datasets are critical to operations and which are not. With these analytics, organizations can efficiently decide which datasets to place in the cloud, which to store locally, and which to bring back on-premises. Analytics also enables companies to determine which data must be backed up and which doesn't need to be. With this, organizations can maintain an intelligent, tiered data architecture that ensures quick access to critical data and saves costs by identifying data that can live on less expensive, less readily accessible media.
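As a minimal sketch of how such analytics might drive tier placement, using invented dataset names, thresholds, and tier labels:

# Illustrative tiering sketch: classify datasets by criticality and access
# frequency, then assign a storage tier and a backup requirement.
DATASETS = [
    {"name": "orders-current",  "critical": True,  "reads_per_day": 5_000},
    {"name": "clickstream-raw", "critical": False, "reads_per_day": 40},
    {"name": "archive-2019",    "critical": False, "reads_per_day": 0.2},
]

def assign_tier(ds: dict) -> dict:
    if ds["critical"] or ds["reads_per_day"] > 1_000:
        tier = "on-prem fast storage"       # quick access to critical data
    elif ds["reads_per_day"] > 1:
        tier = "cloud standard storage"
    else:
        tier = "cloud archive / cold tier"  # cheap, less readily accessible
    return {"name": ds["name"], "tier": tier, "backup": ds["critical"]}

for ds in DATASETS:
    print(assign_tier(ds))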

Your To-Do List for Cloud Deployment in 2023

As cloud costs rise, organizations must reexamine their data storage systems. They must implement solutions that enable them to manage their workloads cost-effectively and, at the same time, ensure that their data is always accessible and secure.

Ahsan Siddiqui is Director of Product Management at Arcserve
