5 New Rules of Network Capacity Planning
July 14, 2014
Matt Goldberg

The wireless landscape has changed dramatically in a very short period of time. Not only is capacity demand greater, but wireless networks themselves have become far more complex because of growing interconnectedness, new technology innovations, and shifting patterns of user activity. All of these factors mean that capacity planning models have to change as well. There are more variables to monitor and more scenarios to consider. At the same time, the consequences of failing to accurately predict bandwidth demand loom larger than ever.

Capacity planning has to be a strategic priority, and capacity planning models have to reflect the new realities of network evolution in 2014. The following are five new rules of capacity planning:

1. Know your Backhaul

The cellular backhaul market is one of the fastest-growing segments in the mobile industry, thanks to rapid growth in demand, and specifically the need for more capacity to carry wireless data traffic from cell sites back to the core network and the Internet. Where a bundle of T1 lines to a cell site might have sufficed five years ago, today it's not uncommon to need multiple 10 Gig pipes connected to a single location.

Growth has led to more competition among backhaul providers, but unfortunately, it hasn't necessarily made arranging for new backhaul agreements faster or easier. Providers often sell capacity before they have a chance to build it out, which means it can take months to light up a new link even after a deal is closed.

Wireless carriers need to do significant advance planning in order to prepare for maximum capacity events before they happen. By monitoring traffic and creating threshold alerts at every link, network operators can determine where upgrades are needed and when those upgrades must occur. Carriers should also ensure that the backhaul providers they choose can meet necessary service level agreements. Detailed traffic reports at every backhaul site offer assurance that capacity demands are not only being met in the moment, but that there is room for growth in the future.
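
For illustration, here is a minimal sketch of the kind of per-link threshold alerting described above. The link names, capacities, and the 80 percent warning threshold are assumptions made up for this example, not values or APIs from any particular monitoring product.

```python
# Minimal sketch of per-link threshold alerting for backhaul capacity planning.
# Link names, capacities, and the 80% warning threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class LinkSample:
    link: str            # backhaul link identifier
    capacity_bps: float  # provisioned capacity, bits per second
    usage_bps: float     # observed utilization, bits per second

WARN_THRESHOLD = 0.80    # flag links running above 80% of capacity

def check_thresholds(samples):
    """Return (link, utilization %) for every link above the warning threshold."""
    alerts = []
    for s in samples:
        utilization = s.usage_bps / s.capacity_bps
        if utilization >= WARN_THRESHOLD:
            alerts.append((s.link, round(utilization * 100, 1)))
    return alerts

samples = [
    LinkSample("cell-site-042", 10e9, 8.6e9),  # a 10 Gig pipe running at 86%
    LinkSample("cell-site-113", 10e9, 3.1e9),
]
for link, pct in check_thresholds(samples):
    print(f"ALERT: {link} at {pct}% of backhaul capacity -- plan the upgrade now")
```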

2. Be Nimble in Performance Monitoring

Telecom environments are a heterogeneous mix of hardware and software systems. Unfortunately, that diverse technology landscape makes it difficult to maintain end-to-end performance visibility and to understand network utilization at a granular level. As new technologies are introduced, network operators need new ways to monitor activity in order to plan capacity upgrades effectively.

Performance monitoring systems should be agnostic in data collection. In addition to relying on standard, out-of-the-box measurement capabilities, carriers need to be able to adapt quickly as new hardware and software are added to the telecom infrastructure. This means not just being able to monitor standard Cisco or Juniper routers, but also being able to incorporate measurement data from any third-party source, including network probes, proprietary business applications, element management systems, and more. Accurate and timely data reports are critical in capacity planning, and that means carriers have to be able to close performance visibility gaps quickly as the infrastructure changes.
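
As a rough sketch of what source-agnostic collection can look like, the example below normalizes metrics from an SNMP router and a third-party probe into one stream. The class names, record shape, and sample values are illustrative assumptions, not SevOne's API or any vendor's.

```python
# Minimal sketch of source-agnostic metric collection: every source, whether a
# standard router or a third-party probe, is normalized into the same record shape.
# Class names and sample values are made up for illustration.
from abc import ABC, abstractmethod
from typing import Dict, Iterable, List

class MetricSource(ABC):
    """Anything that can emit utilization metrics: a router, probe, or EMS."""
    @abstractmethod
    def poll(self) -> Iterable[Dict]:
        ...

class SnmpRouterSource(MetricSource):
    def poll(self):
        # In practice this would read SNMP interface counters from the device.
        yield {"source": "router-edge-01", "metric": "in_bps", "value": 2.4e9}

class ProbeFileSource(MetricSource):
    def poll(self):
        # A network probe exporting flat files can feed the same pipeline.
        yield {"source": "probe-07", "metric": "in_bps", "value": 1.1e9}

def collect(sources: Iterable[MetricSource]) -> List[Dict]:
    """Merge every source into one stream so reports cover the whole network."""
    records: List[Dict] = []
    for src in sources:
        records.extend(src.poll())
    return records

print(collect([SnmpRouterSource(), ProbeFileSource()]))
```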

3. Increase your Polling Frequency

Many network monitoring systems still rely on five-minute polling intervals to track bandwidth utilization. However, that interval can be highly misleading when it comes to analyzing microbursts of traffic. A one-second spike in activity, for example, gets flattened out over a five-minute average, making it difficult to get an accurate picture of bandwidth usage or to diagnose potential latency issues.
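
A quick back-of-the-envelope calculation shows how much a five-minute average can hide. The traffic figures below are made up for illustration.

```python
# How a five-minute average hides a one-second burst (made-up figures).
samples_per_interval = 300            # one sample per second over five minutes
baseline_bps = 50e6                   # 50 Mbps most of the time
burst_bps = 1e9                       # a single one-second, 1 Gbps spike

one_second_series = [baseline_bps] * (samples_per_interval - 1) + [burst_bps]
five_minute_average = sum(one_second_series) / samples_per_interval

print(f"Peak seen with 1-second polling: {max(one_second_series) / 1e6:.0f} Mbps")
print(f"Peak seen with 5-minute polling: {five_minute_average / 1e6:.1f} Mbps")
# The 1 Gbps spike is invisible in the five-minute number.
```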

By increasing polling frequency, carriers can better see traffic spikes that would otherwise fly under the network management radar. These activity bursts can have a major impact on the customer experience, and need to be factored into capacity planning models. The greater the polling frequency, the more accurate the model.

4. Automate with Algorithms

In order to understand where traffic patterns are headed, a network operator first needs to understand the usage patterns of the past. From a modeling perspective, carriers need to set trending baselines that illustrate normal traffic behavior over many months. Once those baselines are established, it's relatively easy to recognize when activity strays outside the norm. For example, there may be a short-term uptick in bandwidth usage every fall when college students go back to school, but viewed in the context of an entire year's worth of data, that uptick doesn't necessarily mean a carrier needs to increase capacity more quickly than planned.
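
As a hedged sketch of what a trending baseline might look like in practice, the example below flags a sample only when it strays well beyond many months of history. The window size, the two-sigma rule, and the traffic figures are all assumptions for illustration.

```python
# Minimal sketch of a trending baseline: flag a sample only when it strays well
# beyond many months of history. The 2-sigma rule and figures are assumptions.
from statistics import mean, stdev

def outside_norm(sample_bps, history_bps, sigmas=2.0):
    """True if the sample falls outside the established baseline."""
    mu, sd = mean(history_bps), stdev(history_bps)
    return abs(sample_bps - mu) > sigmas * sd

# Twelve months of peak utilization, in bps (made-up numbers)
history = [4.1e9, 4.0e9, 4.2e9, 4.3e9, 4.1e9, 4.2e9,
           4.4e9, 4.3e9, 4.8e9, 4.5e9, 4.4e9, 4.6e9]

print(outside_norm(4.7e9, history))   # False: a back-to-school bump, still within the norm
print(outside_norm(7.5e9, history))   # True: this one warrants a closer look
```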

Capturing traffic data over a long period of time makes it easier to project bandwidth usage in the future. In addition to analyzing individual usage spikes, carriers can use historical data to build algorithms for more sophisticated projection models. Once created, these algorithms help to automate the process of capacity management, showing network operators where growth is likely to take place well before the network is overloaded.
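
For example, a very simple projection model might fit a linear trend to monthly peaks and estimate when a link will run out of headroom. The sketch below uses made-up figures and ignores seasonality, which a production model would need to handle.

```python
# Minimal sketch of a trend-based projection: fit a line to monthly peaks and
# estimate when the link runs out of headroom. Figures are made up; a real model
# would also account for seasonality.
import numpy as np

months = np.arange(12)                                    # 12 months of history
rng = np.random.default_rng(0)
peak_gbps = 4.0 + 0.15 * months + rng.normal(0, 0.1, 12)  # ~0.15 Gbps growth/month

slope, intercept = np.polyfit(months, peak_gbps, 1)       # least-squares trend

link_capacity_gbps = 10.0
current_estimate = intercept + slope * months[-1]
months_until_full = (link_capacity_gbps - current_estimate) / slope

print(f"Growth of about {slope:.2f} Gbps per month; "
      f"link projected to fill in roughly {months_until_full:.0f} months.")
```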

5. Remember, Volume Isn't Everything

Knowing the amount of traffic on a network is important for capacity planning purposes, but so is knowing the composition of that traffic. Understanding the type of activity taking place can make a big difference in investment plans and even monetization strategy. For example, knowing how much customers are utilizing 4G broadband versus 3G can help operators determine how to allocate capacity across different services. Knowing how much bandwidth is being used by a single application can help a carrier analyze whether a different pricing structure would deliver better financial returns.
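
A traffic-composition report can be as simple as the sketch below, which breaks total volume down by application class. The categories and volumes are made-up examples; real figures would come from flow records or probes.

```python
# Minimal sketch of a traffic-composition report. Categories and volumes are
# made-up examples; real figures would come from flow records or probes.
traffic_gb = {
    "video streaming": 410.0,
    "web browsing": 260.0,
    "social media": 180.0,
    "other": 150.0,
}

total = sum(traffic_gb.values())
for service, volume in sorted(traffic_gb.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{service:<18}{volume:>8.0f} GB   {volume / total:6.1%} of traffic")
```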

Capacity planning is a numbers game, but the best projection models take into account the value of different types of traffic. Volume isn't the only important variable.

Bandwidth is a critical resource, and creating an effective capacity planning strategy is well worth the investment. As networks grow more complex, utilization models have to advance as well. Following best practices for capacity planning enables carriers to reduce costs, explore new revenue opportunities, and stay competitive in an increasingly dynamic market.

ABOUT Matt Goldberg

Matt Goldberg is Senior Director of Service Provider Solutions at SevOne, a provider of scalable performance monitoring solutions to the world’s most connected companies.
