In 2014, AWS Lambda introduced serverless architecture. Since then, many other cloud providers have developed serverless options, and fully managed container platforms now share this space with the serverless cloud providers.
What’s behind this rapid growth? Serverless is extremely useful for a growing number of applications, including cloud job automation, serving IoT devices from edge to cloud, building backends for single-page applications (SPAs), and image compression.
According to a recent survey, 82 percent of respondents were using serverless at work in 2018, up from 45 percent in 2017, suggesting that serverless is definitely here to stay.
As with any new technology, there are also challenges and barriers that are impacting mainstream adoption. Taking a deeper look at both the benefits and challenges of serverless can help network operators decide if it’s right for them and if the potential benefits outweigh the concerns related to network visibility and complexity.
Weighing the Pros and Cons of a Serverless Architecture
The appeal of serverless is clear: there is no infrastructure to provision, patch, or scale. With serverless, however, all of the infrastructure control is in the hands of the cloud provider. This creates operational challenges and network visibility blind spots. Compared to the relative simplicity of container, virtual machine (VM), or bare-metal architectures, serverless also complicates network organization and security controls.
Barriers to Mainstream Adoption
Adoption of serverless is growing due to its inherent benefits, but it has not yet become fully mainstream because of its limitations. Network operators must understand these barriers and vulnerabilities if they plan to reap the benefits while maintaining a safe and secure serverless solution:
Function Runtime Restrictions
In the few years since serverless was introduced, runtime restrictions have emerged that slow the building of new applications and the migration of existing ones. To create or adjust a workflow in a serverless environment, each individual change can incur significant warm-up (cold-start) time across every affected function in the cloud network.
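One common way teams soften this warm-up cost is to perform expensive initialization at import time so that only cold starts pay for it. The sketch below assumes a Python, Lambda-style function; the `handler` name and the config value are illustrative, not a specific provider API:

```python
import json
import time

# Anything at module level runs once per container, during the cold start.
# Warm invocations reuse these objects instead of rebuilding them.
_COLD_START_AT = time.monotonic()
CONFIG = {"table": "example-table"}  # hypothetical config loaded during init


def handler(event, context=None):
    # The per-request path stays thin: it only reads the event and the
    # already-initialized CONFIG, so warm invocations are fast.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"greeting": f"hello, {name}",
                            "table": CONFIG["table"]}),
    }
```

Expensive clients (database connections, SDK sessions) belong in that module-level section for the same reason.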
Self-Regulated Application Organization
For self-regulated applications or microservices, migrating to serverless comes with its own set of challenges. These applications typically rely on managed or as-a-service databases to persist data across requests, deploying caches like Redis or object storage like S3. With state spread across a variety of external caches and stores, network visibility declines and complexity increases.
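Because each function invocation is stateless, any data that must survive across requests has to live in one of those external stores. The hypothetical sketch below illustrates the pattern; the `FakeCache` class stands in for a real client such as `redis.Redis`, and `count_visits` is an illustrative handler, not a provider API:

```python
class FakeCache:
    """In-memory stand-in for an external cache like Redis."""

    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def set(self, key, value):
        self._data[key] = value


# In production this would be a network client, e.g. redis.Redis(host=...).
cache = FakeCache()


def count_visits(event, store=cache):
    # The function itself holds no state between calls; every read and
    # write goes through the external store.
    user = event["user"]
    visits = (store.get(user) or 0) + 1
    store.set(user, visits)
    return {"user": user, "visits": visits}
```

Every such externalized store is another network hop that monitoring tools must account for, which is exactly where the visibility loss described above comes from.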
Although implementing cloud-hosted serverless functions relieves the burden of patching and maintaining infrastructure, the constantly shifting nature of each individual function makes it extremely difficult for developers to establish controls around sensitive data that is always on the move.
These network and visibility challenges not only slow down and complicate operations, they also introduce a number of significant security concerns.
Serverless Security Concerns and Considerations
The main difference between traditional architectures and serverless is that functions rely heavily on non-web, event-based communication and networking channels. Running on public clouds, these event-driven channels make it difficult to implement comprehensive security controls that can detect threats and enforce network policies effectively. Monitoring and scaling these new, complex environments requires security tools that understand microservices, scale horizontally, and coexist with the existing security stack.
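To make the contrast with request/response web traffic concrete, the sketch below shows an event-based entry point. It assumes the AWS S3 notification event shape as an example; the `on_object_created` name is hypothetical:

```python
def on_object_created(event):
    # Event-driven (non-HTTP) entry point: the platform invokes this when
    # an object lands in a bucket. There is no client request to inspect,
    # which is why traditional web-centric security tooling misses it.
    # The event shape below follows the AWS S3 notification format.
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        results.append(f"s3://{bucket}/{key}")
    return results
```

Security monitoring for serverless has to observe these internal event flows, not just inbound HTTP, which is what the checklist below is meant to surface.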
Before making the decision to go serverless, operations and developers should understand their current network security policies including:
■ Unification around secret consumption
■ Service-to-service authentication and authorization between first and third parties
■ Function workflows and access whitelisting
■ Network security monitoring
■ Access policies to the network and access policies to data
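Taking the first checklist item as an example, unified secret consumption means every function obtains credentials the same way, from platform-injected configuration or a secrets manager, instead of hardcoding them. A minimal sketch, assuming environment-variable injection (the `get_secret` helper is hypothetical):

```python
import os


def get_secret(name, default=None):
    # Unified secret consumption: functions read secrets injected by the
    # platform (environment variables here; a secrets-manager call in
    # production) rather than embedding credentials per function.
    value = os.environ.get(name, default)
    if value is None:
        raise KeyError(f"missing secret: {name}")
    return value
```

A single, audited path like this makes it far easier to rotate credentials and to enforce the access policies listed above across dozens of independent functions.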
Function-based serverless workloads are constantly evolving, which makes them harder to exploit, but it is still important to have a strong grasp of the current state of your network security before moving toward a more fluid and complex computing solution.
Is Your Network Ready for Serverless Adoption?
Still in relative infancy, the adoption of serverless architecture continues to grow as companies realize its benefits. Given the limitations outlined in this blog, how do you know if you are ready to implement a serverless framework in your network?
Before jumping head first into serverless, operations teams must understand the visibility blind spots, operational challenges, and potential security threats these complex solutions introduce. At the same time, cloud providers must continue to innovate and improve their standards, operations, and security measures before serverless adoption, including on community-driven frameworks built on Kubernetes, can happen seamlessly.
If, after weighing the pros and cons, you decide that the potential benefits of going serverless outweigh the risks, understanding the capabilities and challenges of each platform provider is key to adopting a solution that works for your architecture.