
Corvil announced its Tera Release, the latest version of the Corvil network data analytics platform.
The new platform democratizes the power of network data, with an all-new, intuitive and customizable user interface and a new data automation engine that dramatically reduces the time, expense, and complexity of working with network data.
In addition, the Tera Release adds a new portfolio of real-time security analytics, giving Network Operations, Application Operations, and Security Operations teams an accurate, collaborative, real-time picture of critical service chains across their business.
"We believe that the most effective way for IT to assure and safeguard the delivery of critical applications, services, and data to the business is for all IT teams involved to have a common, trusted, granular source of shared data," said Donal Byrne, CEO, Corvil. "Network data is widely regarded as the most granular and powerful source of real-time data that can be used for this purpose. The challenge is to make network data analytics super-easy, cost-effective and widely available to all. We believe that our new Tera Release achieves this objective with our customers reporting up to 90 percent reduction in time for IT Ops to see, analyze and act on critical business application flows at a cost that is less than what the network team traditionally spends on legacy network probes."
Key innovations in the Corvil Tera Release include:
MULTI TEAM USER INTERFACE - The Tera Release re-imagines the Corvil user experience with a new HTML5-based interface featuring polished, intuitive, and customizable dashboards, optimized for the workflows of network, application, and security operations professionals jointly responsible for delivering critical business services.
SELF POPULATING DASHBOARDS - The Tera data engine automatically discovers application and business data flows within raw network data with zero configuration. The data in these flows is decoded, transformed, and self-populated into tables and graphical widgets, giving a full, real-time picture of what is happening across the business.
REAL-TIME SECURITY OPERATIONS INTEGRATION - Network data has traditionally been used for network forensics by the security operations team. New thinking in this area suggests that security operations should also be leveraging the valuable information contained in network performance monitoring and diagnostic tools. Gartner recently commented: "Network performance monitoring tool data provided by IT operations to security operations for analysis of network forensic information can play a key role in solving security incidents." The Tera Release delivers on this thinking and goes further by seamlessly integrating live threat intelligence and real-time network forensics with leading SIEM platforms. For example, the Tera Release consumes threat intelligence from iSIGHT Partners and identifies related suspicious activity through streaming analysis of network data. It then forwards these security events into a SIEM platform, such as Splunk, using Corvil Streams. The event stream contains metadata associated with the threat intelligence, along with a link that allows click-back to Corvil for further retrospective analysis of the security incident.
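The flow described above (threat-intel match, enriched event, click-back link, forward to SIEM) can be pictured with a minimal sketch. All field names, the `build_security_event` helper, and the URL format are illustrative assumptions; the announcement does not document the actual Corvil Streams schema.

```python
import json

def build_security_event(indicator, src_ip, dst_ip, timestamp, corvil_host):
    """Assemble a SIEM-ready event enriched with threat-intel metadata.

    Hypothetical structure: real Corvil Streams events will differ.
    """
    return {
        "event_type": "threat_intel_match",
        "timestamp": timestamp,
        "indicator": indicator,   # the flagged IP/domain from the intel feed
        "src_ip": src_ip,
        "dst_ip": dst_ip,
        # Click-back link for retrospective analysis of this incident
        "analysis_url": f"https://{corvil_host}/forensics?ip={src_ip}&t={timestamp}",
    }

event = build_security_event(
    indicator="203.0.113.50",
    src_ip="10.0.0.7",
    dst_ip="203.0.113.50",
    timestamp="2015-06-01T12:00:00Z",
    corvil_host="corvil.example.com",
)
# In practice this JSON would be streamed to the SIEM's ingest endpoint.
print(json.dumps(event, indent=2))
```

The key design point is that the SIEM receives the intel context and the link back to the packet-level evidence in one record, so an analyst can pivot from alert to forensics without a separate lookup.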
PROGRAMMABLE STREAM AND/OR STORE NETWORK DATA LAKE - Unlike platforms that either capture and store network data before analysis, or analyze network data on the fly and then discard the decoded data, the Tera Release is fully user programmable, so customers can decide for themselves how much data to keep and for how long. The streaming analytics architecture analyzes all network data on the fly and then programmatically stores both raw and enriched network data. The resulting time-synchronized, distributed data store is automatically maintained and managed by the Corvil engines, giving users complete flexibility in creating and managing their network data lake. In addition, the Tera Release now supports a broader array of connectors for streaming Corvil data to big data platforms such as Cloudera Enterprise Data Hub.
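The "decide how much data to keep, and for how long" idea amounts to per-flow retention policy. The sketch below shows what such a policy could look like; the `RetentionRule` class, flow patterns, and matching logic are assumptions for illustration, not Corvil's actual configuration syntax.

```python
import fnmatch
from dataclasses import dataclass

@dataclass
class RetentionRule:
    flow_match: str          # glob pattern naming the decoded flows it covers
    keep_raw_hours: int      # how long to retain raw packet data
    keep_enriched_days: int  # how long to retain decoded/enriched records

# Most specific rules first; the trailing "*" rule is the default.
policies = [
    RetentionRule("order-entry/*", keep_raw_hours=72, keep_enriched_days=365),
    RetentionRule("market-data/*", keep_raw_hours=4, keep_enriched_days=30),
    RetentionRule("*", keep_raw_hours=1, keep_enriched_days=7),
]

def rule_for(flow: str) -> RetentionRule:
    """Return the first rule whose pattern matches the flow name."""
    return next(r for r in policies if fnmatch.fnmatch(flow, r.flow_match))

# Regulated order flow keeps a year of enriched data; bulk market data far less.
print(rule_for("order-entry/fix-gateway").keep_enriched_days)
print(rule_for("market-data/nasdaq-itch").keep_raw_hours)
```

Separating raw from enriched retention reflects the trade-off the release targets: raw packets are expensive to store but essential for deep forensics, while enriched records are compact enough to keep for long-horizon analysis.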