
Kentik announced the launch of Kentik Synthetic Monitoring, proactive network monitoring that simulates an end-user’s experience with infrastructure, applications or services.
The Kentik Network Intelligence Platform is now the only fully integrated network traffic and synthetic monitoring analytics solution on the market, and the only solution to enable autonomous testing, for both cloud and hybrid networks.
With Kentik Synthetic Monitoring, network teams have a fully integrated solution that can autonomously configure their tests, present the full network context, and make the resulting insights actionable immediately.
“Lack of understanding of network usage and state has led to the massive failure of synthetic monitoring,” said Avi Freedman, co-founder and CEO of Kentik. “Kentik already has real-time visibility into over 1 trillion traffic measurements per day across billions of users and sees every network connected to the internet. Synthetic testing integrated with actual network traffic and device data gives Kentik trillions of even better eyes on the network. We are changing the game with synthetic monitoring that’s exponentially more valuable.”
Kentik Synthetic Monitoring uses private agents, which deploy quickly and easily, alongside a network of global agents strategically positioned in internet cities around the world and in every cloud region within AWS, Google Cloud, Microsoft Azure and IBM Cloud. The service feeds into the Kentik Data Engine (KDE), a patented hybrid columnar and streaming data engine for distributed ingest, enrichment, learning and analytics, which uses machine learning to analyze, predict and respond in real time, at internet scale.
“Data from Kentik Synthetic Monitoring allows us to continue to extend our already insurmountable lead in volume, velocity and quality of network measurement, leveraging the telemetry to build even better models of network, application, and user behavior,” added Freedman.
Kentik Synthetic Monitoring frequently and autonomously measures performance and availability metrics of essential infrastructure, applications and services, including:
- SaaS solutions
- Applications hosted in the public cloud
- Internal applications
- Transit and peer networks
- Content delivery networks
- Streaming video, social, gaming and other content providers
- Site-to-site performance across traditional WAN and SD-WANs
- Service provider connectivity and customer SLAs
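To make the idea concrete, the kind of availability and latency check a synthetic agent runs against targets like those above can be sketched in a few lines. This is a minimal, hypothetical illustration (a single HTTP probe using only the Python standard library), not Kentik's actual agent implementation; the `probe` function name and result fields are assumptions for the example.

```python
import time
import urllib.request


def probe(url: str, timeout: float = 5.0) -> dict:
    """Run one synthetic availability/latency check against a URL.

    Returns a small result record: HTTP status (None if unreachable),
    round-trip latency in milliseconds, and an up/down verdict.
    """
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            status = resp.status
    except Exception:
        status = None  # DNS failure, connection refused, or timeout
    latency_ms = (time.monotonic() - start) * 1000.0
    return {
        "url": url,
        "status": status,
        "latency_ms": latency_ms,
        "up": status is not None and status < 400,
    }


# A real synthetic-monitoring system would schedule probes like this
# frequently from many vantage points and feed the records into an
# analytics backend for baselining and alerting.
result = probe("http://invalid.invalid", timeout=1.0)
print(result["up"], result["status"])
```

In a production system the interesting work begins after the measurement: correlating probe results across agents and against real traffic data to decide whether a degradation is local, regional, or provider-wide.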
“Our customers have been vocal for some time that the existing approaches to synthetic network testing are falling short because they are too manual, too static and too expensive,” said Christoph Pfister, CPO of Kentik. “We designed Kentik Synthetics to test autonomously, taking into account the dynamic nature of modern networks and the internet. In addition, we believe the industry has been held back for too long by a lack of affordability, forcing customers to trade off testing needs with cost constraints. Kentik is doing away with all this today by introducing a price point that allows customers to monitor frequently, monitor autonomously, and monitor everything that matters.”
Kentik Synthetic Monitoring is available now in preview, with GA planned for this quarter.