Dataflow Complexity is the New Normal

Pat Patterson

The process of wrangling big data is fraught with pitfalls for enterprises. Data-driven organizations are buckling under the burden of gathering, analyzing and acting on an enormous and growing amount of data flowing in from a variety of sources. And it's not just the volume of big data that is confounding them: the speed at which data must be collected and analyzed, and the variety of data types (think: IoT sensors, log files, web clickstreams), are overwhelming enterprise data architectures, which are increasingly defined by a complex tangle of big data sources and processing systems. Topping all this is the problem of data drift: unexpected changes to the structure and semantics of incoming data that consistently plague big data sources and leave data corrupt and unusable.
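To make data drift concrete, consider the kind of guard a pipeline might run on every incoming record before loading it downstream. The Python sketch below is a minimal, hypothetical illustration; the field names and EXPECTED_SCHEMA are invented for the example and do not describe any particular product.

```python
# Minimal data-drift guard: compare each incoming record against the
# schema the pipeline expects. The schema and field names here are
# hypothetical, invented purely for illustration.
EXPECTED_SCHEMA = {"order_id": int, "amount": float, "ts": str}

def check_drift(record):
    """Return a list of human-readable drift findings for one record."""
    findings = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            findings.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            findings.append(
                f"type change on {field}: expected {expected_type.__name__}, "
                f"got {type(record[field]).__name__}"
            )
    for field in record.keys() - EXPECTED_SCHEMA.keys():
        findings.append(f"unexpected new field: {field}")
    return findings

# Example: an upstream system silently renamed 'amount' to 'amt'.
print(check_drift({"order_id": 7, "amt": 19.99, "ts": "2017-05-01T12:00:00Z"}))
# ['missing field: amount', 'unexpected new field: amt']
```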

In short, the complexity of data in motion is growing and risks undermining the success of the modern data-driven enterprise. A recent StreamSets survey of data engineers and architects sought to bring some perspective to this new reality, and it yielded some interesting insights about the enterprise data landscape.

As we expected, use of streaming data has become quite common, with 72 percent of respondents collecting this data for a variety of uses. Two-thirds of that group (48 percent of all respondents) combine batch and streaming data, since real-time data requires context to provide intelligence; the remaining 24 percent ingest streaming data only. The other 28 percent move batch data only.
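The point that real-time data requires context is worth a concrete illustration: a common pattern joins each streaming event against reference data built in batch. The sketch below is a hypothetical Python example; the customer lookup table and event fields are invented and stand in for whatever batch-built context an enterprise actually maintains.

```python
# Hypothetical sketch: give streaming events context from batch data.
# The segment lookup would normally be rebuilt periodically from a
# warehouse or database; here it is an in-memory dict for illustration.
CUSTOMER_SEGMENTS = {"c-100": "enterprise", "c-200": "smb"}

def enrich(event):
    """Attach batch-derived context to a single streaming event."""
    event["segment"] = CUSTOMER_SEGMENTS.get(event["customer_id"], "unknown")
    return event

clickstream = [
    {"customer_id": "c-100", "page": "/pricing"},
    {"customer_id": "c-999", "page": "/docs"},
]
for event in clickstream:
    print(enrich(event))
# {'customer_id': 'c-100', 'page': '/pricing', 'segment': 'enterprise'}
# {'customer_id': 'c-999', 'page': '/docs', 'segment': 'unknown'}
```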

Survey results also showed that enterprises gather data from a wide range of sources: 61 percent collect from transactional databases, 53 percent from log files, 42 percent from analytics databases, 27 percent from web clickstreams and 18 percent from IoT devices.

Beyond how enterprises use streaming data, the survey reveals a sense of data urgency: a need to analyze incoming data sets expeditiously. According to the survey, 56 percent of respondents require data analysis within minutes of receiving the data, and 16 percent require it within seconds. The world has certainly evolved from the daily or weekly business intelligence report to the live dashboard, and even to analysis that drives automated actions such as website personalization, which can directly affect how well a business engages its customers. These requirements put extreme pressure on enterprise data architectures that were never designed to deliver consumption-ready data at this speed.
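To give a feel for what analysis within seconds implies in practice, a latency-sensitive pipeline has to track the lag between when an event was emitted and when it is analyzed, and react when that lag blows a freshness budget. The Python sketch below illustrates the idea; the five-second budget and the event shape are invented for the example.

```python
import time

# Hypothetical sketch: track event-to-analysis lag so a pipeline can
# verify it meets a seconds-level freshness requirement. The five-second
# budget is invented for the example.
FRESHNESS_BUDGET_S = 5.0

def analyze(event):
    """Act on an event only if it is still fresh enough to matter."""
    lag = time.time() - event["emitted_at"]
    if lag > FRESHNESS_BUDGET_S:
        # A real pipeline might alert, backfill or shed load here.
        print(f"stale event: {lag:.1f}s old exceeds {FRESHNESS_BUDGET_S}s budget")
    else:
        print(f"fresh event ({lag:.1f}s old): safe to drive an automated action")

analyze({"emitted_at": time.time() - 2.0})   # within budget
analyze({"emitted_at": time.time() - 30.0})  # over budget
```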

Survey responses also indicate that enterprises funnel their data into a range of destinations, making their data infrastructure far more complicated and expensive to manage than ever before. Respondents keep some of their data on premises (58 percent), some in private clouds (48 percent) and some in public clouds (27 percent). This combination of diverse data stores and multiple deployment models is a new phenomenon we call data sprawl, and it is a key driver of dataflow complexity.
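Data sprawl has a direct architectural consequence: a single dataflow often has to fan each record out to several destinations at once. The sketch below shows that fan-out shape in Python; the three writer functions are hypothetical stand-ins for on-premises, private-cloud and public-cloud sinks.

```python
# Hypothetical fan-out sketch: one record, several destinations. Each
# writer stands in for a real sink (an on-premises store, a private
# cloud, a public cloud); all names here are invented.
def write_on_prem(record):
    print(f"on-prem sink <- {record}")

def write_private_cloud(record):
    print(f"private-cloud sink <- {record}")

def write_public_cloud(record):
    print(f"public-cloud sink <- {record}")

DESTINATIONS = [write_on_prem, write_private_cloud, write_public_cloud]

def fan_out(record):
    """Deliver one record to every configured destination."""
    for write in DESTINATIONS:
        write(record)

fan_out({"order_id": 7, "amount": 19.99})
```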

The challenges of increased dataflow complexity are here now and, given the unprecedented daily growth of data, must be considered the new normal. With this bird's-eye view of the state of data in motion, savvy enterprises will adopt technologies and solutions that help them evolve with the big data landscape.

Pat Patterson is Community Champion at StreamSets.
