Dataflow Complexity is the New Normal

Pat Patterson

The process of wrangling big data is fraught with pitfalls for enterprises. Data-driven organizations are buckling under the burden of gathering, analyzing and acting on an ever-growing volume of data flowing in from a variety of sources. It's not just the amount of big data that confounds these companies: the speed at which data must be collected and analyzed, and the variety of data types (think: IoT sensors, log files, web clickstreams), are overwhelming enterprise data architectures, which are increasingly defined by a complex tangle of big data sources and processing systems. Topping all this is the problem of data drift: unexpected changes in the structure and semantics of incoming data that consistently plague big data sources and leave downstream data corrupt and unusable.
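
To make data drift concrete, here is a minimal sketch, in Python, of the kind of structural check an ingest pipeline might run to flag records whose shape has silently changed upstream. The field names and expected types are hypothetical, chosen for illustration rather than taken from any particular product or survey.

```python
# Illustrative only: a minimal schema-drift check with hypothetical fields.
EXPECTED_SCHEMA = {"device_id": str, "timestamp": float, "reading": float}

def detect_drift(record: dict) -> list[str]:
    """Return human-readable drift findings for one incoming record."""
    findings = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            findings.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            findings.append(
                f"type change on {field}: expected {expected_type.__name__}, "
                f"got {type(record[field]).__name__}"
            )
    for field in record:
        if field not in EXPECTED_SCHEMA:
            findings.append(f"unexpected new field: {field}")
    return findings

# A drifted record: 'reading' now arrives as a string, and an upstream
# change has introduced a new 'firmware' field.
print(detect_drift({"device_id": "sensor-1", "timestamp": 1.0,
                    "reading": "42.0", "firmware": "2.1"}))
```

A check like this only catches drift after the fact; the broader point is that pipelines need to expect change rather than assume fixed schemas.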

In short, the complexity of data in motion is growing and risks undermining the success of the modern data-driven enterprise. A recent StreamSets survey of data engineers and architects sought to bring some perspective to this new reality, and it yielded some interesting insights about the enterprise data landscape.

As we expected, use of streaming data has become quite common: 72 percent of respondents collect streaming data for a variety of uses. Two-thirds of that group (48 percent of all respondents) combine batch and streaming data, since real-time data requires context to provide intelligence, while the remaining 24 percent ingest streaming data only. The other 28 percent of respondents move batch data only.
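
The logic behind combining the two modes is easy to see in code. The sketch below, with hypothetical customer IDs and fields, enriches each streaming event with reference data loaded in batch; that reference data is the context that turns a raw event into something an analyst can use.

```python
# Illustrative sketch: joining a real-time event stream against context
# loaded in batch. All names and values here are hypothetical.
batch_context = {  # e.g., refreshed nightly from a transactional database
    "cust-42": {"segment": "enterprise", "region": "EMEA"},
}

def enrich(event: dict) -> dict:
    """Merge one streaming event with its batch-loaded context."""
    context = batch_context.get(event["customer_id"], {})
    return {**event, **context}

clickstream = [{"customer_id": "cust-42", "page": "/pricing"}]
for event in clickstream:
    print(enrich(event))
# -> {'customer_id': 'cust-42', 'page': '/pricing',
#     'segment': 'enterprise', 'region': 'EMEA'}
```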

Survey results also showed that enterprises are gathering data from a range of sources: 61 percent collect from transactional databases, 53 percent from log files, 42 percent from analytics databases, 27 percent from clickstream data and 18 percent from IoT devices.

Beyond streaming, the survey reveals that enterprises also feel a sense of data urgency: a need to analyze incoming data within moments of its arrival. According to the survey, 56 percent of respondents require data analysis within minutes of receiving the data, and 16 percent require it within seconds. The world has evolved from the daily or weekly business intelligence report to the live dashboard, and even to analysis that drives automated actions such as website personalization, which can directly affect how well a business engages its customers. These requirements put extreme pressure on enterprise data architectures that were not necessarily designed to deliver consumption-ready data at this speed.
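
A live dashboard of the kind described above usually sits on top of a windowed aggregation. The sketch below shows a simple tumbling window over an in-memory event stream; the event shape and the one-second window are hypothetical choices for illustration, not a reference implementation.

```python
# Illustrative sketch: per-window event counts, the aggregation behind
# a live dashboard. Event shape and window size are hypothetical.
import time
from collections import defaultdict

def run_window(events, window_seconds=1.0):
    """Print a count of events by type for each tumbling window."""
    window_start = time.monotonic()
    counts = defaultdict(int)
    for event in events:
        if time.monotonic() - window_start >= window_seconds:
            print({"window_counts": dict(counts)})  # emit the closed window
            counts.clear()
            window_start = time.monotonic()
        counts[event["type"]] += 1
    print({"window_counts": dict(counts)})  # flush the final partial window

run_window([{"type": "click"}, {"type": "view"}, {"type": "click"}])
# -> {'window_counts': {'click': 2, 'view': 1}}
```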

Our survey responses indicate that enterprises funnel their data into a range of destinations, making their data architectures more complicated and expensive to manage than ever before. Respondents also spread data across deployment models, keeping some of their data on premises (58 percent), some in private clouds (48 percent) and some in public clouds (27 percent). The combination of diverse data stores and multiple deployment models is a new phenomenon we call data sprawl, and it is a key driver of dataflow complexity.
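
In a pipeline, data sprawl shows up as fan-out: the same record must be written to several destinations, each with its own format, credentials and cost profile. A minimal sketch, with hypothetical sink functions standing in for on-premises, private-cloud and public-cloud stores:

```python
# Illustrative sketch: fanning one record out to multiple destinations.
# The sink functions are hypothetical placeholders for real writers.
def to_on_prem_warehouse(record): print("on-prem:", record)
def to_private_cloud_lake(record): print("private cloud:", record)
def to_public_cloud_store(record): print("public cloud:", record)

SINKS = [to_on_prem_warehouse, to_private_cloud_lake, to_public_cloud_store]

def deliver(record: dict) -> None:
    """Write one record to every configured destination."""
    for sink in SINKS:
        sink(record)  # each sink carries its own cost and management burden

deliver({"order_id": 7, "total": 19.99})
```

Each additional sink multiplies the operational surface area, which is why sprawl and complexity grow together.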

The challenges of increased dataflow complexity are here now and, given the unprecedented daily growth of data, must be considered the new normal. With this bird's-eye view of the state of data in motion, savvy enterprises will adopt technologies and solutions that help them evolve with the big data landscape.

Pat Patterson is Community Champion at StreamSets.
