Unravel Data Partners with Databricks for Lakehouse Observability and FinOps

Unravel Data has joined the Databricks Partner Program to deliver AI-powered data observability into Databricks for granular visibility, performance optimizations, and cost governance of data pipelines and applications.

With this new partnership, Unravel and Databricks will collaborate on Go-To-Market (GTM) efforts to enable Databricks customers to leverage Unravel’s purpose-built AI for the Lakehouse, delivering real-time, continuous insights and recommendations that speed time to value for data and AI products and ensure optimal ROI.

Unravel’s purpose-built AI for Databricks integrates with Lakehouse Monitoring and Lakehouse Observability to deliver the performance and efficiency needed to achieve speed and scale for data analytics and AI products. Unravel’s integration with Unity Catalog enables Databricks users to speed up lakehouse transformation by providing real-time, AI-powered cost insights, code-level optimizations, accurate spending predictions, and performance recommendations that accelerate data pipelines and applications for greater returns on cloud data platform investments. Auto Actions and alerts help automate governance with proactive guardrails.

“Most organizations today are receiving unprecedented amounts of data from a staggering number of sources, and they’re struggling to manage it all, which can quickly lead to unpredictable cloud data spend. This combination of rapid lakehouse adoption and the hyperfocus companies have on leveraging AI/ML models for additional revenue and competitive advantage brings the importance of data observability to the forefront,” said Kunal Agarwal, CEO and co-founder, Unravel Data. “Lakehouse customers who use Unravel can now achieve the agility required for AI/ML innovation while having the predictability and cost governance guardrails needed to ensure a strong ROI.”

Unravel’s purpose-built AI for Databricks delivers insights based on Unravel’s deep observability at the job, user, and code level to supply AI-driven cost efficiency recommendations, including compute provisioning, query performance, autoscaling efficiencies, and more.

Unravel for Databricks enables organizations to:

- Speed cloud transformation initiatives with real-time cost visibility, predictive spend forecasting, and performance insights for their workloads

- Enhance time to market of new AI initiatives by mitigating potential pipeline bottlenecks and associated costs before they occur

- Better manage and optimize the ROI of data projects with customized dashboards and alerts that offer insights on spend, performance, and unit economics

Unravel’s integration with popular DevOps tools like GitHub and Azure DevOps provides actionability in CI/CD workflows by enabling early issue detection during the code-merge phase and providing developers real-time insights into potential financial impacts of their code changes. This results in fewer production issues and improved cost efficiency.
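The code-merge gate described above can be sketched as a simple CI check that compares a pipeline's projected cost before and after a change and fails the merge when the regression exceeds a tolerance. This is a minimal, hypothetical illustration of the pattern, not Unravel's actual API: the job names, cost figures, and 10% threshold are all placeholder assumptions, and in practice the estimates would come from an observability tool rather than hard-coded dicts.

```python
# Hypothetical sketch of a CI cost gate: fail the merge check when a
# pipeline's projected cost regresses beyond a tolerance. In a real
# setup the cost estimates would come from an observability platform's
# API; here they are plain dicts keyed by job name.

def cost_gate(baseline, candidate, tolerance=0.10):
    """Compare candidate projected costs against a baseline.

    baseline/candidate: {job_name: projected_cost_usd}
    tolerance: allowed fractional cost increase per job (10% default).
    Returns (passed, report_lines).
    """
    report = []
    passed = True
    for job, base_cost in baseline.items():
        new_cost = candidate.get(job, base_cost)
        if base_cost > 0 and (new_cost - base_cost) / base_cost > tolerance:
            passed = False
            report.append(f"FAIL {job}: ${base_cost:.2f} -> ${new_cost:.2f}")
        else:
            report.append(f"ok   {job}: ${new_cost:.2f}")
    return passed, report


if __name__ == "__main__":
    # Placeholder numbers: one job regresses ~45%, the other ~2%.
    baseline = {"etl_daily": 42.00, "feature_build": 18.50}
    candidate = {"etl_daily": 61.00, "feature_build": 18.90}
    ok, lines = cost_gate(baseline, candidate)
    print("\n".join(lines))
    print("merge check:", "passed" if ok else "failed")
```

Wired into a pull-request pipeline, a non-zero exit on failure is enough to block the merge and surface the per-job cost diff to the developer.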

The Latest

A new study by the IBM Institute for Business Value reveals that enterprises are expected to significantly scale AI-enabled workflows, many driven by agentic AI, relying on them for improved decision making and automation. The AI Projects to Profits study revealed that respondents expect AI-enabled workflows to grow from 3% today to 25% by the end of 2025. With 70% of surveyed executives indicating that agentic AI is important to their organization's future, the research suggests that many organizations are actively encouraging experimentation ...

Respondents predict that agentic AI will play an increasingly prominent role in their interactions with technology vendors over the coming years and are positive about the benefits it will bring, according to The Race to an Agentic Future: How Agentic AI Will Transform Customer Experience, a report from Cisco ...

A new wave of tariffs, some exceeding 100%, is sending shockwaves across the technology industry. Enterprises are grappling with sudden, dramatic cost increases that threaten to disrupt carefully planned budgets, sourcing strategies, and deployment plans. For CIOs and CTOs, this isn't just an economic setback; it's a wake-up call. The era of predictable cloud pricing and stable global supply chains is over ...

As artificial intelligence (AI) adoption gains momentum, network readiness is emerging as a critical success factor. AI workloads generate unpredictable bursts of traffic, demanding high-speed connectivity that is low latency and lossless. AI adoption will require upgrades and optimizations in data center networks and wide-area networks (WANs). This is prompting enterprise IT teams to rethink, re-architect, and upgrade their data center and WANs to support AI-driven operations ...

Artificial intelligence (AI) is core to observability practices, with some 41% of respondents reporting AI adoption as a core driver of observability, according to the State of Observability for Financial Services and Insurance report from New Relic ...

Application performance monitoring (APM) is a game of catching up — building dashboards, setting thresholds, tuning alerts, and manually correlating metrics to root causes. In the early days, this straightforward model worked as applications were simpler, stacks more predictable, and telemetry was manageable. Today, the landscape has shifted, and more assertive tools are needed ...

Cloud adoption has accelerated, but backup strategies haven't always kept pace. Many organizations continue to rely on backup strategies that were either lifted directly from on-prem environments or use cloud-native tools in limited, DR-focused ways ... Eon uncovered a handful of critical gaps regarding how organizations approach cloud backup. To capture these prevailing winds, we gathered insights from 150+ IT and cloud leaders at the recent Google Cloud Next conference, which we've compiled into the 2025 State of Cloud Data Backup ...

Private clouds are no longer playing catch-up, and public clouds are no longer the default as organizations recalibrate their cloud strategies, according to the Private Cloud Outlook 2025 report from Broadcom. More than half (53%) of survey respondents say private cloud is their top priority for deploying new workloads over the next three years, while 69% are considering workload repatriation from public to private cloud, with one-third having already done so ...

As organizations chase productivity gains from generative AI, teams are overwhelmingly focused on improving delivery speed (45%) over enhancing software quality (13%), according to the Quality Transformation Report from Tricentis ...

Back in March of this year ... MongoDB's stock price took a serious tumble ... In my opinion, it reflects a deeper structural issue in enterprise software economics altogether — vendor lock-in ...