
An Interview with Neebula Co-Founder - Part One

Pete Goldin
Editor and Publisher
APMdigest

In Part One of BSMdigest’s exclusive interview, Ariel Gordon, VP of Products and Co-Founder of Neebula and a BSM pioneer, talks about the concept, history, and evolution of Business Service Management.

BSM: To set the stage for this interview, first tell me about your history with BMC.

AG: I joined BMC through the acquisition of New Dimension in 1999. I was the CTO of New Dimension at the time, and so I became a member of BMC’s new CTO team. At the time, BMC was unable to sell at the CIO level, and we had funny ads in the papers: “We are BMC … We are the silent giant from Wall Street. You don’t know us, Mr. CIO, but we run your IT.” To help resolve this, the new CTO team proposed BSM as the strategy that would integrate our management solutions around a single theme and enable BMC to sell at the CIO level.

I was given the task of working with Business Development to design the solution. We came back with the architecture and roadmap for the needed solution. The solution was centered on an “Object Store,” which included a record of all the IT assets and a service model that mapped the infrastructure to the business services. The strategy also included the acquisition of an ITSM product suite and a BSM-aware event management product. We later renamed this “Object Store” the “Atrium CMDB.”

After BMC completed the first phase of the project, the second phase was to make all the BMC products integrate with the CMDB and with each other to create a BSM solution. By then I had become BMC’s CTO, and in that role I worked with key BMC architects and teams on the design, creation, and development of the Atrium integration technology within BMC. I left BMC at the end of 2007 to return to Israel.

BSM: What was the initial driver behind the creation of Business Service Management?

AG: At the end of the 90s it became clear that the concept of Frameworks was beginning to fail. Services started to span many servers (and applications), and silo management could not provide the availability and quality needed. It was clear that you needed to look at services as a whole and not only at the individual components.

As usual, customers, and in particular leading-edge banks and telcos, were on the front line and created their own solutions. I remember visiting four such banks to see the private solutions they had built. In addition, software companies started to take note, and a number of startups were trying to fill the void: Sysstar, Proxima, Managed Objects, Interlinks, and Appilog, to name a few. At that time it was simply called Service Management; the business context (the “B” in BSM) was added later. BMC was looking for a way to become a leading vendor in the eyes of the CIO, and riding this new trend looked like an opportunity that could not be missed.

BSM: I heard that BMC coined the term "BSM" – is this true?

AG: When we decided to adopt the term BSM inside BMC, we were looking for a catchy name for our strategy that we could turn into an industry trend, one that would replace and “wash away” the framework concept of our competitors. BMC decided not to register BSM as a trademark in the hope that many other organizations would adopt the same term. So even though I think BMC was the first to coin the term, I am afraid we will never know this for sure. One thing is clear: BMC’s decision to make BSM its strategy, not to register it as a trademark, and to push the concept with all its marketing dollars made BSM a major industry term and drove it into the mainstream of IT. This really made BSM a significant trend that was adopted by almost all the players in the industry.

BSM: Is there a big difference between BMC's initial BSM strategy and the strategy today?

AG: A lot of the concepts of the original BSM strategy are still in place, but even in 2007 BMC was more focused on the ITSM side of the solution than on the availability management side. And today our market and industry are very different from what they were in 2001. Back then, virtualization existed only on the IBM mainframe, and consolidation and cloud computing were not even a dream. BSM was designed in an era when data centers were static and was definitely not built to support these new dynamic environments. So BMC and other vendors like HP are adding new cloud management components to the strategy and are trying to strengthen their current BSM solutions to cope with this paradigm shift. However, this effort is huge, and there is still a big gap between what they provide and what customers need. New startups like Neebula are emerging to fill this void.

BSM: How have BSM tool capabilities evolved over the years?

AG: The BSM idea was to loosely integrate all of the IT management tools in order to enable the management of IT from the perspective of the services that IT provides to the business. The integration was centered on a CMDB and a service configuration model that describes the services. This model is then used by all the BSM-enabled products.

Initially, in 2001-2002, this was a very novel concept without any real technology behind it. Only in 2004-2005, with the release of BMC’s Atrium, HP’s BSM solution, and other vendors’ offerings, did BSM really take form, with integrated solutions that customers could use.

Today, the BSM solutions out there are much more integrated and mature, but building and maintaining the service configuration model is still an issue because it is so costly. That is one of the major reasons BSM implementations fail.
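
To make the service configuration model at the heart of this approach more concrete, here is a minimal sketch in Python of the kind of mapping such a model captures: business services tied to the configuration items they depend on, which every BSM-enabled tool can then share. All class, service, and component names here are hypothetical illustrations, not any vendor’s actual CMDB schema.

```python
# A minimal, hypothetical sketch of a service configuration model:
# business services mapped to the configuration items (CIs) they
# depend on. Not any vendor's actual CMDB schema.
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class ConfigurationItem:
    name: str                                   # e.g. "web-server-01"
    ci_type: str                                # e.g. "server", "database"
    depends_on: List["ConfigurationItem"] = field(default_factory=list)

@dataclass
class BusinessService:
    name: str                                   # e.g. "Online Orders"
    entry_points: List[ConfigurationItem] = field(default_factory=list)

def components_of(service: BusinessService) -> Set[str]:
    """Walk the dependency graph to list every CI the service relies on."""
    seen: Set[str] = set()
    stack = list(service.entry_points)
    while stack:
        ci = stack.pop()
        if ci.name not in seen:
            seen.add(ci.name)
            stack.extend(ci.depends_on)
    return seen

# Example model that monitoring, change, and incident tools could share.
db = ConfigurationItem("orders-db", "database")
app = ConfigurationItem("orders-app", "application", depends_on=[db])
web = ConfigurationItem("web-server-01", "server", depends_on=[app])
online_orders = BusinessService("Online Orders", entry_points=[web])

print(components_of(online_orders))
# {'web-server-01', 'orders-app', 'orders-db'} (set order may vary)
```

The data structure itself is simple; as Gordon notes, the costly part is discovering and maintaining these dependencies as the infrastructure changes.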

BSM: What is the biggest challenge with BSM today?

AG: The biggest issue is the changing paradigm of IT infrastructure: the move to virtualization, private clouds, and public clouds in all their variants, IaaS, PaaS, and SaaS. We are not yet seeing the full effect of these changes on IT shops. It will take years as organizations morph themselves to benefit from the new capabilities. And these changes will be radical: internal IT infrastructure, and the teams that manage it, will be eliminated.

The BSM concepts were designed 10 years ago and the solution – because of the modeling issue – is hard and lengthy to implement, slow to react to change, and by nature not agile. That is exactly the opposite of what is needed today, and so these tools are now becoming one of the shackles that are blocking the move to the new IT infrastructure. I am afraid that unless a new breed of BSM tools is created, BSM as we know it today will disappear.

BSM: What do you consider “true BSM”?

AG: A true BSM solution is any solution that allows organizations to manage their IT infrastructure and their applications from the perspective of the quality of service they provide to the business. At the heart of such a solution is a map, or model, of each service that allows the organization to understand all the components needed to provide that service. A true BSM solution has a set of tools that use this mapping to provide comprehensive management of the business services, including their performance and quality of service.
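
As a rough illustration of how such a service map can be used, the sketch below rolls component health up into a service-level status, the kind of business-service view described here. The services, components, and health states are hypothetical examples; real BSM tools would feed this from live monitoring and event data.

```python
# Hypothetical roll-up of component health into business-service status,
# given a service-to-components map like the one sketched earlier.
SERVICE_MAP = {
    "Online Orders": ["web-server-01", "orders-app", "orders-db"],
    "Payments": ["payments-gw", "orders-db"],
}

# In practice this would be fed by monitoring/event management tools.
COMPONENT_HEALTH = {
    "web-server-01": "ok",
    "orders-app": "ok",
    "orders-db": "degraded",
    "payments-gw": "down",
}

SEVERITY = {"ok": 0, "degraded": 1, "down": 2}

def service_status(service: str) -> str:
    """A service is only as healthy as its worst underlying component."""
    worst = max(SERVICE_MAP[service],
                key=lambda c: SEVERITY[COMPONENT_HEALTH[c]])
    return COMPONENT_HEALTH[worst]

for svc in SERVICE_MAP:
    print(f"{svc}: {service_status(svc)}")
# Online Orders: degraded
# Payments: down
```

The worst-component roll-up shown here is only one possible policy; the point is that management decisions are made against the service map rather than against isolated silos.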

Click here to read Part Two of the BSMdigest interview with Ariel Gordon, VP of Products and Co-Founder of Neebula.

