
Simplifying Enterprise Software Usage – The Key to Reducing Waste and Increasing Efficiency

Tim Flower

Software and apps exist in the workplace to enable employees to get their jobs done, and they should inherently make their day-to-day easier. However, there is a fine line between a helpful application stack and so much software that it feels burdensome to employees, and finding that line falls to IT leaders. Reducing IT budgets is a priority for most teams as fears of a recession continue, and a simple yet vastly overlooked way to reduce spend and redundancy is becoming more vigilant about actual software license usage.


Recent research suggests that many organizations are paying for more software than they need. If organizations are looking to reduce IT spend, leaders should take a closer look at the tools being offered to employees, as not all software is essential.

Half of Purchased Software Goes Unused by Employees

50% of software licenses go unused by employees. This is a shocking realization for the leaders who pay for these tools to improve efficiency and overall quality of work.

Generally, the most unused licenses fall under the Business Intelligence and Integrated Development Environment categories, which include apps like Tableau and PyCharm. These platforms are often seen as redundant with tools from juggernaut providers like Microsoft or Slack, or they serve such a niche purpose that they rarely see the use that justifies the cost.

How did this happen, and why isn't employee application usage more obvious to IT teams?

A staggeringly low 5% of IT leaders claim to have "complete visibility" into the total number of software licenses being used by their employees. This lack of visibility from leadership has created an environment of increased waste and of confusion among employees about which tools they should be using and when, and it prevents IT teams from asking the right questions to understand what is going on. For example, 37% of employees use three browser applications to access their SaaS tools and the internet.

What benefit does this insight offer IT Pros? It allows them to think more about why — are there certain browsers that work best with specific SaaS tools?

Should IT teams be recommending one browser for all employees?

Do employees even know why they have all these browsers together?

These are all important questions that IT teams wouldn't think to ask without software insight, and answering them can drive long-term value and workflow efficiency. Job number one of a software delivery team should be provisioning, and job number two should be reclamation.

Staying Vigilant is Still Key to Preventing Software Waste

As vigilance is the linchpin to reducing software waste, there are several steps IT teams must take to ensure they are seeing everything they should. Step one is conducting audits of software usage, ideally before negotiating agreements with vendors. Leaders will be able to find key usage attributes for the applications deployed in their enterprise, primarily the licenses that go unused, those that are used very little, and those used regularly.
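The audit step above can be sketched in a few lines of code. This is a hypothetical illustration, not a real audit tool: the record shape (app, user, launches over the last 90 days) and the five-launch threshold for "lightly used" are assumptions chosen for the example.

```python
from collections import defaultdict

def classify_licenses(records, low_threshold=5):
    """Bucket each (app, user) license as unused, low, or regular
    based on how many times it was launched in the audit window."""
    buckets = defaultdict(list)
    for app, user, launches in records:
        if launches == 0:
            buckets["unused"].append((app, user))
        elif launches < low_threshold:
            buckets["low"].append((app, user))
        else:
            buckets["regular"].append((app, user))
    return buckets

# Illustrative audit export: (app, user, launches in the last 90 days)
audit = [
    ("Tableau", "alice", 0),
    ("Tableau", "bob", 12),
    ("PyCharm", "carol", 2),
]
result = classify_licenses(audit)
print(len(result["unused"]), len(result["low"]), len(result["regular"]))  # prints "1 1 1"
```

Walking into a renewal negotiation with even this coarse three-way split tells IT which line items to cut, which to downgrade, and which to keep.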

While vendors attempt to sell a full suite of tools, it is important for IT to enter into negotiations with the knowledge of which licenses are or aren't bringing value to ensure they only pay for what is truly needed. The only way to do so is by asking the right questions.

While audits show IT leaders what they are missing, they usually focus only on provisioned versus purchased licenses to find the gap. Routine internal analysis of usage allows teams to build on that knowledge and put it to use in their organizations. One-time audit data is helpful, but as software licenses are updated and employees join and leave the company, the apps that are used and providing value can change with user preferences, business needs, and new options that become available. IT leaders must foster open communication with the business and the employees they support to learn what works best for them; there is no one-size-fits-all approach to building a software mix.

Scaling software usage for an organization requires a level of persona building: creating profiles for specific job functions and departments to understand which employees need specific software. And those personas should be dynamic, changing with user and business habits and needs.
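One way a dynamic persona could be derived is directly from observed usage, so it updates as habits change. The sketch below is purely illustrative: the function name, the role/usage data structures, and the 50% "active share" cutoff are all assumptions, not a described method from the research.

```python
def build_personas(usage_by_user, roles, min_active_share=0.5):
    """Keep an app in a role's persona only when at least
    min_active_share of that role's users actively use it."""
    personas = {}
    for role, users in roles.items():
        app_counts = {}
        for user in users:
            for app in usage_by_user.get(user, set()):
                app_counts[app] = app_counts.get(app, 0) + 1
        personas[role] = {
            app for app, count in app_counts.items()
            if count / len(users) >= min_active_share
        }
    return personas

# Hypothetical observed usage for a three-person marketing team
roles = {"marketing": ["dan", "eve", "fay"]}
usage = {"dan": {"Trello", "Slack"}, "eve": {"Slack"}, "fay": {"Trello", "Slack"}}
personas = build_personas(usage, roles)
```

Rebuilding personas like this on a schedule, and sanity-checking the results in conversations with the teams involved, keeps the profiles dynamic rather than a one-time snapshot.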

For example, some on the marketing team may need Trello (one of the most underused applications in the recent survey) to stay on track with deliverables, while others in marketing may never need it. Relying purely on the numbers would suggest to an IT team that because Trello is used in marketing, everyone in marketing needs it. But by analyzing usage and talking with employees, they would realize that a specific segment desperately needs it while the remaining licenses can be reclaimed. Rather than paying for company-wide licenses or canceling licenses completely, the IT team can now scale purchasing to the areas of greatest need while still driving cost savings. By leveraging both qualitative and quantitative data from employees, leaders can build the right package of software for each team member while also reducing waste.

What Next?

While software waste should be a primary concern for IT leaders, it is also an opportunity for cost optimization. As a recession looms and the C-Suite weighs difficult decisions to reduce spending, IT teams can offer data on a key area of waste that can potentially save organizations millions. With increased vigilance and a more personalized touch, businesses can develop a full understanding of how their employees prefer to work, where there is room for improvement, and where costs can be further optimized.

Through a more hands-on approach, leaders can reduce both costs and confusion as they create a more streamlined and synergistic experience. However, the decision to act ultimately falls to IT leaders and C-Suite members to look within and find where they can reduce waste.
