Choosing the right IT management software can feel like looking for a needle in a haystack. There's so much to choose from, it all seems to do the same thing, and every product claims to be fantastic.
But things aren't always what they seem. In a world that's changing faster than ever, virtualization and commodity hardware make it extremely difficult for your organization to choose the right tools. To point you in the right direction, I have set out six basic rules below. I hope you'll find them useful.
1. Start from the beginning
Don't assume that the tools you've used in the past will still work.
Many well-established companies complain that players such as Google and Facebook innovate much faster, suffer fewer faults, and manage with fewer people at lower cost because they're not weighed down by legacy. It's true that dragging legacy systems along costs time and money, but why should you be the one carrying that burden? The same goes for IT management software: if your organization innovates in its applications, it has to innovate here as well. Don't assume that the vendors who were already around when you started still have the best solutions.
Challenge the dinosaurs.
2. Choose freemium, opt for self-installation and only test in production
A number of clear trends are emerging in IT management software:
■ It must be possible to try out software free of charge, even if the free version has limited features. Even with a limited feature set you can gain a clear impression of the software.
■ You have to be able to install the software yourself, without calling in a professional services organization. This is the best way to judge whether the tools are easy to use and manage, and that is a crucial aspect. It also dramatically shortens the time to ROI and lowers the TCO.
■ And this is actually the most important point: make sure that you test in production before buying. Nothing is worse than discovering that the tools work well in the acceptance environment but create so much overhead in production that they are unusable. Testing in production saves a lot of money and frustration!
3. Be prepared for virtualization
Virtualization is an unstoppable trend in organizations, and your software has to keep pace. The implications are far-reaching: a lot of legacy software can't read the right performance counters, or simply can't cope with environments that are scaled up or down according to usage.
4. Performance = latency or response time, not resource usage
The most important KPI in the toolset of today and the future is performance, but measured in terms of latency or response time. This should be measured from the end-user to the back end.
Performance used to be measured in terms of resource usage, such as CPU utilization, but those days are behind us. In a virtualized environment such figures are often inaccurate, and it's very hard to determine what they say about the end-user experience. Probably nothing.
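To make this concrete, here is a minimal sketch in Python of measuring performance as response time rather than resource usage. The `operation` callable is a hypothetical stand-in for a real end-to-end request from the end-user to the back end; the percentile summary is what a modern tool would report.

```python
import time
import statistics

def measure_latency(operation, samples=50):
    """Time an end-to-end operation repeatedly and summarize latency in ms."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        operation()  # in practice: an HTTP request through the full chain
        timings.append((time.perf_counter() - start) * 1000.0)
    timings.sort()
    return {
        "p50_ms": statistics.median(timings),
        "p95_ms": timings[int(0.95 * (len(timings) - 1))],
        "max_ms": timings[-1],
    }

# Hypothetical example: time a stand-in for a back-end call.
stats = measure_latency(lambda: sum(range(10_000)))
print(stats)
```

Percentiles matter here: an average hides the slow outliers that individual end-users actually experience, which is why p95 (or p99) is usually the figure to watch.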
5. Be sure to have 100% coverage, not 95%
The 80/20 rule doesn't apply here. The right tool has to cover the entire application landscape. It's important to map out every aspect of the chain, both horizontally and vertically. That doesn't mean that you have to measure everything all the time, but you do need to have access to the right information at the right times.
6. Data must be real time, predictable and complete
Fortunately most legacy tools are real time and complete, but by no means all of them are predictable.
"Real time" speaks for itself. Nothing is achieved if the required data isn't available until hours after the incident. Things move so fast these days that it only takes an hour before the whole country knows you've got a problem, which could harm your image.
"Complete" follows on seamlessly from this. The tool is not up to the job if it takes extra actions to get the information you need. Integrations between several tools are crucial in the software society. Correlating from several sources is vital to everyone's ability to make the right decisions.
"Predictable" is perhaps the most interesting aspect. It takes a lot of work to set up signals to alert you to incidents as soon as possible, and this is often based on settings that were agreed years ago, but who's to say that this is realistic? Who knows what constitutes normal behavior in a virtualized environment? Nobody, which is why it's of paramount importance that the tool you choose learns for itself what normal behavior is. That's how you optimize the ability to predict. Of course, this will have to be constantly adapted, since what was normal last week won't necessary be normal today.
Coen Meerbeek is an Online Performance Consultant at Blue Factory Internet.