Choosing the right IT management software can feel like looking for a needle in a haystack. There's a huge amount to choose from, every product seems to do the same thing, and every vendor claims theirs is fantastic.
But things aren't always what they seem. In a world that's changing faster than ever, virtualization and commodity hardware make it extremely difficult for your organization to choose the right tools. To point you in the right direction, I have set out 6 basic rules below. I hope they'll be useful to you.
1. Start from the beginning
Don't assume that the tools you've used in the past will still work.
Many well-established companies complain that parties such as Google and Facebook innovate much faster, suffer fewer faults and manage with fewer people at lower cost because they aren't weighed down by legacy. It's true that dragging legacy systems along costs time and money, but why should you be the one left carrying that burden? The same goes for IT management software. If your organization innovates at the application level, you have to innovate in your tooling as well. Don't assume that the vendors who were already around when you started still have the best solutions.
Challenge the dinosaurs.
2. Choose freemium, opt for self-installation and only test in production
A number of clear trends are emerging in IT management software:
■ It must be possible to try out the software free of charge, even if the free version is limited. Even with a restricted feature set you can form a clear impression of the product.
■ You have to be able to install the software yourself, without calling in a professional services organization. This is the best way to judge whether the tools are easy to use and manage, which is a crucial aspect. It also dramatically shortens the time to ROI and lowers the TCO.
■ And this is actually the most important point: make sure you test in production before buying. Nothing is worse than discovering that a tool works well in the acceptance environment but creates so much overhead in production that it's unusable. Testing in production saves a lot of money and frustration, and the overhead itself is measurable, as the sketch after this list shows.
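How might you quantify that overhead? A minimal sketch under stated assumptions: handle_request and the instrumented wrapper below are hypothetical stand-ins for a representative code path and the tool's agent. Compare a latency percentile with and without the agent in the loop.

```python
import time

def handle_request():
    """Hypothetical stand-in for a representative production code path."""
    sum(i * i for i in range(10_000))

def instrumented(handler):
    """Hypothetical stand-in for a monitoring agent wrapping the code path."""
    def wrapped():
        start = time.perf_counter()
        handler()
        _ = time.perf_counter() - start  # a real agent would record and ship this
    return wrapped

def p95_ms(handler, runs=2_000):
    """Return the 95th-percentile latency of `handler` in milliseconds."""
    timings = []
    for _ in range(runs):
        t0 = time.perf_counter()
        handler()
        timings.append((time.perf_counter() - t0) * 1000)
    timings.sort()
    return timings[int(runs * 0.95)]

baseline = p95_ms(handle_request)
with_agent = p95_ms(instrumented(handle_request))
print(f"p95 without agent: {baseline:.3f} ms")
print(f"p95 with agent:    {with_agent:.3f} ms")
print(f"overhead:          {(with_agent / baseline - 1) * 100:.1f}%")
```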
3. Be prepared for virtualization
Virtualization is an unstoppable trend in organizations, and your software has to keep pace. The implications are far-reaching: a lot of legacy software can't read the right counters, or simply can't cope with environments that are scaled up and down according to usage.
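To make that concrete, here is a minimal sketch, with invented sample data, of the difference between a tool that monitors a fixed host list and one that groups whatever instances exist right now by service. The instance IDs and numbers are hypothetical.

```python
from collections import defaultdict

# Invented samples from an elastic environment: instance IDs come and go.
samples = [
    {"service": "checkout", "instance": "i-0a1", "response_ms": 120},
    {"service": "checkout", "instance": "i-0b2", "response_ms": 95},
    {"service": "checkout", "instance": "i-9f3", "response_ms": 210},  # appeared after scale-out
    {"service": "search",   "instance": "i-7c4", "response_ms": 40},
]

# Legacy approach: a fixed host list silently drops instances it doesn't know.
KNOWN_HOSTS = {"i-0a1", "i-0b2"}
legacy_view = [s for s in samples if s["instance"] in KNOWN_HOSTS]
print(f"fixed host list sees {len(legacy_view)} of {len(samples)} samples")

# Virtualization-ready approach: group by service, whatever instances exist now.
by_service = defaultdict(list)
for s in samples:
    by_service[s["service"]].append(s["response_ms"])
for service, values in by_service.items():
    print(f"{service}: avg {sum(values) / len(values):.0f} ms across {len(values)} instances")
```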
4. Performance = latency or response time, not resource usage
The most important KPI in the toolset of today and tomorrow is performance, measured in terms of latency or response time, from the end user all the way to the back end.
Performance used to be measured in terms of resource usage, such as CPU usage. But those days are behind us. In a virtualized environment those figures are often inaccurate, and it's very difficult to determine what they actually tell you about the end user's experience. Probably nothing.
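As an illustration of the KPI itself, here is a minimal sketch that measures response time from the client's side, the way the end user experiences it. The URL is a placeholder, and a real tool would of course break the round trip down across the whole chain.

```python
import time
import urllib.request

URL = "https://example.com/"  # placeholder; point this at your own application

def response_time_ms(url):
    """Time the full round trip the way the end user experiences it."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()  # include transfer time, not just time to first byte
    return (time.perf_counter() - start) * 1000

timings = [response_time_ms(URL) for _ in range(5)]
print(f"best: {min(timings):.0f} ms, worst: {max(timings):.0f} ms")
```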
5. Be sure to have 100% coverage, not 95%
The 80/20 rule doesn't apply here. The right tool has to cover the entire application landscape. It's important to map out every aspect of the chain, both horizontally and vertically. That doesn't mean that you have to measure everything all the time, but you do need to have access to the right information at the right times.
6. Data must be real time, predictable and complete
Fortunately most legacy tools are real time and complete, but by no means all of them are predictable.
"Real time" speaks for itself. Nothing is achieved if the required data isn't available until hours after the incident. Things move so fast these days that it only takes an hour before the whole country knows you've got a problem, which could harm your image.
"Complete" follows on seamlessly from this. The tool is not up to the job if it takes extra actions to get the information you need. Integrations between several tools are crucial in the software society. Correlating from several sources is vital to everyone's ability to make the right decisions.
"Predictable" is perhaps the most interesting aspect. It takes a lot of work to set up signals to alert you to incidents as soon as possible, and this is often based on settings that were agreed years ago, but who's to say that this is realistic? Who knows what constitutes normal behavior in a virtualized environment? Nobody, which is why it's of paramount importance that the tool you choose learns for itself what normal behavior is. That's how you optimize the ability to predict. Of course, this will have to be constantly adapted, since what was normal last week won't necessary be normal today.
Coen Meerbeek is an Online Performance Consultant at Blue Factory Internet.