Often in Enterprise IT, the real benefits become visible only when the hype dies down. So it is with In-Memory Databases and Big Data, two technology concepts whose times, it seems, have come. Now, with careful planning and good advice, the buzzwords are turning into competitive advantages.
We are indeed beginning to see true enterprise-wide deployments of SAP, Oracle and EMC’s new In-Memory offerings. All of these products were announced quite a while ago, but the temptation to keep robust SQL-based production systems hard at work is difficult to resist. No enterprise IT manager wants to be the one who tries to replace a trusted production system with unproven technology.
There are two primary drivers which, when they come together, typically prompt a move to In-Memory. The first is the need for speed. When data needs to be analyzed more rapidly than ever, databases accessing traditional storage are too slow.
The second is the phenomenal growth in data volumes. If only one of these is a concern, throwing more cheap storage at the issue can be a temporary fix. When both size and speed matter, the answer may well be In-Memory.
What we are not yet seeing are wholesale migrations away from traditional transactional databases to In-Memory technology — although that may happen in the future. Enterprise IT teams know more than anyone that running several infrastructures in parallel is painful.
Here are five points we recommend considering when moving to In-Memory:
1. Migrate only what you need
Most organizations are not pushing everything to In-Memory due to costs and complexity. Moving only what merits the added horsepower of In-Memory makes sense. This will very much depend on your business processes. The answer also varies by sector, e.g., a high-volume e-tailer versus a manufacturer.
2. Do a cost-benefit analysis
If some applications work well in SQL, why tinker? A valid set of metrics lets you justify both the move to In-Memory and the right performance management solution for your organization.
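The right metrics will depend on your business, but the arithmetic behind such a justification can be sketched simply. The Python snippet below uses entirely hypothetical figures (query volume, seconds saved per query, the dollar value of a second, migration and infrastructure costs) to estimate a payback period; substitute your own numbers.

```python
# Back-of-envelope cost-benefit sketch for an In-Memory migration.
# All figures are hypothetical placeholders, not vendor pricing.

def monthly_benefit(queries_per_month: int,
                    seconds_saved_per_query: float,
                    value_per_second: float) -> float:
    """Estimated dollar value per month of faster analytics."""
    return queries_per_month * seconds_saved_per_query * value_per_second

def payback_months(migration_cost: float,
                   added_infra_cost_per_month: float,
                   benefit_per_month: float) -> float:
    """Months until the migration pays for itself (inf if it never does)."""
    net = benefit_per_month - added_infra_cost_per_month
    return migration_cost / net if net > 0 else float("inf")

benefit = monthly_benefit(queries_per_month=500_000,
                          seconds_saved_per_query=4.0,
                          value_per_second=0.01)
months = payback_months(migration_cost=250_000,
                        added_infra_cost_per_month=12_000,
                        benefit_per_month=benefit)
print(f"Monthly benefit: ${benefit:,.0f}; payback in {months:.1f} months")
```

If the net monthly gain is negative, the model returns an infinite payback period, which is itself a useful answer: that workload probably does not merit the added horsepower of In-Memory.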
3. Understand the specifics
You need to know what you are doing. Using new technologies like In-Memory to speed up processing can simply expose issues and bottlenecks elsewhere in your system. Like every layer in the existing application stack, from storage to web front ends, In-Memory databases have their own specific performance issues. Understanding these issues across the architecture speeds up resolution.
4. Avoid point products
Incorporating smart monitoring of newer components like In-Memory is as important as keeping proven components highly available. In fact, without a system-wide overview, operations teams are likely to be hopping across IT management technologies and system layers to account for new and old. Many of today's In-Memory monitoring solutions claim to be all-encompassing but need to be patched into enterprise systems; avoid them if possible. Ideally, the perspective should be that of the user, and to get this, your chosen solution must tie business implications to user experience.
5. Keep it in context
If monitoring IT systems end-to-end is a win, monitoring the effects of IT performance in a business context is a bigger one. With so many In-Memory use cases being business-driven, In-Memory systems are valued more when they are monitored in ways the business cares about. Relating performance to business transactions, and being able to predict the business outcomes of In-Memory performance issues, is the best way to keep your business colleagues happy.
If you bear all this in mind, In-Memory stands a great chance of delivering what it promises. But if you expect this new generation of database technology to have fewer issues than the generations that preceded it, you will be ignoring the lessons of history. That is why Precise is looking to combine established best practices with smart ways to exploit the potential of new technologies.
About Lawrence Baisch
Lawrence Baisch is Vice President, Customer Solutions at Precise Software.