HPE Unveils Computer Built for the Era of Big Data
May 25, 2017

Hewlett Packard Enterprise (HPE) introduced the world’s largest single-memory computer, the latest milestone in The Machine research project.

The Machine, which is the largest R&D program in the history of the company, is aimed at delivering a new paradigm called Memory-Driven Computing – an architecture custom-built for the Big Data era.

“The secrets to the next great scientific breakthrough, industry-changing innovation, or life-altering technology hide in plain sight behind the mountains of data we create every day,” said Meg Whitman, CEO of Hewlett Packard Enterprise. “To realize this promise, we can’t rely on the technologies of the past; we need a computer built for the Big Data era.”

The prototype unveiled today contains 160 terabytes (TB) of memory, capable of simultaneously working with the data held in every book in the Library of Congress five times over – or approximately 160 million books. It has never been possible to hold and manipulate whole data sets of this size in a single-memory system, and this is just a glimpse of the immense potential of Memory-Driven Computing.
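The book comparison is easy to verify. Here is a minimal back-of-envelope sketch, assuming decimal units (1 TB = 10^12 bytes); the roughly 1 MB per book and ~32-million-book Library of Congress figures are inferred from the release's numbers, not stated by HPE:

```python
# Back-of-envelope check of the press release's book arithmetic.
# Assumptions (not from HPE): decimal terabytes (10**12 bytes).

memory_bytes = 160 * 10**12          # 160 TB prototype memory
books_total  = 160_000_000           # "approximately 160 million books"

bytes_per_book = memory_bytes / books_total
print(f"Implied size per book: {bytes_per_book / 10**6:.0f} MB")      # -> 1 MB

loc_books = books_total / 5          # "five times over" -> one Library of Congress copy
print(f"Implied Library of Congress size: {loc_books / 10**6:.0f} million books")  # -> 32
```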

Based on the current prototype, HPE expects the architecture could easily scale to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of memory – 4,096 yottabytes. For context, that is 250,000 times the entire digital universe today.
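The 250,000x figure is consistent with contemporary industry estimates. A quick check, assuming the "digital universe" is about 16 zettabytes (IDC's widely cited estimate for data created in 2016; that figure is an assumption, not part of HPE's announcement):

```python
# Sanity check of the scaling claim.
# Assumption (not from HPE): digital universe ~ 16 zettabytes.

pool_bytes     = 4096 * 10**24   # 4,096 yottabytes
universe_bytes = 16 * 10**21     # ~16 zettabytes

print(f"Ratio: {pool_bytes / universe_bytes:,.0f}x")   # -> 256,000x, ~250,000x as stated
```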

With that amount of memory, it will be possible to simultaneously work with every digital health record of every person on earth; every piece of data from Facebook; every trip of Google’s autonomous vehicles; and every data set from space exploration – getting to answers and uncovering new opportunities at unprecedented speeds.

“We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” said Mark Potter, CTO at HPE and Director, Hewlett Packard Labs. “The architecture we have unveiled can be applied to every computing category – from intelligent edge devices to supercomputers.”

Memory-Driven Computing puts memory, not the processor, at the center of the computing architecture. By eliminating the inefficiencies of how memory, storage and processors interact in traditional systems today, Memory-Driven Computing reduces the time needed to process complex problems from days to hours, hours to minutes, minutes to seconds – to deliver real-time intelligence.
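HPE has not published a programming model as part of this announcement, so the following is only a loose analogy on conventional hardware, not The Machine's actual API: several processes compute directly against a single resident data set in shared memory, rather than each copying its own slice up from storage. Every name and structure here is hypothetical illustration:

```python
# Illustrative sketch of memory-centric processing on ordinary hardware:
# four worker processes attach to ONE shared in-memory data set and
# compute over their slices without making per-worker copies.

from multiprocessing import Process, shared_memory
import numpy as np

def worker(shm_name, n, lo, hi, result_name, idx):
    # Attach to the existing shared blocks -- no data is copied.
    shm = shared_memory.SharedMemory(name=shm_name)
    res = shared_memory.SharedMemory(name=result_name)
    data = np.ndarray((n,), dtype=np.float64, buffer=shm.buf)
    out = np.ndarray((4,), dtype=np.float64, buffer=res.buf)
    out[idx] = data[lo:hi].sum()     # compute in place over this worker's slice
    del data, out                    # release views before closing the blocks
    shm.close(); res.close()

if __name__ == "__main__":
    n = 10_000_000
    shm = shared_memory.SharedMemory(create=True, size=n * 8)
    res = shared_memory.SharedMemory(create=True, size=4 * 8)
    data = np.ndarray((n,), dtype=np.float64, buffer=shm.buf)
    data[:] = 1.0                    # stand-in for a large memory-resident data set

    procs = [Process(target=worker,
                     args=(shm.name, n, i * n // 4, (i + 1) * n // 4, res.name, i))
             for i in range(4)]
    for p in procs: p.start()
    for p in procs: p.join()

    partial = np.ndarray((4,), dtype=np.float64, buffer=res.buf)
    print("total:", partial.sum())   # all workers saw the same single copy

    del data, partial
    shm.close(); shm.unlink()
    res.close(); res.unlink()
```

The point of the sketch is the absence of per-worker copies: no partitioning data onto disks, serializing it, and reloading it into each process. Memory-Driven Computing generalizes this by making one large memory pool directly addressable by many processors at once.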
