HPE Unveils Computer Built for the Era of Big Data
May 25, 2017

Hewlett Packard Enterprise (HPE) introduced the world’s largest single-memory computer, the latest milestone in The Machine research project.

The Machine, which is the largest R&D program in the history of the company, is aimed at delivering a new paradigm called Memory-Driven Computing – an architecture custom-built for the Big Data era.

“The secrets to the next great scientific breakthrough, industry-changing innovation, or life-altering technology hide in plain sight behind the mountains of data we create every day,” said Meg Whitman, CEO of Hewlett Packard Enterprise. “To realize this promise, we can’t rely on the technologies of the past; we need a computer built for the Big Data era.”

The prototype unveiled today contains 160 terabytes (TB) of memory, capable of simultaneously working with the data held in every book in the Library of Congress five times over – or approximately 160 million books. It has never been possible to hold and manipulate whole data sets of this size in a single-memory system, and this is just a glimpse of the immense potential of Memory-Driven Computing.
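
As a rough sanity check of that comparison, the arithmetic works out to about one megabyte of text per book. The per-book figure below is an implication of the article's own numbers, not a figure published by HPE.

```python
# Back-of-the-envelope check of "160 TB vs. roughly 160 million books".
# The ~1 MB-per-book result is derived from the article's own numbers,
# not a figure stated by HPE.
TB = 10**12                       # 1 terabyte in bytes (decimal prefix)
prototype_memory = 160 * TB       # the 160 TB prototype

books = 160_000_000               # "approximately 160 million books"
bytes_per_book = prototype_memory / books

print(f"{bytes_per_book / 10**6:.0f} MB per book")   # -> 1 MB per book
```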

Based on the current prototype, HPE expects the architecture could easily scale to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of memory – 4,096 yottabytes. For context, that is 250,000 times the entire digital universe today.
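
The jump from the prototype to that figure spans many orders of magnitude. A quick unit-conversion sketch (using decimal SI prefixes, an assumption on my part since HPE does not state its convention) puts the numbers side by side:

```python
# Scale comparison for the figures in the announcement: the 160 TB prototype,
# an exabyte-scale system, and the 4,096-yottabyte memory pool.
# Decimal SI prefixes are assumed throughout.
TB, EB, ZB, YB = 10**12, 10**18, 10**21, 10**24

prototype = 160 * TB
exabyte_system = EB
memory_pool = 4096 * YB

print(f"Exabyte-scale system: {exabyte_system / prototype:,.0f}x the prototype")
print(f"4,096 YB pool: {memory_pool / EB:,.0f} exabytes")

# Working backwards from "250,000 times the entire digital universe"
# implies a digital universe of roughly 16 zettabytes.
print(f"Implied digital universe: {memory_pool / 250_000 / ZB:.1f} ZB")
```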

With that amount of memory, it will be possible to simultaneously work with every digital health record of every person on earth; every piece of data from Facebook; every trip of Google’s autonomous vehicles; and every data set from space exploration – getting to answers and uncovering new opportunities at unprecedented speeds.

“We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” said Mark Potter, CTO at HPE and Director, Hewlett Packard Labs. “The architecture we have unveiled can be applied to every computing category – from intelligent edge devices to supercomputers.”

Memory-Driven Computing puts memory, not the processor, at the center of the computing architecture. By eliminating the inefficiencies of how memory, storage and processors interact in traditional systems today, Memory-Driven Computing reduces the time needed to process complex problems from days to hours, hours to minutes, minutes to seconds – to deliver real-time intelligence.
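
The contrast being drawn is between shuttling data through a storage stack before a processor can work on it, and keeping the data resident in one large, byte-addressable pool of memory that compute can reach directly. As a loose analogy only – not HPE's implementation – the sketch below contrasts a conventional read-and-copy loop with memory-mapping a file so the code operates on it in place:

```python
# Illustrative analogy only: a storage-centric read/copy loop versus operating
# on data in place through a memory map on a conventional OS. This is not
# The Machine's fabric-attached memory, just a familiar stand-in for the idea.
import mmap

def checksum_via_read(path: str) -> int:
    """Storage-centric: copy the data into buffers before touching it."""
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(1 << 20):        # 1 MiB copied into user space per read
            total += sum(chunk)
    return total

def checksum_via_mmap(path: str) -> int:
    """Memory-centric analogy: map the file and index it like one big array."""
    with open(path, "rb") as f, mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        return sum(m[i] for i in range(len(m)))   # data is paged in on demand
```

Both functions compute the same checksum; the difference is that the second never issues explicit reads – the data simply appears in the process's address space.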
