

Hewlett Packard Enterprise reveals powerful computer prototype

REUTERS
Published : May 17, 2017, 8:50 am IST
Updated : May 17, 2017, 8:50 am IST

The prototype underscores HPE's ambition to lead computer technology as huge datasets place new strains on devices.


Hewlett Packard Enterprise (HPE) on Tuesday unveiled a new computer prototype that it said could handle more data than any similar system in the world.

The Palo Alto, California-based company said the prototype contains 160 terabytes of memory, capable of managing the information from every book in the US Library of Congress five times over.
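The claim can be sanity-checked with rough arithmetic. The book count and per-book size below are illustrative assumptions chosen to show how such an estimate works, not figures from HPE or the Library of Congress:

```python
# Back-of-envelope check of the 160 TB claim.
# Assumptions (illustrative only): ~32 million digitized books
# at ~1 MB of plain text each represent one Library of Congress copy.
BOOKS = 32_000_000          # assumed number of books
BYTES_PER_BOOK = 1_000_000  # assumed ~1 MB of text per book

one_copy_tb = BOOKS * BYTES_PER_BOOK / 1e12      # terabytes for one copy
five_copies_tb = 5 * one_copy_tb                 # the "five times over" claim

print(f"One copy: {one_copy_tb:.0f} TB")         # -> One copy: 32 TB
print(f"Five copies: {five_copies_tb:.0f} TB")   # -> Five copies: 160 TB
```

Under these assumptions, five copies come to 160 TB, matching the prototype's stated memory capacity.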


It is the latest prototype from "The Machine" research project by HPE, which aims to create super-fast computers by designing them around memory. Traditionally, the way processors, storage and memory interact can bog down computers.


"We need a computer built for the Big Data era," HPE's Chief Executive Meg Whitman said in a news release.

While large data centers that piece together many computers may have enough calculating power, they cannot transfer data efficiently, Kirk Bresniker, chief architect at Hewlett Packard Labs, said in an interview. That means HPE's single-system model may one day compete with the infrastructure spearheaded by cloud-computing companies such as Amazon.com.


HPE expects the memory capacity of its model to grow over time. While the prototype remains years away from commercial availability, HPE is already bringing some of the technology from its research program to market.

Still, companies and the scientific community have yet to concur on what technology will best serve users.

"You need computing that scales up with the size of the dataset," said Kathy Yelick, a professor of electrical engineering and computer sciences at the University of California at Berkeley.

There's still discussion, she added, "about what the right answer is."

Tags: hewlett packard, prototype, data, memory