
A Better Way for Supercomputers to Search Large Data Sets

Government Computer News (09/09/13) Kevin McCaney 

Lawrence Berkeley National Laboratory researchers have developed techniques for analyzing huge data sets using "distributed merge trees," which take better advantage of a high-performance computer's massively parallel architecture. The algorithms scan a large data set, tag the values a researcher is looking for, and build a topological map of the data. They split the data into blocks and distribute the work across a supercomputer's thousands of nodes, according to Berkeley Lab's Gunther Weber, who notes that the algorithms can also separate important data from irrelevant data. Weber says the new technique will enable researchers to get more out of future supercomputers. "This is also an important step in making topological analysis available on massively parallel, distributed memory architectures," Weber notes.
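To illustrate the general idea (this is a hypothetical sketch, not the Berkeley Lab implementation): in a distributed setting, each node would build a merge tree for its own block of the scalar field, and the local trees would then be combined across nodes. The fragment below shows only the local step, building a join tree of a 1D scalar field with union-find; the function and variable names are illustrative assumptions.

```python
# Minimal sketch of a local (single-block) merge tree on a 1D scalar field.
# Illustrative only -- not the Berkeley Lab distributed merge tree code.

def merge_tree_1d(values):
    """Build a join tree: visit vertices from highest to lowest value,
    merging neighboring components with union-find, and record where
    components are born (maxima) and where they merge (saddles)."""
    n = len(values)
    parent = list(range(n))                  # union-find parent pointers

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]    # path halving
            i = parent[i]
        return i

    order = sorted(range(n), key=lambda i: values[i], reverse=True)
    seen = set()
    maxima, saddles = [], []

    for i in order:
        seen.add(i)
        roots = set()
        for j in (i - 1, i + 1):             # neighbors on a 1D grid
            if j in seen:
                roots.add(find(j))
        if not roots:
            maxima.append(i)                 # a new component is born here
        elif len(roots) > 1:
            saddles.append(i)                # two components merge here
        for r in roots:
            parent[r] = i                    # attach components to vertex i
    return maxima, saddles


if __name__ == "__main__":
    field = [1, 5, 2, 8, 3, 9, 0, 4]
    print(merge_tree_1d(field))              # indices of maxima and saddles
```

In a distributed version, each block would run this kind of local construction in parallel and only the boundary information would be exchanged when the per-block trees are merged, which is what lets the work spread across thousands of nodes.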

