December 2009

Give a Humanist a Supercomputer…

Chronicle of Higher Education (12/16/09) Howard, Jennifer

Humanities researchers involved in the National Endowment for the Humanities and the Department of Energy’s high-performance computing (HPC) competition provided updates on their “computationally intensive” humanities projects during a recent Coalition for Networked Information membership meeting in Washington, D.C. A Tufts University team is using supercomputing resources to mine classical texts in the enormous Perseus Digital Library. Meanwhile, David Koller, at the University of Virginia’s Institute for Advanced Technology in the Humanities, is using HPC resources to create digitized, three-dimensional models of cultural objects from museums and archaeological sites. Koller uses complex algorithms to convert photographs of the objects into high-resolution 3-D models, which offer views from all angles and detail down to the level of individual chisel marks. The humanities researchers used HPC resources at the U.S. Department of Energy’s National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory. Although HPC resources are becoming more available at universities, Koller believes there need to be more people who know how to help computers and humanities researchers talk to each other.


Scientists, IT Community Await Exascale Computers.

Computerworld (12/07/09) Thibodeau, Patrick

The information technology (IT) industry discussed the challenges it faces in developing exascale systems, a new generation of supercomputers that promise to be far more powerful than existing technology, during the recent SC09 supercomputing conference in Portland, Oregon. With a peak performance of 2.3 petaflops, Jaguar, a Cray XT5 system at Oak Ridge National Laboratory, is the fastest supercomputer today; an exaflop, however, is 1,000 petaflops. Exascale systems are expected to appear by 2018, but developers will have to improve hardware performance while limiting power consumption. Exascale supercomputers will need to use less memory per core and provide more memory bandwidth, and systems running 100 million cores will have to tolerate continuous core failures. The IT community will have to rethink such issues “in a dramatic kind of way,” says IBM’s Dave Turek. “There are serious exascale-class problems that just cannot be solved in any reasonable amount of time with the computers that we have today,” says Buddy Bland, project director at the Oak Ridge Leadership Computing Facility.
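As a back-of-the-envelope illustration of the scale gap described above, the figures quoted in the article (Jaguar’s 2.3-petaflop peak versus a one-exaflop target) can be compared directly; this is only a sketch of the arithmetic, not anything from the article itself:

```python
# Rough comparison of petascale vs. exascale performance,
# using the flop counts quoted in the article.
PETAFLOP = 1e15   # floating-point operations per second
EXAFLOP = 1e18

jaguar_peak = 2.3 * PETAFLOP   # Cray XT5 "Jaguar" peak performance

# An exaflop is 1,000 petaflops...
print(EXAFLOP / PETAFLOP)      # 1000.0
# ...so an exascale system would be roughly 435x Jaguar's peak.
print(EXAFLOP / jaguar_peak)   # ~434.8
```

The factor-of-435 gap, rather than the raw “1,000x a petaflop,” is the more telling measure of how far 2009-era hardware was from the exascale goal.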


Intel Makes Multi-Million Euro Investment to Create European Exascale Computing Research Center.

PARIS, Nov. 18, 2009

Intel will support the Exascale Computing Research Center with a multi-million-euro investment over a three-year period. The French Atomic Energy Commission (Commissariat à l’Énergie Atomique), the Versailles Saint-Quentin-en-Yvelines University (Université de Versailles Saint-Quentin-en-Yvelines) and the French National High-Performance Computing Agency (Grand Equipement National de Calcul Intensif) will jointly match Intel’s contribution. This is Intel’s first joint lab in Europe focused exclusively on high-performance computing. It will complement and extend Intel’s existing high-performance computing research programs, investments and initiatives, including the Intel Academic Community Program and the European Space Agency’s “Mapping the Globe from Space” project.


Cray Studies Exascale Computing in Europe.

EE Times (12/02/09) Merritt, Rick

Cray has announced the Exascale Research Initiative less than a month after Intel said it was setting up an exascale research center with European partners. Cray’s partnership involves three European institutions, including the University of Edinburgh and the Swiss National Supercomputing Center. The goal of the initiative is to build a supercomputer capable of performing an exaflop, a quintillion calculations per second, by the end of the decade. The research teams will collaborate with Cray’s European software partners. Cray has made an undisclosed investment in the University of Edinburgh’s new Exascale Technology Center, which is scheduled to be formally launched this month. The Swiss researchers are working with Cray as part of the HP2C program, which is studying future large-scale simulation applications.


Single-Atom Transistors Are the Smallest Yet.

PC World (12/04/09) Springmann, Alessondra

A team of researchers from Finland and Australia has built a functional transistor whose active region consists of a single atom. Quantum tunneling is used to move electrons between the single phosphorus atom and the leads of the transistor. The tiny device offers precise control of voltage changes on an electrode, suggesting that atom-scale transistors could lead to a new generation of nano-scale electronics, including computers and other devices. The team’s focus was not “to build the tiniest transistor for a classical computer, but a quantum bit, which would be the heart of a quantum computer that is being developed worldwide,” says Mikko Mottonen, one of the project’s researchers.