March 2010

Researchers Give Update on Road to Parallelism.

EE Times (03/19/10) Merritt, Rick

University of Illinois researchers have taken several small steps toward developing new parallel programming models to tap the many-core processors of the future. The DeNovo project attempts to define a new and more rigorous way of using shared memory. It is proceeding alongside a separate effort to define a deterministic parallel language, initially based on a parallel version of Java, with an eventual migration to a parallel version of C++. The chip project nearest to testing is the 1,024-core Rigel processor architecture, which targets high-density, high-throughput computing and would be programmed through a task-level application programming interface aimed at workloads in imaging, computer vision, physics, and simulation. The Bulk Architecture chip design is testing the notion of atomic transactions.
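The task-level programming style the Rigel summary describes can be sketched as follows. This is a hypothetical illustration, not Rigel's actual API: the programmer expresses work as independent tasks handed to a runtime, which maps them onto cores. Here Python's standard `concurrent.futures` pool stands in for that runtime, and `blur_tile` is an invented stand-in for an imaging kernel.

```python
# Hedged sketch of a task-level parallel API: enqueue independent tasks,
# let the runtime schedule them. Names are illustrative, not Rigel's.
from concurrent.futures import ThreadPoolExecutor

def blur_tile(tile):
    # Stand-in for an imaging kernel: replace each pixel with the tile average.
    avg = sum(tile) / len(tile)
    return [avg] * len(tile)

def run_tasks(tiles):
    # The pool plays the role of the task runtime; the programmer only
    # submits work and collects results, never manages threads directly.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(blur_tile, tiles))

if __name__ == "__main__":
    print(run_tasks([[1, 2, 3], [4, 5, 6]]))
```

The point of such an interface is that the same task graph can be scheduled onto 4 cores or 1,024 without changing the application code.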


For Quantum Computer, Add a Dash of Disorder.

Science News (03/11/10) Grossman, Lisa

Disorder can strengthen the coupling between light and matter in quantum systems, a finding that could eventually lead to fast quantum computers that are easy to construct, according to a study from researchers at the Technical University of Denmark. They have demonstrated that randomly arranged materials can capture light just as well as ordered ones. The researchers built a waveguide featuring holes drilled at random into a gallium arsenide crystal, and incorporated quantum dots as a substitute for atoms that could become entangled with photons. The quantum dots were induced to emit photons by hitting them with a laser, and the researchers discovered that 94 percent of the photons stayed close to their emitters, creating spots of trapped light in the crystal. The quantum dots also emitted photons 15 times faster after a light spot formed around them. If these light corrals can be entangled with each other, the system could one day support a quantum network in a randomly organized crystal.


Computational Feat Speeds Finding of Genes to Milliseconds Instead of Years.

Stanford University (03/15/10) Vaughn, Christopher

Stanford University computer scientist Debashis Sahoo and computer science professor David Dill recently completed a study of a program based on Boolean logic that can locate specific genes. Starting with two known B-cell genes, Sahoo searched databases containing thousands of gene products in milliseconds and found 62 genes whose expression patterns matched what he would expect for genes activated between the two genes he started with. He then examined databases covering 41 strains of laboratory mice known to be deficient in one or more of the 62 genes. Of those 41 strains, 26 had defects in B-cell development. “Biologists are really amazed that, with just a computer algorithm, in milliseconds I can find genes that it takes them a really long time to isolate in the lab,” Sahoo says. He is currently using the technique to try to find new genes that contribute to cancer development. “This shows that computational analysis of existing data can provide clues about where researchers should look next,” Sahoo says.
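The kind of Boolean search the summary describes can be sketched in a few lines. This is a toy illustration under stated assumptions, not Sahoo and Dill's actual method or code: expression values are binarized against a threshold, and a candidate gene counts as "intermediate" if the early marker gene being on implies the candidate is on, and the candidate being on implies the later marker gene is on. All gene names and data below are invented.

```python
# Hypothetical sketch of searching expression data with Boolean implication.
def binarize(values, threshold=0.5):
    # Collapse continuous expression levels to on/off calls.
    return [v > threshold for v in values]

def implies(x, y):
    # Boolean implication: in every sample where x is on, y is on too.
    return all((not a) or b for a, b in zip(x, y))

def intermediates(gene_a, gene_b, candidates):
    # Candidates whose "on" state sits between the two markers:
    # gene_a on => candidate on, and candidate on => gene_b on.
    a, b = binarize(gene_a), binarize(gene_b)
    return [name for name, vals in candidates.items()
            if implies(a, binarize(vals)) and implies(binarize(vals), b)]

if __name__ == "__main__":
    # Toy data: four samples per gene; gene_b is on in a superset of
    # the samples where gene_a is on.
    gene_a = [0.9, 0.1, 0.1, 0.1]
    gene_b = [0.9, 0.8, 0.9, 0.1]
    candidates = {"g1": [0.9, 0.7, 0.1, 0.1], "g2": [0.1, 0.1, 0.9, 0.9]}
    print(intermediates(gene_a, gene_b, candidates))  # ['g1']
```

Because each check is a linear scan over binarized vectors, a query like this runs over thousands of genes almost instantly, which is consistent with the millisecond search times the article reports.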


Quantum on Quantum.

Science News (02/27/10) Vol. 177, No. 5, P. 28; Petit, Charles

Researchers at Harvard University and Australia’s University of Queensland have designed and constructed a quantum computer capable of simulating and calculating the behavior of a molecular quantum system. The two photons that function as qubits in the device are entangled, meaning their states remain linked and consistent over distance, which augments the quantum computer’s ability to explore all possible solutions to a complex problem at once. The researchers tasked the computer with calculating the energy levels of the hydrogen molecule. By simulating the quantum forces inherent in the electrons of atomic bonds themselves, the computer’s photons determined the energy levels to within 6 parts per million. This milestone is “great, a proof of principle, more evidence that [a quantum computer] is not pie in the sky or cannot be built,” says University of California, Berkeley professor Birgitta Whaley.


What’s Next for High-Performance Computing?

UCSD News (02/24/10) Zverina, Jan

The fusion of high-performance computing (HPC) and high-performance data (HPD) could potentially yield robust systems at least one order of magnitude faster than anything the HPC community currently uses for certain applications, says San Diego Supercomputer Center (SDSC) interim director Michael Norman. Last November, SDSC announced plans to construct Gordon, a data-intensive supercomputer expected to handle latency-bound file reads with roughly 10 times the speed and efficiency of current HPC systems, with the help of flash-memory solid-state drives. Ultimately, Gordon will possess 245 teraflops of total compute power, 64 TB of digital random access memory, and 256 TB of flash memory. Gordon will also help integrate HPC and HPD because it is designed for data-intensive predictive science as well as data-mining applications.