June 2012

Quantum Computer Leap

 Australian National University (05/18/12) Sarina Talip

 Disturbance has been the main technical difficulty in building a quantum computer, but new research from the Australian National University (ANU) suggests that noise could be the key to making a quantum computer operate accurately.  Building a quantum computer means working at atomic scales with microscopic systems that are extremely sensitive to noise, says ANU’s Andre Carvalho.  Carvalho and collaborators from Brazil and Spain propose adding even more noise to the system.  “We found that with the additional noise you can actually perform all the steps of the computation, provided that you measure the system, keep a close eye on it, and intervene,” Carvalho says.  He notes that the outcomes of the measurements cannot be controlled; instead, they are totally random.  As a result, passively waiting for the right outcomes would take an infinite amount of time to complete even a very simple computation.  “By choosing smart ways to detect the random events, we can drive the system to implement any desired computation in the system in a finite time,” Carvalho says.


Spin Spirals for Computers of the Future

  Jülich Research Center (05/07/12) Angela Wenzik

 Researchers from Jülich, Hamburg, and Kiel have demonstrated how magnetic moments in chains of iron atoms could allow information to be transported at the nanoscale in a fast and energy-efficient manner over a wide range of temperatures, while remaining mostly unaffected by external magnetic fields.  “To the best of our knowledge, it is a completely new concept for data transport on this scale,” says Jülich Research Center professor Stefan Blügel.  “Because the system is extremely stable and allows information to be transferred in a fast and energy-efficient manner, we believe it is an extremely promising option for future applications.”  The researchers call the spiral arrangement of the magnetic properties in chains of iron atoms “spin spirals”; for the experiments, the chains were placed in twin rows on an iridium surface.  “What is particularly interesting is the fact that the spin of the atomic screw, which we refer to as chirality in the jargon, is very stable, even at relatively warm temperatures,” Blügel says.  The researchers now plan to study whether the system is stable at higher temperatures, up to and including room temperature.


Project Moon: One Small Step for a PC, One Giant Leap for Data

  Wired News (05/08/12) Robert McMillan

 Virginia Tech researchers launched the MapReduce On Opportunistic Environments (Moon) project five years ago with the goal of turning the university’s Math Emporium, which contains 550 Apple computers, into a type of supercomputer based on the same technology that Google developed to power its search engine.  The Project Moon researchers’ paper on the system was recently named one of the most important distributed supercomputing papers of the past 20 years.  “We’re going through technology transfer and trying to figure out how much more we might need to do to package it if people want to license it or to spin a company off of it,” says Virginia Tech researcher Wu-chun Feng.  Project Moon is based on Hadoop, the open source version of Google’s MapReduce platform, and it is one of many efforts to apply the platform to more than just Web services.  The Project Moon researchers used Hadoop to turn each Apple computer into a node on a supercomputer, with each machine helping to solve complex data-analysis problems.  In theory, the 550 Apple computers in the Math Emporium could be transformed into a supercomputer capable of performing 6.6 trillion mathematical operations per second.
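The programming model Hadoop provides can be illustrated with a toy, single-machine word count; this is a minimal sketch of the map/shuffle/reduce pattern only, and the function names here are illustrative, not taken from Project Moon’s or Hadoop’s actual code:

```python
from collections import defaultdict

def map_phase(records, mapper):
    """Run the mapper over every input record, collecting (key, value) pairs."""
    pairs = []
    for record in records:
        pairs.extend(mapper(record))
    return pairs

def shuffle(pairs):
    """Group intermediate values by key, as Hadoop does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    """Apply the reducer to each key's list of values."""
    return {key: reducer(key, values) for key, values in groups.items()}

# Word count: the canonical MapReduce example.
def word_mapper(line):
    return [(word, 1) for word in line.split()]

def sum_reducer(word, counts):
    return sum(counts)

lines = ["the quick brown fox", "the lazy dog", "the fox"]
counts = reduce_phase(shuffle(map_phase(lines, word_mapper)), sum_reducer)
```

In a real Hadoop deployment the map and reduce calls are distributed across many nodes and the shuffle moves data over the network; the appeal for a lab like the Math Emporium is that idle desktop machines can each run a share of the map and reduce tasks.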


Bringing Open, User-Centric Cloud Infrastructure to Research Communities

CORDIS News (05/04/12)

 European researchers working on the VENUS-C project have developed an open, scalable, and user-centered cloud computing infrastructure, highlighting an attempt to implement a user-centric approach to the cloud. Cloud computing empowers researchers “in a number of different ways, enabling them not only to do better science by accelerating discovery but also new science they could not have done before,” says VENUS-C project director Andrea Manieri. The new infrastructure integrates easily with users’ working environments and provides on-demand access to cloud resources as and when needed. “Our approach to the interoperability layer tackles current challenges with our users firmly in mind,” Manieri says. The researchers used the VENUS-C infrastructure on Microsoft’s Windows Azure platform to run BLAST, a data-intensive tool used by biologists to find regions of local similarity in the amino-acid sequences of different proteins. With the VENUS-C infrastructure, the experiment cost less than 600 euros and took just a week to process data that would normally have taken more than a year. “The advantage of using VENUS-C BLAST compared with renting cloud resources and deploying high-performance computing or high-throughput versions of BLAST is that deployment efforts are minimized and client impact is also minimal, since users don’t have to log in on a different machine,” says VENUS-C’s Ignacio Blanquer.
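BLAST itself is a fast heuristic, but the local-similarity problem it accelerates is the one solved exactly by the classic Smith-Waterman dynamic program. A minimal, scoring-only sketch of that underlying idea, assuming simple match/mismatch/gap costs rather than a real amino-acid substitution matrix such as BLOSUM62:

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Return the score of the best local alignment between sequences a and b.

    Cells are floored at zero, which is what makes the alignment *local*:
    a bad prefix never drags down a good region later in the sequences.
    """
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(0,                      # restart the alignment here
                              diag,                   # align a[i-1] with b[j-1]
                              score[i - 1][j] + gap,  # gap in b
                              score[i][j - 1] + gap)  # gap in a
            best = max(best, score[i][j])
    return best
```

This exact algorithm is quadratic in the sequence lengths, which is why searching a whole protein database with it is so data-intensive; BLAST trades some sensitivity for speed, and cloud platforms like VENUS-C attack the same cost by spreading the search across many rented machines.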