Massachusetts Offers a New Model for Academic HPC.

HPC Wire (05/29/12) Robert Gelber

Several universities in Massachusetts will share high-performance computing (HPC) resources in a unique facility model by the end of the year. The Massachusetts Green High Performance Computing Center (MGHPCC) will house terascale hardware along with the power, network, and cooling infrastructure needed for its users to access computing resources remotely. University members include the Massachusetts Institute of Technology, Harvard University, Boston University, Northeastern University, and the University of Massachusetts system, all of which must develop new implementation strategies: each will provide its own hardware and migrate its research to the $95 million center. MGHPCC executive director John Goodhue says the challenge will be to make the shared physical hardware act as a set of private local machines for the various users, something he says high-bandwidth network pipes and machine virtualization should make possible. If it proves a success, the center could serve as a model for future HPC collaboration, but Rensselaer Polytechnic Institute professor Francine Berman cautions that "social engineering of the stakeholders is dramatically difficult." One potential tension is between MGHPCC's ability to generate groundbreaking research and papers and the less glamorous work of expanding its user base, which usually receives less attention.


21st Century Computer Architecture.

CCC Blog (05/29/12) Erwin Gianchandani

The Computing Community Consortium (CCC) has released "21st Century Computer Architecture," a white paper developed by members of the computer architecture research community and designed to guide strategic thinking while complementing and synthesizing other recent documents. The paper notes that information and communication technology (ICT) is transforming the world, including the fields of healthcare, education, science, commerce, government, defense, and entertainment, and that ICT innovation appears to be accelerating, with many compelling visions moving from science fiction toward reality. The combined effect of technology and architecture, it argues, has provided ICT innovators with exponential performance growth at near-constant cost; higher performance has both made more computationally demanding applications feasible and made less demanding applications easier to develop by enabling higher-level programming abstractions. The paper closes by describing the current inflection point in ICT and the opportunities the community has in the years ahead.


Quantum Computer Leap

Australian National University (05/18/12) Sarina Talip

Disturbance has been the main technical difficulty in building a quantum computer, but new research from the Australian National University (ANU) suggests that noise could be the key to making a quantum computer operate accurately. Building a quantum computer means working at atomic scales with microscopic systems that are extremely sensitive to noise, says ANU's Andre Carvalho. Carvalho and collaborators from Brazil and Spain propose adding even more noise to the system. "We found that with the additional noise you can actually perform all the steps of the computation, provided that you measure the system, keep a close eye on it, and intervene," Carvalho says. He notes that the outcomes of the measurements cannot be controlled; instead, they are totally random, so waiting passively for the right outcomes to turn up would take an infinite amount of time to extract even a very simple computation. "By choosing smart ways to detect the random events, we can drive the system to implement any desired computation in the system in a finite time," Carvalho says.
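Carvalho's contrast between waiting passively and intervening can be illustrated with a purely classical toy model (a sketch of the reasoning only, not the actual quantum protocol): a symmetric random walk eventually reaches any target, but its expected hitting time is infinite, while a walk whose settings are chosen adaptively at each step acquires a drift and arrives in finite expected time. A minimal Python sketch, where the 0.7 bias is an arbitrary assumed number:

    import random

    TARGET = 20  # toy stand-in for "computation finished"

    def walk(p_forward, max_steps=100_000):
        """Each step's outcome is random, mirroring the uncontrollable
        measurement outcomes; p_forward encodes how the chosen settings
        bias the step toward the target."""
        pos = steps = 0
        while pos < TARGET and steps < max_steps:
            pos += 1 if random.random() < p_forward else -1
            steps += 1
        return steps

    trials = 50
    # p = 0.5: passive waiting. The walk is recurrent, but its expected
    # hitting time is infinite, so runs routinely hit the step cap.
    # p = 0.7: adaptive intervention. Expected time is TARGET/(2p-1) = 50.
    for p in (0.5, 0.7):
        mean = sum(walk(p) for _ in range(trials)) / trials
        print(f"p={p}: mean steps ~{mean:.0f}")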


Spin Spirals for Computers of the Future

Jülich Research Center (05/07/12) Angela Wenzik

Researchers from Jülich, Hamburg, and Kiel have demonstrated how magnetic moments in chains of iron atoms could allow information to be transported at the nanoscale in a fast and energy-efficient manner over a wide range of temperatures, while remaining mostly unaffected by external magnetic fields. "To the best of our knowledge, it is a completely new concept for data transport on this scale," says Jülich Research Center professor Stefan Blügel. "Because the system is extremely stable and allows information to be transferred in a fast and energy-efficient manner, we believe it is an extremely promising option for future applications." The researchers call the spiral arrangement of the magnetic moments in the iron-atom chains "spin spirals"; for the experiments, the chains were placed in twin rows on an iridium surface. "What is particularly interesting is the fact that the spin of the atomic screw, which we refer to as chirality in the jargon, is very stable, even at relatively warm temperatures," Blügel says. The researchers now plan to study whether the system remains stable at higher temperatures, up to and including room temperature.
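For the picture behind the name, a common idealization of a spin spiral (a textbook form, not necessarily the exact model used in the study) rotates the magnetic moment of the n-th atom in the chain by a fixed angle relative to its neighbor, with the chirality given by the sense of rotation:

    \mathbf{S}_n = S\bigl(\cos(nqa)\,\hat{\mathbf{e}}_1 + \sin(nqa)\,\hat{\mathbf{e}}_2\bigr),
    \qquad \chi = \operatorname{sgn}(q) = \pm 1

Here a is the spacing between atoms, q is the spiral's wave vector, and χ is the chirality, the handedness of the "atomic screw." A bit of information rides on the sign of q, which is why the observed stability of the chirality against temperature and stray magnetic fields matters for data transport.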


Project Moon: One Small Step for a PC, One Giant Leap for Data

Wired News (05/08/12) Robert McMillan

Virginia Tech researchers launched the MapReduce On Opportunistic Environments (Moon) project five years ago with the goal of turning the university's Math Emporium, which contains 550 Apple computers, into a kind of supercomputer based on the same technology Google developed to power its search engine. The Project Moon researchers' paper on the system was recently named one of the most important distributed supercomputing papers of the past 20 years. "We're going through technology transfer and trying to figure out how much more we might need to do to package it if people want to license it or to spin a company off of it," says Virginia Tech researcher Wu-chun Feng. Project Moon is based on Hadoop, the open source version of Google's MapReduce platform, and it is one of many efforts to apply the platform to more than just Web services. The researchers used Hadoop to turn each Apple computer into a node of a supercomputer, with each machine helping to solve complex data-analysis problems. In theory, the 550 Apple computers in the Math Emporium could be transformed into a supercomputer capable of performing 6.6 trillion mathematical operations per second.
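As an aside on the arithmetic, 6.6 trillion operations per second across 550 machines works out to roughly 12 billion operations per second per node. As for the programming model, the sketch below is a generic single-process MapReduce word count in plain Python, not code from Project Moon: the point it illustrates is that map tasks are fully independent, so they can be scattered across hundreds of lab machines, with the shuffle-and-reduce step combining the keyed results afterward.

    from collections import defaultdict

    def map_phase(documents):
        # Map: each document is processed independently, which is what
        # lets a framework like Hadoop hand inputs to whichever nodes
        # happen to be available.
        for doc in documents:
            for word in doc.split():
                yield word, 1

    def reduce_phase(pairs):
        # Shuffle + reduce: pairs sharing a key are grouped and their
        # values combined (here, summed into word counts).
        counts = defaultdict(int)
        for word, n in pairs:
            counts[word] += n
        return dict(counts)

    docs = ["a rose is a rose", "to the moon and back"]
    print(reduce_phase(map_phase(docs)))  # {'a': 2, 'rose': 2, ...}

Hadoop normally assumes dedicated, always-on cluster nodes; as the project's name suggests, Moon's work was about making this model tolerate opportunistic nodes, such as lab machines that users can reclaim at any time.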


Bringing Open, User-Centric Cloud Infrastructure to Research Communities

CORDIS News (05/04/12)

European researchers working on the VENUS-C project have developed an open, scalable cloud computing infrastructure built around a user-centric approach. Cloud computing empowers researchers "in a number of different ways, enabling them not only to do better science by accelerating discovery but also new science they could not have done before," says VENUS-C project director Andrea Manieri. The new infrastructure integrates easily with users' working environments and provides on-demand access to cloud resources as and when needed. "Our approach to the interoperability layer tackles current challenges with our users firmly in mind," Manieri says. The researchers used the VENUS-C infrastructure on Microsoft's Windows Azure platform to run BLAST, a data-intensive tool biologists use to find regions of local similarity in the amino-acid sequences of different proteins. With VENUS-C, the experiment cost less than 600 euros and took just a week to process data that would normally have taken more than a year. "The advantage of using VENUS-C BLAST compared with renting cloud resources and deploying high-performance computing or high-throughput versions of BLAST is that deployment efforts are minimized and client impact is also minimal, since users don't have to log in on a different machine," says VENUS-C's Ignacio Blanquer.
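The article does not describe VENUS-C's actual job interfaces, but the pattern behind such a speedup can be sketched generically: BLAST queries are independent similarity searches, so the query set can be split into chunks that run in parallel on separate workers and whose outputs are merged afterward. In this hedged Python sketch, multiprocessing stands in for cloud workers, and run_blast_on_chunk is a hypothetical placeholder rather than a VENUS-C or NCBI API:

    from multiprocessing import Pool

    def run_blast_on_chunk(chunk):
        # Hypothetical worker: on VENUS-C this would be a cloud job
        # searching the protein database; here it just fakes one result
        # per query so the sketch stays self-contained.
        return [f"hits for {seq}" for seq in chunk]

    def scatter_gather(sequences, n_workers=8):
        # Scatter: one chunk of queries per worker. Each chunk is an
        # independent search, which is why BLAST parallelizes so well.
        chunks = [sequences[i::n_workers] for i in range(n_workers)]
        with Pool(n_workers) as pool:
            parts = pool.map(run_blast_on_chunk, chunks)
        # Gather: concatenate the per-worker outputs.
        return [hit for part in parts for hit in part]

    if __name__ == "__main__":
        queries = [f"query_{i}" for i in range(32)]
        print(len(scatter_gather(queries)), "results")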
