DOE Research Group Makes Case for Exascale.

HPC Wire (02/21/11) Tiffany Trader

The U.S. Department of Energy’s (DOE’s) Office of Advanced Scientific Computing Research recently published an article stating that although exascale computing has the potential to lead to scientific breakthroughs, the technology will be neither easy nor inexpensive to develop. Exascale computing could lead to precise long-range weather forecasting, new alternative fuels, and advances in disease research, according to the DOE paper. However, building an exascale system poses many obstacles, says Argonne National Laboratory’s Rick Stevens. An exascale system will require billions of cores, so researchers need an effective programming model that can exploit all of them in what will be an extremely parallel machine. An exascale system also would require more than a gigawatt of electricity, enough to need its own power plant. Stevens says researchers are looking to graphics processing units as a way to reduce energy requirements. He also notes that computer reliability issues will be magnified a thousandfold in an exascale system. All of these issues will require government funding to solve, so “complex and coordinated [research and development] efforts [are required] to bring down the cost of memory, networking, disks, and all of the other essential components of an exascale system,” Stevens says.
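
As a rough illustration only (not from the DOE article), the sketch below shows the kind of multi-level parallelism such a programming model must coordinate: MPI distributes work across nodes while OpenMP spreads each node’s share across its cores. An exascale machine would extend the same pattern with GPU offload and billions of cores. The problem size and build command are assumptions made for the example.

    /* Minimal hybrid MPI + OpenMP sketch: a distributed parallel sum.
       Build (typical toolchain, assumed): mpicc -fopenmp hybrid_sum.c -o hybrid_sum
       Run: mpirun -np 4 ./hybrid_sum */
    #include <mpi.h>
    #include <omp.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);

        int rank, nprocs;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

        /* Node-level parallelism: each MPI rank takes a slice of 0..N-1. */
        const long N = 1000000000L;
        long chunk = N / nprocs;
        long start = (long)rank * chunk;
        long end   = (rank == nprocs - 1) ? N : start + chunk;

        /* Core-level parallelism within the rank via OpenMP threads. */
        double local_sum = 0.0;
        #pragma omp parallel for reduction(+:local_sum)
        for (long i = start; i < end; i++)
            local_sum += (double)i;

        /* Combine the per-rank partial sums across the whole machine. */
        double global_sum = 0.0;
        MPI_Reduce(&local_sum, &global_sum, 1, MPI_DOUBLE, MPI_SUM, 0,
                   MPI_COMM_WORLD);

        if (rank == 0)
            printf("sum = %.0f\n", global_sum);

        MPI_Finalize();
        return 0;
    }

Exascale proposals layer further parallelism (GPU kernels, asynchronous tasking) beneath this two-level scheme, which is why Stevens singles out the programming model as a central challenge.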

Toward Computers That Fit on a Pen Tip: New Technologies Usher in the Millimeter-Scale Computing Era.

University of Michigan News Service (02/22/11) Nicole Casal Moore

University of Michigan researchers, led by professors Dennis Sylvester, David Blaauw, and David Wentzloff, recently presented papers at the International Solid-State Circuits Conference describing two millimeter-scale systems: a prototype implantable eye-pressure monitor for glaucoma patients and a compact radio that does not need to be tuned to find a signal and could be used to track pollution, monitor structural integrity, or perform surveillance. The research uses millimeter-scale technologies to create devices for ubiquitous computing environments. The glaucoma eye-pressure monitor is slightly larger than one cubic millimeter and contains an ultra-low-power microprocessor, a pressure sensor, memory, a thin-film battery, a solar cell, and a wireless radio transmitter that sends data to an external reading device. “This is the first true millimeter-scale complete computing system,” Sylvester says. Wentzloff and doctoral student Kuo-Ken Huang have developed a tiny radio with an on-chip antenna that can keep its own time and serve as its own reference, which enables the system to communicate precisely with other devices. “By designing a circuit to monitor the signal on the antenna and measure how close it is to the antenna’s natural resonance, we can lock the transmitted signal to the antenna’s resonant frequency,” Wentzloff says.
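
Wentzloff’s description amounts to a feedback loop that nudges the transmitted frequency toward whatever frequency maximizes the antenna’s response. The sketch below is a hypothetical software simulation of that idea, not the Michigan team’s circuit: the antenna is modeled as a simple resonance curve, and a hill-climbing controller adjusts the frequency until the measured amplitude peaks. The 915 MHz resonance, bandwidth, and step sizes are invented for illustration.

    /* Hypothetical simulation: lock a transmitter to an antenna's resonant
       frequency by hill-climbing on the measured amplitude. All resonance
       values below are illustrative, not taken from the paper. */
    #include <stdio.h>

    #define F_RESONANT 915.0e6   /* assumed antenna resonance, Hz */
    #define BANDWIDTH    5.0e6   /* assumed resonance width, Hz */

    /* Stand-in for the on-chip monitoring circuit: antenna amplitude at f. */
    static double measure_amplitude(double f) {
        double x = (f - F_RESONANT) / (BANDWIDTH / 2.0);
        return 1.0 / (1.0 + x * x);      /* peaks at 1.0 exactly on resonance */
    }

    int main(void) {
        double f = 900.0e6;              /* initial transmit frequency, Hz */
        double step = 1.0e6;             /* initial tuning step, Hz */

        for (int i = 0; i < 200 && step > 1.0e3; i++) {
            double here = measure_amplitude(f);
            if (measure_amplitude(f + step) > here)
                f += step;               /* amplitude improves: tune upward */
            else if (measure_amplitude(f - step) > here)
                f -= step;               /* amplitude improves: tune downward */
            else
                step /= 2.0;             /* near the peak: refine the step */
        }

        printf("locked at %.3f MHz (resonance %.3f MHz)\n",
               f / 1e6, F_RESONANT / 1e6);
        return 0;
    }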

Obama Sets $126M for Next-Gen Supercomputing.

Computerworld (02/17/11) Patrick Thibodeau

President Obama’s 2012 budget proposal calls for $126 million for the development of next-generation exascale supercomputers, with about $91 million going to the U.S. Department of Energy’s (DOE’s) Office of Science and $36 million going to the National Nuclear Security Administration. The funding is part of an overall DOE advanced computing request of $465 million for 2012, a 21 percent increase over the 2010 budget. Exascale systems will be roughly 1,000 times more powerful than today’s petascale supercomputers, and the White House’s funding for them reflects its aim of setting a predictable path forward for high-performance computing. An exascale system is expected by 2020, but that depends on the development of software that can use what may amount to 100 million cores. Meanwhile, DOE is building 10-petaflop systems. Modeling and simulation are the chief supercomputing applications, and larger systems allow models to run at higher resolution. Faster networking and the other technologies that must be developed to build exascale systems may eventually migrate to business-class servers.
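
For scale, the 1,000-fold figure is simply the metric-prefix arithmetic separating petascale from exascale, stated here as a check rather than taken from the article:

    $1\ \text{exaflops} = 10^{18}\ \text{flop/s} = 10^{3} \times 10^{15}\ \text{flop/s} = 1{,}000\ \text{petaflops}$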

Energy Aims to Retake Supercomputing Lead From China.

Government Computer News (02/11/11) Henry Kenyon

The U.S. Department of Energy’s (DOE’s) Argonne National Laboratory has commissioned the development of a supercomputer capable of executing 10 petaflops. IBM will build the machine, which will be based on a version of its latest Blue Gene supercomputer architecture. The supercomputer will be operational in 2012, and its performance will far exceed that of today’s most powerful supercomputer, China’s Tianhe-1A system, which achieves 2.57 petaflops on the Linpack benchmark. The system also is expected to be the most energy-efficient computer in the world, thanks to a combination of new microchip designs and highly efficient water cooling. The supercomputer, which will be housed at the Argonne Leadership Computing Facility, will be used to conduct a variety of modeling and simulation studies that current machines are unable to perform. By 2012, IBM also will be responsible for two other systems operating at 10 petaflops or higher: the 20-petaflop Sequoia for DOE’s Lawrence Livermore National Laboratory and the 10-petaflop Blue Waters system for the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign. The new class of supercomputers is expected to pave the way for exascale computers, machines 1,000 times faster than petascale systems, by the end of the decade.

UF Leads World in Reconfigurable Supercomputing.

University of Florida News (02/15/11) Ron Word

University of Florida researchers say the Novo-G is the world’s fastest reconfigurable supercomputer and that it can execute some key science applications faster than China’s Tianhe-1A system, which was ranked the world’s most powerful supercomputer on the Top500 list in November. Florida professor Alan George notes that the Top500 list scores systems on their performance on a few basic linear-algebra routines using 64-bit floating-point arithmetic. He says many important applications do not fit that mold, and software for most computers must conform to fixed-logic hardware structures that can limit computing speed and increase energy consumption. Reconfigurable systems, by contrast, have an architecture that adapts to match each application’s unique requirements, yielding higher speed and better energy efficiency through adaptive hardware customization. Novo-G employs 192 reconfigurable processors and “can rival the speed of the world’s largest supercomputers at a tiny fraction of their cost, size, power, and cooling,” according to the researchers, who say it is particularly well suited for applications in genome research, cancer diagnosis, plant science, and large data set analysis.

New Supercomputers Boost Imaging Grunt.

ZDNet Australia (02/08/11) Colin Ho

IBM recently announced that the Australian Synchrotron and Monash University have purchased two supercomputers, to be used in collaboration with the Commonwealth Scientific and Industrial Research Organisation (CSIRO) and the Victorian Government to create a near real-time, atomic-level imaging and visualization facility. The supercomputers will enable researchers to study objects at the atomic level, create three-dimensional images, and process the large amounts of data the program collects. The joint Monash-CSIRO program, called the Multi-modal Australian ScienceS Imaging and Visualisation Environment (MASSIVE) facility, will support research on topics ranging from biology to geology. “The unique nature of these facilities is the focus on imaging and visualization,” says Monash University’s Wojtek Goscinski. The combination of atomic-level detail and near real-time analysis makes MASSIVE an important step forward for scientific research, says Australian Synchrotron director Andrew Peele.

Supercomputer to Be Used for Agricultural Research.

Daily News & Analysis (India) (02/09/11) Arun Jayan

The Indian Council of Agricultural Research is building a national agricultural bioinformatics grid with assistance from the Centre for Development of Advanced Computing (C-DAC). The grid is designed to improve agricultural productivity and help address issues such as food security. “Now scientists have to wait for a production cycle to get over to analyze various issues like quality of seed, produce, and weather pattern,” says C-DAC’s Goldi Misra. However, high-performance computers could be used for such analysis instead, Misra says. The first phase of the project will focus on connecting government agencies with high-speed networks. Agricultural universities and research centers also could be added to the grid, enabling researchers to perform complex analyses. The grid will provide computational support for high-quality research in agriculture and biotechnology, says Indian Agricultural Statistics Research Institute researcher Anil Rai. “This will lead to the development of superior varieties [of] seeds, the right fertilizers, and will help various other processes to enhance agricultural productivity on sustainable basis,” he says.
