
NSF Issues Advanced Computing Infrastructure Plan

CCC Blog (02/23/12) Erwin Gianchandani

The U.S. National Science Foundation (NSF) recently released a vision and strategic plan for its Advanced Computing Infrastructure (ACI), designed to support NSF-funded communities and position them at the cutting edge of advanced computing technologies, hardware, and software. ACI is a key part of NSF’s Cyberinfrastructure Framework for 21st Century Science and Engineering. NSF “aims to promote a more complementary, comprehensive, and balanced portfolio of advanced computing infrastructure and programs for research and education to support multidisciplinary computational and data-enabled science and engineering that in turn support the entire scientific, engineering, and education community,” the vision report says. NSF says it will promote human capital development and education in computational and data-enabled science and engineering to benefit all fields of science and engineering. NSF aims to achieve its vision by supporting foundational and applications research that exploits parallelism and concurrency through innovations in computational technologies. NSF also wants to build, test, and deploy sustainable and innovative resources into a collaborative ecosystem. In addition, NSF wants to develop comprehensive education and workforce programs, as well as transformational and grand challenge community programs that support contemporary complex problem solving.

MORE


National Science Foundation Steps Up Its Push for Interdisciplinary Research

Chronicle of Higher Education (02/13/12) Paul Basken

The U.S. National Science Foundation (NSF) is dispatching Myron P. Gutmann, a top NSF official and University of Michigan professor, to college campuses to promote greater interdisciplinary research, which universities will need to pursue if they wish to win NSF grants. Gutmann notes that such research has yielded rapid advances in various fields, such as healthcare applications of atomic-scale science and the study of extreme weather events through analysis of both natural and social variables. NSF director Subra Suresh has prioritized the push for more interdisciplinary research since his arrival in October 2010. Emphasizing interdisciplinary research makes both financial and scientific sense, says Columbia University professor Mark C. Taylor, who notes that graduates are becoming too specialized to find employment because department-based hierarchies are unsustainable. Economic anxiety could aid NSF in its interdisciplinary efforts by making universities and their researchers particularly keen to comply with its mandate. Gutmann notes that NSF still believes in the importance of traditional disciplines, and says that in his department about 33 percent of research grants are interdisciplinary. “It doesn’t need to be 100 percent,” he says. “But it might want to be 60 percent.”

MORE

Minister Launches Next Generation of Supercomputers for UK Researchers

University of Edinburgh (02/13/12)

The University of Edinburgh’s Advanced Computing Facility recently launched the High-End Computing Terascale Resource (HECToR) and BlueGene/Q, two supercomputers that can deliver complex computer simulations in a range of scientific fields. The computers’ capacity and performance will help United Kingdom (U.K.) researchers study climate change, the fundamental structure of matter, fluctuations in ocean currents, the spread of epidemics, the design of new materials, the structure and evolution of the universe, and the development of new medicinal drugs. The supercomputers “will provide U.K. businesses and researchers with the technology they need to compete successfully on a global scale,” says Minister for Universities and Science David Willetts. “HECToR and BlueGene/Q will each play a significant role in facilitating ground-breaking research across many areas of science, with tremendous benefits for society,” says Edinburgh professor Sir Timothy O’Shea. Both the BlueGene/Q and HECToR facilities have about 800 teraflops of computational power. HECToR uses the latest Bulldozer multicore processor architecture, which offers twice the performance of the previous architecture. BlueGene/Q can perform the calculations of 100 laptops while consuming about as much electricity as a light bulb.

MORE

NSF Releases Report on Cloud Computing

CCC Blog (02/07/12) Erwin Gianchandani

The U.S. National Science Foundation (NSF) recently released a report on the organization’s support for cloud computing, describing it as a vital area of national importance that requires further research and development. The report highlights some of the 125 cloud computing research awards issued by NSF’s Computer and Information Science and Engineering (CISE) directorate between 2009 and 2011, in fields such as architecture, algorithms, big data, security and privacy, and green computing. “The CISE directorate is currently considering future directions for cloud computing research under the working title Science and Engineering of Cloud Computing, [which] is intended to address the important questions of how to design correct and efficient future cloud systems, rather than how to utilize existing cloud systems,” the report says. The research is a collaboration among several technical areas, including computer systems, networks, security, computer architecture and software, and databases and data-intensive computing. “NSF anticipates that the CISE directorate will continue its support for cloud computing in future years, primarily driven by the [Computer & Network Systems] Division, with participation by … other directorates or offices as appropriate,” the report says.

MORE

Researchers Claim 100-Fold Increase in Data Storage Speed

Network World (02/08/12) Tim Greene

Researchers at the universities of York and Nijmegen have developed a method for accelerating data storage a hundredfold. The researchers say that if successfully translated into a storage product, the technology could theoretically cut the time to store a bit of data on a hard disk drive from a billionth of a second to a hundred-billionth of a second. For example, the method could decrease the time it takes to store a terabyte of data from about two hours and 13 minutes to about a minute and 20 seconds. The researchers developed the technology by heating a magnetic material with laser bursts that alter the magnetic spin of the material at the atomic level. They note that current hard drive storage changes spin using a fluctuating magnetic field, which is slower. Their method uses X-ray Magnetic Circular Dichroism to examine thin films of an alloy of gadolinium, iron, and cobalt and study the ultrafast spin reversal.
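
As a rough sanity check of those figures, here is a minimal back-of-the-envelope calculation in Python (a sketch that assumes a terabyte is written serially, one bit at a time, which ignores the parallelism of real drives):

    # Rough sanity check of the quoted figures, assuming a terabyte is written
    # serially, one bit at a time (real drives write many bits in parallel).
    bits_per_terabyte = 8e12             # 1 TB = 8 x 10^12 bits

    conventional_bit_time = 1e-9         # a billionth of a second per bit
    laser_assisted_bit_time = 1e-11      # a hundred-billionth of a second per bit

    conventional_total = bits_per_terabyte * conventional_bit_time      # ~8,000 s
    laser_assisted_total = bits_per_terabyte * laser_assisted_bit_time  # ~80 s

    print(f"conventional:   {conventional_total / 3600:.1f} hours")     # ~2.2 hours
    print(f"laser-assisted: {laser_assisted_total:.0f} seconds")        # ~80 seconds
    print(f"speedup:        {conventional_total / laser_assisted_total:.0f}x")  # 100x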

MORE

U.S. to Use Climate to Help Cool Exascale Systems

Computerworld (02/08/12) Patrick Thibodeau

The U.S. Department of Energy’s (DOE’s) Berkeley Lab has started building a computing center that will one day hold exascale systems. DOE recently gave Congress a report outlining a plan to deliver exascale computing by 2019-2020 and its expected cost. Berkeley’s Computational Research and Theory (CRT) facility will use outside-air cooling, relying on the Bay Area’s cool temperatures to meet its needs about 95 percent of the time, says CRT’s Katherine Yelick. The evaporative cooling method pumps hot water into a tower, where evaporation helps cool it. The 140,000-square-foot building, expected to be ready in 2014, will enable Berkeley Lab to consolidate offices that are currently split between two sites, and it will be large enough to house two supercomputers, including exascale-sized systems. An exascale system will be able to perform 1 quintillion (10^18) floating-point operations per second. The Berkeley facility “is very representative of what we have that’s best in the United States in research, in innovation,” says DOE Secretary Steven Chu. He notes that computation will be “a key element in helping further the innovation and the industrial competitiveness of the United States.”
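
For a sense of that scale, here is a small illustrative calculation (the laptop figure of roughly 10 gigaflops is an assumption for illustration, not from the article):

    # Rough scale comparison, assuming a laptop sustains ~10 gigaflops
    # (an illustrative figure, not from the article).
    exaflops = 1e18          # operations per second for an exascale system
    laptop_flops = 1e10      # assumed sustained laptop performance

    seconds_per_year = 365 * 24 * 3600
    laptop_years = exaflops / laptop_flops / seconds_per_year
    print(f"One second of exascale work would keep a laptop busy for ~{laptop_years:.1f} years")
    # ~3.2 years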

MORE

PRACE to Establish Six Advanced HPC Training Centers

PRACE (02/02/12)

The Partnership for Advanced Computing in Europe (PRACE) has selected the Barcelona Supercomputing Center, the CSC-IT Center for Science, the University of Edinburgh, Cineca, Maison de la Simulation, and the Gauss Centre for Supercomputing as PRACE Advanced Training Centers (PATCs). The PATCs will provide training and education activities to the European research community on using PRACE’s computational infrastructure. PRACE ultimately wants the PATCs to serve as hubs and key drivers of European high-performance computing (HPC) education. “The establishment of the PRACE Advanced Training Centers is one of the most visible achievements of PRACE to the European researchers,” says CSC’s Pekka Manninen. “The PATC network enables us to synergize European HPC training activities for the benefit of the whole of Europe.” PRACE has initially selected six of its member sites as PATCs, but it will reassess the center locations every two years, and the selected sites may change over time.

MORE