UT Austin’s New Supercomputer Stampede2 Storms Out of the Corral in Support of U.S. Scientists

UT News
Faith Singer-Villalobos
July 28, 2017

The Texas Advanced Computing Center (TACC) at the University of Texas at Austin (UT Austin) has launched Stampede2, the most powerful supercomputer at any U.S. university, which UT Austin president Gregory L. Fenves says will enable researchers “to take on the greatest challenges facing society.” Stampede2 was built with a $30-million National Science Foundation (NSF) grant, and its applications will range from large-scale models and data analyses using thousands of processors simultaneously to smaller computations and interactions via Web-based community platforms. TACC executive director Dan Stanzione predicts Stampede2 “will serve as the workhorse for our nation’s scientists and engineers, allowing them to improve our competitiveness and ensure that UT Austin remains a leader in computational research for the national open science community.” Stampede2 will have a peak performance of 18 petaflops while consuming half the power of its predecessor, Stampede1. It will be made available to researchers via NSF’s Extreme Science and Engineering Discovery Environment (XSEDE).


A Scientist and a Supercomputer Re-create a Tornado

UW-Madison News, Eric Verbeten, March 13, 2017

Researchers at the University of Wisconsin-Madison are using supercomputer simulations to study the structure of tornado-producing supercell thunderstorms. The researchers say they can create in-depth visualizations of supercells and discern how they form and ultimately spawn tornadoes. The most recent simulation recreates the “El Reno” tornado, which touched down in Oklahoma in 2011 and caused damage over a 63-mile area. Using real-world observational data, the researchers were able to recreate the weather present at the time of the storm and witness the steps leading up to the creation of the tornado. The simulation reveals in high resolution the numerous “mini-tornadoes” that form at the onset of the main tornado. As the funnel cloud develops, the mini-tornadoes begin to merge, adding strength to the storm. The researchers used the Blue Waters Supercomputer housed at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.


The Next Supercomputing Superpower – Chinese Technology Comes of Age

Asian Scientist (01/03/17) Rebecca Tan

China has topped the Top500 list of the world’s most powerful supercomputers since June 2013, growing faster than any other country, according to University of Tennessee professor Jack Dongarra. The ascent of China, which did not even appear on the Top500 list until 2001, raised fears that its supercomputers would be used for nuclear applications, given the growing reliance on such resources to simulate nuclear tests. Despite a U.S. ban on selling microchips to China, Stony Brook University professor Deng Yuefan says China’s supercomputing progress has continued unabated. One result was the rollout of China’s Shenwei SW26010 chips, which put the Sunway TaihuLight system at the top of the Top500 list with a Linpack benchmark of 93 petaflops while tripling its predecessor’s efficiency. Deng says China is investing in software development to put its supercomputers to good use, as evidenced by the use of Sunway TaihuLight by three of the six finalists for the 2016 ACM Gordon Bell Prize, including the winning team, at the SC16 conference in November. Meanwhile, China also is racing Japan and the U.S. to build the first exascale supercomputer.


U.S. Exascale Computing Update With Paul Messina

HPCwire (12/08/16) Tiffany Trader

In an interview, Distinguished Argonne Fellow Paul Messina discusses stewardship of the Exascale Computing Project (ECP), which has received $122 million in funding (with $39.8 million committed to 22 application development projects, $34 million to 35 software development proposals, and $48 million to four co-design centers). Messina notes experiments now can be validated in multiple dimensions. “With exascale, we expect to be able to do things in much greater scale and with more fidelity,” he says. Among the challenges Messina expects exascale computing to help address are precision medicine, additive manufacturing with complex materials, climate science, and carbon capture modeling. “The mission [of ECP] is to create an exascale ecosystem so that towards the end of the project there will be companies that will be able to bid exascale systems in response to [requests for proposals] by the facilities, not the project, but the typical DOE facilities at Livermore, Argonne, Berkeley, Oak Ridge, and Los Alamos,” Messina says. In addition to a software stack to meet exascale application needs, Messina says there should be a high-performance computing (HPC) software stack “to help industry and the medium-sized HPC users more easily get into HPC.” Messina also stresses the need for exascale computing to be a sustainable ecosystem.


China’s Policing Robot: Cattle Prod Meets Supercomputer

Computerworld (10/31/16) Patrick Thibodeau

Chinese researchers have developed AnBot, an “intelligent security robot” deployed in a Shenzhen airport. The backend of AnBot is linked to China’s Tianhe-2 supercomputer, where it has access to cloud services. AnBot uses these technologies to conduct patrols, recognize threats, and identify people with multiple cameras and facial recognition. The cloud services give the robots petascale processing power, well beyond the processing capabilities of the robot itself. The supercomputer connection enhances the intelligent learning capabilities and human-machine interface of the devices, according to a report from the U.S.-China Economic and Security Review Commission that focuses on China’s autonomous systems development efforts. The report found the ability of robotics to improve depends on linking artificial intelligence (AI), data science, and computing technologies. In addition, the report notes the simultaneous development of high-performance computing systems and robotic mechanical manipulation gives AI the potential to unleash smarter robotic devices that are capable of learning as well as integrating inputs from large databases. The report says the U.S. government should increase its own efforts to develop manufacturing technology in critical areas, as well as monitor China’s growing investments in robotics and AI companies in the U.S.


A Billion Billion Calculations per Second: Where No Computer Has Gone Before

South China Morning Post (Hong Kong) (10/29/16) Viola Zhou

China has launched the development of its first exascale high-performance computer to maintain its lead position in the global supercomputing race. The system will run at 1,000 petaflops, topping the speed of China’s Sunway TaihuLight computer by a factor of 10. China’s Ministry of Science and Technology has allocated funding to three research institutions to devise prototypes to meet its five-year target of putting an exascale computer into operation. The participating institutions include Sugon, the National University of Defense Technology, and the National Research Center of Parallel Computer Engineering and Technology (developer of Sunway TaihuLight, currently the top supercomputer in the world). University of Science and Technology of China professor An Hong says once the prototypes are finished, the ministry will choose two teams with the best designs to construct the fully functioning exascale system. For the first time this year, China dethroned the U.S. as the country with the most supercomputers in the Top500 ranking. “The demand for computing speed has no limits,” An says. “Now we have the money and technology, we can build better computers for scientists to use.”
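The headline figures here are easy to sanity-check with a little arithmetic: “a billion billion” calculations per second is 10^18 flops, i.e. one exaflop, which equals the 1,000 petaflops cited for the planned system and is roughly ten times Sunway TaihuLight’s 93-petaflop Linpack score mentioned in the earlier item. A minimal sketch, using only figures from the articles above:

```python
# Exascale arithmetic sanity check (all figures taken from the articles above).

PETA = 10**15  # one petaflop = 10^15 floating-point operations per second
EXA = 10**18   # one exaflop  = "a billion billion" (10^9 * 10^9) operations/s

exascale_target_pflops = EXA / PETA         # planned speed, in petaflops
taihulight_linpack_pflops = 93              # Sunway TaihuLight's Linpack score

print(exascale_target_pflops)                               # 1000.0
print(exascale_target_pflops / taihulight_linpack_pflops)   # ~10.75x faster
```

The “factor of 10” quoted in the article is thus an approximation: 1,000 / 93 is closer to 10.75.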


New Hikari Supercomputer Starts Solar HVDC

Texas Advanced Computing Center (09/14/16) Jorge Salazar

The Hikari computing system at the Texas Advanced Computing Center (TACC) in Austin, TX, is the first supercomputer in the U.S. to use solar and high-voltage direct current (HVDC) for power. Launched by the New Energy and Industrial Technology Development Organization in Japan, NTT FACILITIES, and the University of Texas at Austin, the project aims to demonstrate the potential of HVDC, which allows for ease of connection to renewable energy sources, including solar, wind, and hydrogen fuel cells. During the day, solar panels shading a TACC parking lot provide nearly all of Hikari’s power, up to 208 kilowatts, and at night the microgrid connected to the supercomputer switches back to conventional AC power from the utility grid. The Hikari power feeding system, which is expected to save 15 percent on energy consumption compared to conventional systems, could change how data centers power their systems. The new supercomputer came online in late August, and it consists of 432 Hewlett Packard Enterprise (HPE) Apollo 8000 XL730f servers coupled with HPE DL380 and DL360 nodes interconnected with a first-of-its-kind Mellanox End-to-End EDR InfiniBand system operating at 100 Gbps. More than 10,000 cores from Intel “Haswell” Xeon processors will deliver more than 400 teraflops.
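The system figures above are internally consistent: more than 400 teraflops over more than 10,000 cores works out to roughly 40 gigaflops per core, which matches the double-precision peak of a Haswell Xeon core (16 flops per cycle from its two 256-bit FMA units) at an assumed ~2.5 GHz clock. Note the clock speed is an illustrative assumption, not a figure from the article:

```python
# Rough per-core arithmetic for Hikari (system totals from the article;
# the Haswell per-core clock speed is an illustrative assumption).

system_tflops = 400          # >400 teraflops total (from the article)
cores = 10_000               # >10,000 Haswell Xeon cores (from the article)

per_core_gflops = system_tflops * 1e3 / cores   # teraflops -> gigaflops
print(per_core_gflops)       # 40.0 GFLOPS per core

# Haswell executes 16 double-precision flops per cycle (two 256-bit FMA
# units, each 4 doubles x 2 ops); at an assumed 2.5 GHz clock:
haswell_peak_gflops = 16 * 2.5
print(haswell_peak_gflops)   # 40.0 GFLOPS peak per core
```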