
SDSC Readying ‘Gordon’ Supercomputer for Pre-Production Trials This Month

 UCSD News (CA) (08/09/11) Jan Zverina

The University of California, San Diego's San Diego Supercomputer Center (SDSC) this month will launch the pre-production phase of Gordon, the first high-performance supercomputer equipped with large amounts of flash-based solid-state drive memory. Gordon, which has 64 input/output nodes joined by an InfiniBand switch fabric communications link, will be available to U.S. researchers who want to run large-scale database applications, according to SDSC director Michael Norman. Gordon features about 300 trillion bytes of flash memory that can handle massive databases at speeds up to 100 times faster than hard disk drive systems, according to Norman. "Now we have enterprise [multi-level cell], and it's available at both attractive prices and with very good durability (or write endurance), which is achieved by over-provisioning and wear leveling," he says. Gordon is designed to help researchers in computational science, visual analytics, and interaction network analyses. "Data of this size is simply becoming unmanageable for analysis, so there is an urgent need for supercomputers like Gordon," Norman says.
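The durability techniques Norman cites, over-provisioning and wear leveling, can be illustrated with a toy model. The Python sketch below is a hypothetical simplification (the FlashPool class, block counts, and erase limit are illustrative assumptions, not Gordon's actual firmware): it keeps spare physical blocks beyond the advertised capacity and steers every write to the least-worn block, so erase cycles are spread evenly and no single block wears out early.

# Toy illustration of over-provisioning and wear leveling in flash SSDs.
# All names and numbers are hypothetical; real SSD firmware is far more complex.

class FlashPool:
    def __init__(self, visible_blocks, spare_blocks, erase_limit=3000):
        # Over-provisioning: more physical blocks than the capacity exposed to users.
        self.erase_counts = [0] * (visible_blocks + spare_blocks)
        self.erase_limit = erase_limit

    def write(self):
        # Wear leveling: direct each write (erase cycle) to the least-worn block.
        target = min(range(len(self.erase_counts)), key=self.erase_counts.__getitem__)
        if self.erase_counts[target] >= self.erase_limit:
            raise RuntimeError("flash pool worn out")
        self.erase_counts[target] += 1

pool = FlashPool(visible_blocks=8, spare_blocks=2)
for _ in range(20000):
    pool.write()
print(max(pool.erase_counts), min(pool.erase_counts))  # 2000 2000: wear is spread evenly

In this toy run the 20,000 writes land evenly across all ten physical blocks, which is the basic idea behind the write endurance Norman describes.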

MORE

How Computational Complexity Will Revolutionize Philosophy

Technology Review (08/10/11)

Massachusetts Institute of Technology computer scientist Scott Aaronson argues that computational complexity theory will have a transformative effect on philosophical thinking about a broad spectrum of topics, such as the challenge of artificial intelligence (AI). The theory focuses on how the resources required to solve a problem scale with some measure of the problem size, and on how problems typically scale either reasonably slowly (polynomially) or unreasonably rapidly (exponentially). Aaronson raises the issue of AI and whether computers can ever become capable of human-like thinking. He contends that computability theory cannot provide a fundamental impediment to computers passing the Turing test. A more productive strategy is to consider the problem's computational complexity, Aaronson says. He cites the possibility of a computer that records all the human-to-human conversations it hears, accruing a database over time with which it can make conversation by looking up human answers to the questions it is presented with. Aaronson says that although this strategy works in principle, it demands computational resources that expand exponentially with the length of the conversation. This, in turn, leads to a new way of thinking about the AI problem, and by this reasoning the difference between humans and machines is essentially one of computational complexity.
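Aaronson's exponential-growth point can be made concrete with a little arithmetic. The Python sketch below is an illustrative simplification (the vocabulary size and the one-word-per-turn model are assumptions, not figures from the essay): it counts how many entries a lookup-table conversation program would need in order to cover every possible exchange of a given length.

# Toy count of the entries a lookup-table "chatbot" would need.
# Vocabulary size and the one-word-per-turn model are illustrative assumptions.

VOCAB = 1000  # hypothetical number of distinct utterances per conversational turn

def table_entries(turns):
    # One stored reply for every possible sequence of human turns:
    # VOCAB choices per turn, so VOCAB ** turns prefixes to cover.
    return VOCAB ** turns

for turns in (1, 2, 5, 10):
    print(f"{turns:2d} turns -> {table_entries(turns):.3e} stored replies")

Even under these modest assumptions the table reaches roughly 10^30 entries by ten turns, which is why Aaronson treats the lookup-table strategy as workable in principle but computationally infeasible.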

MORE

IBM, NCSA Abandon Petascale Supercomputer Project

IDG News Service (08/08/11) Joab Jackson

Unforeseen complexities and greater-than-anticipated costs have prevented IBM and the University of Illinois' National Center for Supercomputing Applications (NCSA) from continuing plans to build a petaflop-speed supercomputer. The first version of the system was expected to be delivered as early as 2012. "The innovative technology that IBM ultimately developed was more complex and required significantly increased financial and technical support by IBM beyond its original expectations," according to a joint statement. "NCSA and IBM worked closely on various proposals to retain IBM's participation in the project but could not come to a mutually agreed-on plan concerning the path forward." The U.S. National Science Foundation and the University of Illinois funded the Blue Waters project, which aimed to build a Power7 processor-based supercomputer capable of a quadrillion floating-point operations per second. IBM contributed software, hardware, and personnel to Blue Waters, but the effort proved too costly, and the company may have needed to redefine the project to take advantage of the latest techniques for scaling computing to petascale heights, says Envisioneering Group's Rick Doherty.

MORE