February 2013

Quantum Algorithm Breakthrough

University of Bristol News (02/24/13)

An international research group says it has taken an important step toward practical quantum computing by implementing a full quantum algorithm without knowing its answer in advance. Scientists from the University of Bristol and the University of Queensland led the group. The researchers implemented the phase estimation algorithm, a central quantum algorithm that achieves an exponential speedup over all known classical algorithms. “Unlike previous demonstrations, we built a full quantum circuit to implement the phase estimation algorithm without any simplification,” says project director Xiao-Qi Zhou. “We don’t need to know the answer in advance and it is the first time the answer is truly calculated by a quantum circuit with a quantum algorithm.” The project paves the way for important applications such as quantum simulations and quantum metrology in the near term and factoring in the long term, says professor Jeremy O’Brien, director of Bristol’s Centre for Quantum Photonics. He says quantum algorithms eventually could facilitate the design of new materials, pharmaceuticals, or clean energy devices.
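To give a feel for what phase estimation computes, here is a minimal classical simulation of its measurement statistics (this is an illustrative sketch, not the researchers’ photonic circuit): with n counting qubits, the algorithm estimates the eigenphase phi of a unitary as a k/2^n fraction, and the outcome probabilities below follow from the standard inverse-quantum-Fourier-transform amplitudes.

```python
import cmath

def qpe_probs(phi, n):
    """Classically simulate the measurement statistics of quantum phase
    estimation with n counting qubits, for a unitary whose eigenphase is
    2*pi*phi (0 <= phi < 1).  Outcome k estimates phi as k / 2**n."""
    N = 2 ** n
    probs = []
    for k in range(N):
        # Amplitude of outcome k after the inverse quantum Fourier transform
        amp = sum(cmath.exp(2j * cmath.pi * j * (phi - k / N))
                  for j in range(N)) / N
        probs.append(abs(amp) ** 2)
    return probs

# When phi is exactly representable in n bits, the estimate is exact:
probs = qpe_probs(0.625, 3)                    # 0.625 = 0.101 in binary
best = max(range(8), key=lambda k: probs[k])   # best == 5; 5 / 2**3 == 0.625
```

The classical loop above takes time proportional to 2^n per outcome, which is exactly where a quantum circuit gains its exponential advantage; phase estimation is also the core subroutine behind the factoring and quantum-simulation applications the article mentions.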


Could Technology Help Catch Lying Politicians?

The Engineer (United Kingdom) (02/22/13) Stephen Harris

Computer scientists say voice-recognition systems eventually could be used in conjunction with artificial intelligence technology to determine whether a person is telling the truth. Acorn Computers co-founder Hermann Hauser, for example, has met with Google executives about the possibility of developing an “evidence meter” that could be used to determine whether a politician is lying. “The idea is if voice recognition is good enough, which it clearly is now, it can run continuous voice recognition at the bottom of your TV screen whenever they interview David Cameron or the opposition leader,” Hauser says. “So this running evidence-meter below the news item I think could be a very cool thing to implement.” However, while the voice-recognition technology is readily available, the artificial intelligence needed to build such a fact-checking system is more problematic. “The problem is the knowledge that would be required for responding to the queries,” says University College London professor Anthony Hunter. “If the queries were within quite a restricted domain then this is perfectly possible. But [for political speeches] the domains would, by and large, be too broad that you could have a significantly broad knowledge base to check those facts.”


Moving Into the Cloud

CORDIS News (02/14/13)

The European Union’s MobiCloud project aims to create an online collaborative platform using cloud technology to facilitate the development of mobile applications for public transport, construction, and other business-critical areas. End users, mobile developers, application vendors, system integrators, and cloud service providers will come together on MobiCloud to create end-to-end solutions with a large return on investment. MobiCloud offers a mobile mash-up screen that culls data from different corporate information technology systems, and displays different services based on a user’s context, such as location or skill set. These services respond in real time to changes such as work orders or fault reports. MobiCloud will enable smaller companies to quickly develop mobile versions of existing applications, reduce costs, and increase economic growth. The project is co-funded by the European Commission under the ICT Policy Support Program, part of the Competitiveness and Innovation Framework Program.


Supercomputing Crucial to Clean Energy Production

HPC Wire (02/13/13) Tiffany Trader

The U.S. Office of Fossil Energy’s National Energy Technology Laboratory (NETL) has acquired a 500-teraflop SGI supercomputer to advance energy and environmental research. Slated to go live in early spring, the High-Performance Computer for Energy and the Environment (HPCEE) has 24,192 2.6-GHz Intel Xeon E5-2670 cores with 48,384 GB of memory in 1,512 computational nodes. NETL’s Chris Guenther says the system ranked 55th on the latest Top500 list, and 403rd on the Green500 list. However, HPCEE achieves a power usage effectiveness (PUE) in the 1.03 to 1.06 range without any special modifications, meaning that only about 3 to 6 percent of its total electrical consumption goes to cooling and power overhead. This efficiency is expected to save NETL an average of $450,000 per year. Guenther says NETL previously relied on many small computer clusters, but users have sometimes been unable to find available cores for their work. The new supercomputer will give researchers more cores, enabling them to model complex problems. NETL is developing commercial sorbent-based carbon dioxide capture systems, experimenting with chemical-looping technology to reduce the pollution and cost of generating electricity from coal, and working on coal gasification.
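The quoted PUE range translates directly into an overhead fraction. PUE is defined as total facility power divided by IT equipment power, so the non-IT share of total power is (PUE − 1)/PUE; a small illustrative calculation:

```python
def overhead_fraction(pue):
    """Fraction of total facility power spent on non-IT overhead
    (cooling and power distribution), given PUE = total / IT power."""
    return (pue - 1.0) / pue

lo = overhead_fraction(1.03)   # ~0.029, i.e. about 3 percent overhead
hi = overhead_fraction(1.06)   # ~0.057, i.e. about 6 percent overhead
```

For comparison, a conventional data center with a PUE near 2.0 spends roughly half its power on overhead, which is why a 1.03 to 1.06 range without special modifications is notable.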


How ‘Bullet Time’ Will Revolutionize Exascale Computing

MIT Technology Review (02/12/13)

Kobe University researchers have developed a method for compressing simulation output data without losing its essential features, aimed at exascale computing. The approach borrows from “bullet time,” a Hollywood filming technique in which ordinary events are slowed down while the camera angle changes as if it were flying around the action at normal speed. The technique involves plotting the camera’s trajectory in advance and then placing many high-speed cameras along the route. The Kobe researchers apply a similar idea to exascale simulations by surrounding the simulated action with millions of virtual cameras that all record the action as it occurs. The compression arises because each camera records only a two-dimensional image of the three-dimensional scene. Using this technique, the footage from a single camera can be compressed into a file about 10 megabytes in size, so even if 1 million cameras record the action, the total amount of data they produce is only about 10 terabytes, according to the researchers. “Our movie data is an order of magnitude smaller,” the researchers say. “This gap will increase much more in larger-scale simulations.”
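A quick back-of-envelope check of the figures above (using the article’s numbers and decimal units, where 1 TB = 1,000,000 MB) confirms the claimed total:

```python
per_camera_mb = 10            # compressed 2-D footage per virtual camera
cameras = 1_000_000           # virtual cameras surrounding the simulation
total_mb = per_camera_mb * cameras
total_tb = total_mb / 1_000_000   # convert MB to TB (decimal units)
# total_tb == 10.0, matching the researchers' ~10-terabyte figure
```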