July 2012

Internet2 Readies 100G OpenFlow SDN for Big Data

Network World (07/09/12) Jim Duffy

Internet2 has nearly completed its OpenFlow-enabled 100G Ethernet software-defined network (SDN) for testing service delivery of applications for Big Data compilation and research.  The 100G Ethernet OpenFlow-enabled routers will allow for programmatic control of the Innovation Platform from an open source controller to facilitate scale and intelligent service delivery, according to Internet2.  The Innovation Platform is designed to advance education, university business, and global Big Data collaborative research outcomes, which should lead to new research initiatives and new cycles of global economic development, according to Internet2.  “The Internet2 community sees software-defined networking as much of the same transformative opportunity that we saw with the original Internet,” says Internet2’s Rob Vietzke.  “We’re making a fairly big investment in building this new nationwide SDN environment as a platform for software development.”  He says the Innovation Platform will enable member institutions to keep up with the exponential growth of Big Data generated by scientific research from U.S. labs and universities.
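The "programmatic control" Internet2 describes means a software controller installs match-and-action rules into the flow tables of OpenFlow switches, rather than each router deciding forwarding on its own.  The toy model below illustrates that idea only; the class and field names are invented for illustration and are not Internet2's controller code or any real OpenFlow controller API.

```python
# Toy model of OpenFlow-style programmatic control: a controller
# pushes match -> action rules into a switch's flow table.
# Names are illustrative, not a real controller API.

from dataclasses import dataclass


@dataclass(frozen=True)
class Match:
    in_port: int
    dst_ip: str


class FlowTable:
    def __init__(self):
        self.rules = {}  # Match -> action string

    def install(self, match, action):
        """Controller-initiated rule install (conceptually, a flow-mod message)."""
        self.rules[match] = action

    def forward(self, in_port, dst_ip):
        """Lookup on packet arrival; a table miss is punted to the controller."""
        return self.rules.get(Match(in_port, dst_ip), "send-to-controller")


table = FlowTable()
table.install(Match(in_port=1, dst_ip="10.0.0.2"), "output:2")
print(table.forward(1, "10.0.0.2"))  # → output:2
print(table.forward(1, "10.0.0.9"))  # → send-to-controller
```

The point of the sketch: because the rules live in software on the controller, the network's behavior can be reprogrammed centrally, which is what lets a platform like this adapt service delivery to Big Data workloads.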


Toward Achieving 1 Million Times Increase in Computing Efficiency

Northwestern University Newscenter (07/10/12)

Complementary metal-oxide semiconductors (CMOS) give off more heat as more transistors are added, which makes CMOS incapable of supporting tomorrow’s high-powered computer systems.  Northwestern University researchers say they have developed a new logic circuit family based on magnetic semiconductor devices that could result in logic circuits up to one million times more power-efficient than CMOS-based systems.  Northwestern’s “spin-logic circuits” utilize the quantum physics phenomenon of spin, a fundamental property of the electron.  “We are using ‘spintronic’ logic devices to successfully perform the same operations as conventional CMOS circuits but with fewer devices and more computing power,” says Northwestern professor Bruce W. Wessels.  The spin-logic circuits are created using magnetoresistive bipolar spin-transistors.  Although the goal of one million times increased power efficiency is optimistic and could take up to 10 years to reach, “we think this is potentially groundbreaking,” says Northwestern’s Joseph Friedman.


DOE Primes Pump for Exascale Supercomputers

HPC Wire (07/12/12) Michael Feldman

The U.S. Department of Energy (DOE) recently awarded two-year, multimillion-dollar grants to Intel, Advanced Micro Devices (AMD), NVIDIA, and Whamcloud to develop exascale computers as part of the FastForward program, which will focus on developing future hardware and software technologies capable of supporting such machines.  The program is being contracted through Lawrence Livermore National Security as part of a multi-lab consortium that includes Argonne National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, Los Alamos National Laboratory, Oak Ridge National Laboratory, Pacific Northwest National Laboratory, and Sandia National Laboratories.  “While DOE’s extreme-scale computer requirements are a driving factor, these projects must also exhibit the potential for technology adoption by broader segments of the market outside of DOE supercomputer installations,” says a FastForward statement.  Intel’s FastForward processor research will be based on the company’s Many Integrated Core architecture, which is designed for the supercomputing market.  AMD will be basing its FastForward processor research on its Accelerated Processing Unit product line and the related Heterogeneous Systems Architecture standard.  NVIDIA will base its FastForward research on its Echelon design.


Exascale Computing by Decade’s End

EE Times India (07/12/12) Dylan McGrath

Parallelism and technology scaling will make exascale computing possible by the end of the decade, says Intel Fellow Shekhar Borkar.  By about 2018, engineers are expected to create an exascale supercomputer, and about 10 years later the technology will likely find its way into PCs and eventually into mobile systems.  Still, exascale computing would consume vast amounts of power if current trends hold true, Borkar notes.  A key challenge will be to build an exascale system that consumes only 20 MW of power, and engineers could then use the same technology to significantly reduce the power consumption of lower-performance systems.  Borkar says improvements must be made in both energy per transistor and energy per compute operation.  He notes that scaling down the supply voltage boosts energy efficiency, but leakage power does not decline as quickly as total power consumption, so leakage accounts for a growing share of the total.  Borkar makes several other observations, such as the importance of keeping computation local, because moving data consumes power.  “Clearly, data movement energy will dominate the future,” he says.
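Two quick calculations show why the 20 MW target is so demanding.  The per-operation energy budget follows directly from the article's numbers (one exaflop per second at 20 MW); the leakage illustration uses textbook first-order scaling assumptions (dynamic power roughly proportional to the square of supply voltage, leakage roughly proportional to voltage), which are our simplification, not figures from Borkar's talk.

```python
# Back-of-envelope: energy budget per operation at the 20 MW exascale target.
POWER_W = 20e6   # 20 MW system power budget
FLOPS = 1e18     # exascale: 10^18 operations per second

joules_per_op = POWER_W / FLOPS
picojoules_per_op = joules_per_op * 1e12
print(f"{picojoules_per_op:.0f} pJ per operation")  # → 20 pJ


# Why leakage grows as a share of total power when supply voltage drops.
# Assumed first-order model: dynamic ~ V^2, leakage ~ V (illustrative only).
def leakage_fraction(v, dyn0=1.0, leak0=0.3):
    dynamic = dyn0 * v**2
    leakage = leak0 * v
    return leakage / (dynamic + leakage)


print(f"leakage share at V=1.0: {leakage_fraction(1.0):.2f}")
print(f"leakage share at V=0.5: {leakage_fraction(0.5):.2f}")
```

Under these assumptions, halving the voltage cuts total power sharply but raises leakage from roughly a quarter to well over a third of the total, which is the effect Borkar describes.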


Reimer Perfects New Techniques for Spintronics and Quantum Computing

University of California, Berkeley (07/05/12)

Researchers at the University of California, Berkeley and the City College of New York are developing techniques to overcome the physical limitations of computer chips in order to create the next generation of faster and smaller electronic devices.  The researchers are using lasers to control the fundamental nuclear spin properties of semiconductor materials to speed the creation of spintronic devices that use electrons’ spin state to control memory and logic circuits.  “Our laser techniques can allow quantum computing to become far more practical and inexpensive,” says Berkeley’s Jeff Reimer.  He notes that spintronics enables computer chips to operate more quickly and with less power.  “Now we want to use this knowledge to develop better spintronic devices,” Reimer says.  The researchers have found they can use circularly polarized laser beams to control spin states in gallium arsenide.  “By tuning the laser to just the right intensity and frequency, and by picking the isotopes of gallium and arsenic we use for the semiconductor material, we can control the spins in the semiconductor by using polarized laser light,” Reimer says.


Google Glass Launches New Age of Personal Computing

Computerworld (07/05/12) Sharon Gaudin

Google’s recently unveiled computerized eyeglasses could mark the beginning of a new computing era in which wearable computers are common.  The Google Glass development effort is all about “doing brand new risky technological things that are really about making science fiction real,” says Google cofounder Sergey Brin.  He says the next generation of computers likely won’t sit on a desk or have keyboards or monitors.  Meanwhile, analysts predict that future computers will be incorporated into other items that people use, such as clothing or jewelry.  “I believe that in five years we will see many different form factors and brands of wearable computers,” says analyst Patrick Moorhead.  He says Google’s research could lead to the mainstream use of these new technologies.  “We can go beyond the glasses and visualize computers in our jewelry, in our watches, and even inside our bodies,” Moorhead says.  Google Glass and other wearable computers also could be very useful in many workplaces, notes analyst Rob Enderle.  “They could be used regularly for things like taking inventory in warehouses, and for tasks on factory floors and other places where folks need to use computers and their hands at the same time,” Enderle says.


Princeton Researchers Working at Forefront of ‘Exascale’ Supercomputing

Princeton University (06/28/12) Gale Scott

Princeton University researchers are developing algorithms designed for exascale supercomputers that will enable scientists to address problems that were previously too difficult to solve.  The Group of Eight’s (G-8’s) Research Councils Initiative on Multilateral Research Funding recently awarded grants to Princeton researchers William Tang, Jeroen Tromp, and Venkatramani Balaji to develop the algorithms.  “What we hope to demonstrate is that this focused level of international scientific collaboration can help deliver breakthrough payoffs in high-performance computing,” Tang says.  The grants are part of a G-8 pilot project established in 2010 to facilitate multinational collaboration among scientists.  Tang will use his grant to develop advanced simulation software that will be compatible with exascale supercomputers.  The goal of the project, called NuFuSe, is to produce higher-fidelity simulations of the physics behind fusion reactions.  Balaji is working on a project to design software that will organize huge archives of climate data from around the world.  For example, Balaji says “a scientist at work on malaria might want to know how many mosquitoes are likely to be in a region in the future, which means predicting temperature or humidity.”  Tromp is using his grant to further his work mapping the interior of the Earth.


Computing Advances Vital to Sustainability Efforts

National Academy of Sciences (06/29/12) Lorin Hancock

Advances in computing, such as those that help to understand complex systems and their connections, are crucial to solving sustainability problems, according to a recent National Research Council report.  University of California, Los Angeles professor Deborah Estrin says the report “will give us a chance to start creating opportunities for transformative efficiency gains, deep scientific understanding, and informed evolution of the associated political and economic systems.”  The report uses smart energy grids, sustainable agriculture, and resilient infrastructure as examples to demonstrate the potential impact of advances in computing, and recommends working toward these sustainability goals by solving specific problems.  The ultimate goal of applying computer science to sustainability is to inform, support, facilitate, and automate decision making, according to the report.  In addition, the report emphasizes that computer science research in sustainability must be an interdisciplinary effort, with experts in the different fields of sustainability being equal partners in research.


P2P Comes to the Rescue of Internet Video

Europe’s Newsroom (06/13/12)

VTT Technical Research Center of Finland researchers, working with a consortium of 20 industrial partners on the P2P-Next project, have developed NextShare, an open source peer-to-peer (P2P) video-streaming platform.  The researchers designed, implemented, and tested algorithms and protocols to use P2P architecture to stream video.  “The key difference with P2P applications for file sharing is that video data can’t be broken into different packets and sent in any order, it has to be sent in sequence and maintain a certain level of quality of service,” says P2P-Next project coordinator Jari Ahola.  The researchers initially based the system on the BitTorrent protocol, and then developed their own open source protocol called Swift.  The researchers tested the technology by building Swarmplayer, a Web browser plug-in that enables Internet users to access Internet video via the Firefox browser.  The researchers also developed a set-top box to demonstrate how the technology could be incorporated into consumer electronic devices.  The box has “social networking features so, for example, users can view Twitter comment feeds about what they are watching as they watch it,” Ahola says.  The researchers found that the P2P approach cuts bandwidth demands by at least 65 percent compared with the unicast streaming approach.
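Ahola's point is the core design difference: a file-sharing client can fetch pieces in any order (typically rarest-first, to keep the swarm healthy), while a streaming client must prioritize the pieces just ahead of the playback position.  The toy piece pickers below illustrate that contrast only; the functions and numbers are invented for illustration and are not NextShare or Swift code.

```python
# Illustrative contrast between file-sharing and streaming piece
# selection in a P2P swarm. Toy model, not NextShare/Swift code.


def rarest_first(missing, availability):
    """File sharing: fetch the piece fewest peers hold, in any order."""
    return min(missing, key=lambda p: availability[p])


def in_order_window(missing, playhead, window=4):
    """Streaming: fetch the earliest missing piece near the playhead,
    so playback never stalls waiting on an out-of-order piece."""
    urgent = [p for p in missing if playhead <= p < playhead + window]
    return min(urgent) if urgent else min(missing)


# How many peers hold each piece (piece index -> peer count).
availability = {0: 9, 1: 2, 2: 5, 3: 1, 4: 7}
missing = {1, 2, 3, 4}

print(rarest_first(missing, availability))   # → 3 (rarest piece)
print(in_order_window(missing, playhead=1))  # → 1 (next to play)
```

A real streaming picker also has to balance the two goals, spending spare bandwidth on rare pieces so the swarm keeps working, which is part of what makes protocols like Swift harder to design than plain file-sharing.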


University of Waterloo Engineers Unveil Two-Way Wireless Breakthrough

University of Waterloo (06/14/12) Carol Truemner

University of Waterloo engineering researchers have developed technology that enables wireless signals to be simultaneously sent and received on a single radio channel frequency.  “This means wireless companies can increase the bandwidth of voice and data services by at least a factor of two by sending and receiving at the same time, and potentially by a much higher factor through better adaptive transmission and user management in existing networks,” says Waterloo professor Amir K. Khandani.  He says two-way wireless technology could lead to huge improvements in voice and data services, and boost wireless networks in terms of quality of service and efficiency.  Moreover, the breakthrough could result in ultra-secure transmission.  “The cost in hardware and signal-processing complexities and antenna size is very low and virtually the same as current one-way systems,” Khandani notes.  New applications for the two-way technology include methods for interference management, security enhancements, and a unique form of wireless connectivity called media-based transmission, which embeds data in the transmission medium rather than in the transmitted radio-frequency signal.