November 2012

World’s Top Supercomputer Simulates the Human Heart

Popular Mechanics (10/22/12) Kathryn Doyle

Lawrence Livermore National Laboratory (LLNL) researchers say they used Sequoia, currently ranked as the world’s most powerful supercomputer, to develop the fastest computer simulation of the human heart.  The researchers used a highly scalable program called Cardioid to model the electrical signals traveling from cell to cell.  Cardioid treats each cell like a unit.  “The processes within a cell are captured in a set of 19 ordinary differential equations, so we can’t get inside that because they’re treated as a single entity,” says LLNL’s Fred Streitz.  The researchers say this process should enable them to investigate the competing theories for how cells are arranged in the heart.  The Cardioid model works well for researching arrhythmia because the system’s longer run time allows researchers to simulate the introduction of an anti-arrhythmic drug into the bloodstream and observe when drug levels spike and when they drop off.  “The details that differentiate individual hearts can be very fine, and our ability to model at extraordinarily high resolution, currently a factor of eight greater than previously, that allows us to capture very fine differences,” says LLNL’s Dave Richards.
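The structure described here, a set of ordinary differential equations per cell plus electrical coupling between neighboring cells, can be illustrated with a toy sketch.  This is not Cardioid's 19-equation cell model; it substitutes the much simpler two-variable FitzHugh-Nagumo model and a one-dimensional chain of cells, purely to show the shape of the computation:

```python
# Illustrative only: Cardioid uses 19 ODEs per cell; here each cell is a
# simplified 2-variable FitzHugh-Nagumo model, and the "electrical signal
# traveling from cell to cell" is diffusion of the membrane voltage along
# a 1-D chain, integrated with forward Euler.
def simulate(n_cells=50, steps=4000, dt=0.05, coupling=0.5):
    v = [0.0] * n_cells      # membrane voltage, one value per cell
    w = [0.0] * n_cells      # recovery variable, one value per cell
    v[0] = 1.5               # stimulate the first cell
    for _ in range(steps):
        nv, nw = v[:], w[:]
        for i in range(n_cells):
            # current flowing in from neighboring cells (no-flux boundaries)
            left = v[i - 1] if i > 0 else v[i]
            right = v[i + 1] if i < n_cells - 1 else v[i]
            diff = coupling * (left - 2 * v[i] + right)
            # the per-cell ODEs (FitzHugh-Nagumo reaction terms)
            nv[i] = v[i] + dt * (v[i] - v[i] ** 3 / 3 - w[i] + diff)
            nw[i] = w[i] + dt * 0.08 * (v[i] + 0.7 - 0.8 * w[i])
        v, w = nv, nw
    return v

final = simulate()
```

Treating each cell as an opaque unit, exactly as Streitz describes, is what makes the real code scale: cells exchange only voltage with their neighbors, so the domain can be partitioned across millions of processor cores.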


Breakthrough Offers New Route to Large-Scale Quantum Computing

Princeton University (10/19/12) John Sullivan

Princeton University researchers have developed a method that allows for the quick and reliable transfer of quantum information throughout a computing device, and could help engineers build a quantum computer that consists of millions of qubits.  “The whole game at this point in quantum computing is trying to build a larger system,” says Princeton professor Andrew Houck.  The researchers used a stream of microwave photons to analyze a pair of electrons trapped in a quantum dot.  “The microwaves are affected by the spin states of the electrons in the cavity, and we can read that change,” says Princeton professor Jason Petta.  The method combines techniques from materials science and optics.  To make the quantum dots, the researchers isolated a pair of electrons on a small section of material known as a semiconductor nanowire, and then created small electron cages along the wire.  The researchers found that electrons of similar spin will repel, while those of different spins will attract.  “The methods we are using here are scalable, and we would like to use them in a larger system,” Petta says.


Using Big Data to Save Lives

UCR Today (10/22/12) Sean Nealon

University of California, Riverside researchers have developed a method for mining data derived from pediatric intensive care units to help doctors treat children and cut health care costs.  “This data has the potential to be a gold mine of useful–literally life saving–information,” says Riverside professor Eamonn Keogh.  He notes that modern pediatric care units are equipped with a variety of sensors that record up to 30 measurements.  The researchers developed a technique that makes it possible to search the sensor datasets, which can contain more than one trillion objects.  The researchers also are exploring ways to capture and store data from five or more sensors, and capture multiple data points per second.  In the next few years, the researchers plan to study archived pediatric intensive care unit data to find common patterns that can help doctors in diagnosing and predicting medical episodes.  The researchers also want to incorporate those patterns into intensive care unit sensors.  However, the difficulty is in finding medically useful patterns because there are an infinite number of trivial patterns.  “We have to find those that aren’t known but are useful and that can benefit from intervention,” Keogh says.
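The core primitive behind searching a trillion-object sensor archive for known patterns is subsequence similarity search.  The sketch below is not the UCR team's code; it shows the basic idea under common assumptions (z-normalized Euclidean distance over a sliding window), using a synthetic "sensor stream" with a known pattern buried in it:

```python
import math

# Illustrative sketch: find the best-matching occurrence of a query
# pattern in a long sensor stream, using z-normalized Euclidean distance
# over a sliding window -- the basic building block of similarity search
# in large time-series archives. Real systems add pruning and indexing
# to make trillion-object searches feasible.
def znorm(seq):
    m = sum(seq) / len(seq)
    sd = math.sqrt(sum((x - m) ** 2 for x in seq) / len(seq)) or 1.0
    return [(x - m) / sd for x in seq]

def best_match(stream, query):
    q = znorm(query)
    best_dist, best_at = float("inf"), -1
    for i in range(len(stream) - len(query) + 1):
        w = znorm(stream[i:i + len(query)])
        d = math.sqrt(sum((a - b) ** 2 for a, b in zip(q, w)))
        if d < best_dist:
            best_dist, best_at = d, i
    return best_at, best_dist

# A sine-like pulse buried in an otherwise flat stream at offset 30:
stream = [0.0] * 30 + [math.sin(t / 2.0) for t in range(20)] + [0.0] * 30
query = [math.sin(t / 2.0) for t in range(20)]
loc, dist = best_match(stream, query)   # loc is 30, dist is ~0
```

Z-normalizing each window is what lets the search find a pattern regardless of a sensor's baseline offset or gain, which matters when the same physiological event looks different across patients and monitors.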


A Bandwidth Breakthrough

Technology Review (10/23/12) David Talbot

Academic researchers have developed coded Transmission Control Protocol, a method for improving wireless bandwidth by one order of magnitude that involves using algebra to overcome the network-clogging task of resending dropped packets.  The technology provides new ways for mobile devices to solve for missing data, and can combine data streams from Wi-Fi and LTE.  Massachusetts Institute of Technology researchers tested the system on standard Wi-Fi networks, where 2 percent of packets are normally lost, and found that a normal bandwidth of 1 Mbps was boosted to 16 Mbps.  The researchers also tested the technology in the Amazon cloud.  Internet Protocol traffic was sent to Amazon, encoded, and then decoded as an application on phones.  The technology sends algebraic equations that describe a series of packets.  That way, if a packet is lost, the receiving device can solve for the missing packet instead of asking the network to resend it.  If the technology works in large-scale deployments as the researchers expect, it could help delay a spectrum bottleneck.
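The mechanism described, sending algebraic combinations of packets so the receiver can solve for a lost one instead of requesting a retransmission, can be shown in miniature.  This is a hedged sketch, not MIT's implementation: it uses the simplest possible code, a single XOR parity packet (a linear combination over GF(2)), which lets the receiver repair any one lost packet per group:

```python
# Sketch of the idea behind coded TCP (not the actual MIT system): the
# sender transmits an extra packet that is the XOR of a group of packets.
# That extra packet is one "equation"; if any single packet in the group
# is dropped, the receiver solves for it by XOR-ing everything it did get.
def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(packets):
    parity = packets[0]
    for p in packets[1:]:
        parity = xor_bytes(parity, p)
    return parity  # p1 XOR p2 XOR ... XOR pn

def recover(received, parity):
    # received: dict of index -> packet, with exactly one index missing
    missing = parity
    for p in received.values():
        missing = xor_bytes(missing, p)
    return missing

packets = [b"hello   ", b"coded   ", b"tcp demo"]
parity = encode(packets)
# Packet 1 is dropped in transit; the receiver solves for it locally:
got = {0: packets[0], 2: packets[2]}
recovered = recover(got, parity)   # equals packets[1]
```

The payoff is exactly what the article describes: on a lossy wireless link, recovery happens at the receiver with no round trip back to the sender, so a 2 percent packet loss rate no longer stalls the TCP congestion window.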


Faster Chips ‘Cut Cloud-Computing Bills’

BBC News (10/23/12)

Researchers at Deutsche Telekom Laboratories and Aalto University have found that customers of Amazon’s EC2 cloud service do not receive the same level of performance.  Amazon says it uses generic hardware, but the team used tools to examine the software that controls the groups of servers customers rent, and was able to identify the chip at the heart of each server in a group, or instance, of computers.  Measurements taken over the course of a year revealed that instances running newer chips were much faster than clusters that used older hardware.  “In general, the variation between the fast instances and slow instances can reach 40 percent,” the researchers wrote in a paper, noting that for some applications the newer clusters worked about 60 percent faster.  The faster instances would enable users to reduce their server bills by up to 30 percent because the newer machines are able to crunch data faster.  The team is now working on tools that can determine the performance characteristics of particular clusters and push work to more powerful groups.
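The kind of probe a tenant could run is straightforward to sketch.  This is not the researchers' tooling; it is a minimal illustration, assuming a Linux-style guest where the CPU model is visible in /proc/cpuinfo, of identifying the chip behind an instance and timing a small compute benchmark so instances can be ranked:

```python
import platform
import time

# Minimal sketch (assumes a Linux-style environment): read the CPU model
# string the hypervisor exposes, then time a tiny compute loop. Running
# this on several rented instances lets a tenant spot the fast ones.
def cpu_model():
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("model name"):
                    return line.split(":", 1)[1].strip()
    except OSError:
        pass
    return platform.processor() or "unknown"

def benchmark(n=200_000):
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i * i
    return time.perf_counter() - start

model = cpu_model()
elapsed = benchmark()
```

With a 40 percent spread between fast and slow instances, the strategy the article implies is simple: launch a few instances, probe them, keep the fast ones, and release the rest.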


Federal Budget Limits Affect Scientific Conferences

New York Times (10/23/12) Laura Dattaro

The Obama administration recently imposed new guidelines that limit the amount of money federal agencies can spend on regional conferences.  However, several science and technology organizations say the federal budget limits are negatively affecting the scientific community’s ability to share research and collaborate.  “This is a problem not just for the computing research community, but for almost anyone who’s involved in scientific work,” says ACM president Vinton G. Cerf.  Many organizations have written to Congress and federal officials asking for an exemption from the spending policy for recognized scientific, technical, and educational meetings, as well as meetings of national and international standards bodies.  “The inability of the government researchers and program managers to participate in these conferences is actually very damaging,” Cerf says.  The new policy also has created confusion among federal agencies, because individuals do not know how many other agency employees also are attending, and do not know if the agency is close to its federally enforced $100,000 spending limit.  Cerf notes that in an era of high unemployment and shifting markets, scientific collaboration is crucial, and limiting those opportunities is not good for the United States.


On Track for Terabyte Discs: Making Computer Data Storage Cheaper, Easier

Case Western Reserve University (10/09/12)

Case Western Reserve University researchers have developed technology that could lead to the creation of optical discs that hold up to two terabytes of data.  The researchers say the discs would provide small and medium-sized businesses with an alternative to storing data on energy-wasting magnetic disks or large magnetic tapes.  The technique uses data storage technology similar to Blu-ray, but instead of packing more data on the surface, the data is written in dozens of layers.  The method is based on optical film with 64 data layers, technology that was first developed by Case Western Reserve’s Center for Layered Polymeric Systems.  The researchers then cut and pasted the film onto the same hard plastic base DVDs and Blu-ray discs use.  The researchers want to provide an affordable option to computer centers that now regularly purge data due to the prohibitive costs of current storage technologies.  “A disc will be on the capacity scale of magnetic tapes used for archival data storage,” says Case Western Reserve professor Kenneth Singer.  “But, they’ll be substantially cheaper and have one advantage: You can access data faster.  You just pop the disc in your computer and you can find the data in seconds.”
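A back-of-the-envelope check shows how 64 layers reach the terabyte range.  The per-layer figure below is an assumption (a standard Blu-ray layer holds roughly 25 GB), not a number from the Case Western work:

```python
# Rough capacity estimate (assumed figures, not from the researchers):
# if each of the 64 data layers stores about as much as one Blu-ray
# layer (~25 GB), the stacked disc lands in the terabyte range.
per_layer_gb = 25            # assumption: Blu-ray-like areal density
layers = 64
capacity_tb = per_layer_gb * layers / 1000   # ~1.6 TB
```

That estimate sits between one and two terabytes, consistent with both the article's two-terabyte ceiling and Singer's comparison to archival magnetic tape.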