September 2016

Faster Parallel Computing

MIT News (09/13/16) Larry Hardesty

Researchers from the Massachusetts Institute of Technology’s (MIT) Computer Science and Artificial Intelligence Laboratory (CSAIL) this week are presenting Milk, a new programming language, at the 25th International Conference on Parallel Architectures and Compilation Techniques in Haifa, Israel. Milk lets application developers handle memory more efficiently in programs that deal with scattered data points in large datasets. In tests on several common algorithms, programs written in Milk ran four times as fast as those written in existing languages, and the CSAIL researchers think additional work will boost speeds even higher. MIT professor Saman Amarasinghe says existing memory management methods run into trouble with big datasets because the scale of the solution does not necessarily rise in proportion to the scale of the problem. Amarasinghe also notes modern computer chips are not optimized for such “sparse data”: their cores are designed to retrieve an entire block of data from main memory on the assumption of locality, rather than fetching a single data item individually. With Milk, a coder inserts a few additional lines of code around any command that iterates through a large dataset looking for a comparatively small number of items, and the researchers say Milk’s compiler then determines how to manage memory accordingly.
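The locality idea Milk’s compiler automates can be sketched in ordinary Python. This is an illustration of the principle only, not Milk’s actual syntax; all names below are hypothetical:

```python
# Illustration of the access-batching idea behind Milk (hypothetical
# names; not Milk syntax). Random lookups into a huge table touch a
# different cache line almost every time; visiting the same indices in
# sorted order lets nearby memory locations be fetched together.

def scattered_sum(table, indices):
    # Naive traversal: indices arrive in arbitrary order, so each
    # lookup may miss the cache.
    return sum(table[i] for i in indices)

def batched_sum(table, indices):
    # Milk-style idea: defer the lookups, group them by location,
    # then perform them in memory order.
    return sum(table[i] for i in sorted(indices))

table = list(range(1_000_000))
indices = [999_999, 3, 500_000, 7]
print(scattered_sum(table, indices) == batched_sum(table, indices))  # True
```

The two functions compute the same result; the payoff of the second form is purely in memory-access order, which is exactly what a compiler can rearrange on the programmer’s behalf.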


Revealed: Google’s Plan for Quantum Computer Supremacy

New Scientist (08/31/16) Jacob Aron

Google expects to have the world’s largest working quantum computer ready soon, as researchers say the company is on the verge of a breakthrough. Hints were dropped in July, when Google published a study announcing a plan to achieve “quantum supremacy” by building the first quantum computer that can perform a task beyond the capabilities of classical computers. Google has publicly announced a 9-quantum-bit (qubit) system, but its goal is a 50-qubit computer that can model the behavior of a random arrangement of quantum circuits. “They’re doing a quantum version of chaos,” says Simon Devitt at Japan’s RIKEN Center for Emergent Matter Science. Google set the goal it hopes to beat by pushing classical computing to its limit in simulating quantum circuit behavior on the Edison supercomputer, and it hired University of California, Santa Barbara professor John Martinis to design superconducting qubits. Devitt thinks quantum supremacy could be achieved by the end of 2017, although meeting the challenge even within the next five years would still be a major accomplishment. Building a 50-qubit quantum device would be the first step toward a fully scalable machine, which Devitt says will indicate the technology is “ready to move out of the labs.”
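The 50-qubit target is not arbitrary: a classical simulation must store one complex amplitude per basis state, so memory grows as 2^n with the qubit count. A back-of-the-envelope calculation (assuming 16-byte double-precision complex amplitudes, a common choice) shows why roughly 50 qubits sits at the edge of what classical machines can hold:

```python
# Memory needed to hold a full quantum statevector classically.
# n qubits -> 2**n basis states, one complex amplitude per state.
# Assumes 16 bytes per amplitude (complex128).

def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

print(statevector_bytes(9) / 1024)        # 8.0  -- 9 qubits: 8 KiB, trivial
print(statevector_bytes(50) / 1024 ** 5)  # 16.0 -- 50 qubits: 16 PiB
```

Sixteen pebibytes exceeds the total memory of any current supercomputer, which is what makes a 50-qubit random-circuit experiment a plausible supremacy demonstration.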


Transistors Will Stop Shrinking in 2021, Moore’s Law Roadmap Predicts

IEEE Spectrum (07/22/16) Rachel Courtland

The 2015 International Technology Roadmap for Semiconductors (ITRS) predicts the transistor could stop shrinking in just five years. The report predicts that after 2021, it will no longer be economically feasible for companies to continue to shrink the dimensions of transistors in microprocessors. Transistor miniaturization was still part of the long-term forecast as recently as 2014, but three-dimensional (3D) concepts have since gained momentum. A company could continue to make transistors smaller well into the 2020s, but the industry wanted to send the message that it is now more economical to go 3D, says ITRS chair Paolo Gargini. In the years before 3D integration is adopted, ITRS predicts leading-edge chip companies will seek to boost density by turning the transistor from a horizontal to a vertical geometry and building multiple layers of circuitry, one on top of another. The report also predicts the traditional silicon channel will be made with alternative materials. The changes will enable companies to pack more transistors into a given area, but keeping to the spirit of Moore’s Law is another matter.


Texas Goes Big With 18-Petaflop Supercomputer

Computerworld (06/02/16) Patrick Thibodeau

The Stampede 2 supercomputer to be installed at the Texas Advanced Computing Center (TACC) is designed to replace and approximately double the performance of Stampede, its 9-petaflop predecessor. The new system, which is scheduled to be available for research by next June, is being funded by a $30-million grant from the U.S. National Science Foundation. Stampede 2 will utilize Dell servers and Intel chips, and TACC also is upgrading Stampede with the addition of 500 Knights Landing-based Xeon Phi systems, each supporting as many as 72 cores, to raise its aggregate performance above 10 petaflops. The new supercomputer will incorporate 3D XPoint non-volatile memory technology, which is 1,000 times faster than NAND flash. “We anticipate [Stampede 2] will be the biggest machine in a U.S. university by next year,” says TACC executive director Dan Stanzione. He notes although Stampede has run 7 million jobs since its inception, TACC still gets five times as many requests for time on the system as it can deliver. Stanzione says Stampede 2 will help fulfill this backlog, while higher resolutions and more accurate modeling for large runs will be among its advantages, along with faster completion times for smaller jobs. TACC says Stampede and Stampede 2 will use about the same number of nodes, and it expects each of the 6,000 nodes to be capable of approximately 3 teraflops.
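The quoted node figures are consistent with the headline number: roughly 6,000 nodes at about 3 teraflops each gives 18 petaflops, double Stampede’s 9 petaflops.

```python
# Sanity-check the figures quoted above: nodes x per-node teraflops.
nodes = 6_000
tflops_per_node = 3.0
total_pflops = nodes * tflops_per_node / 1_000  # 1,000 teraflops = 1 petaflop
print(total_pflops)  # 18.0 -- double the 9-petaflop Stampede
```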


Report Says Computer Science Should Be Treated Like Core Science Subjects in K-12 Classrooms

Education World (06/01/16) Nicole Gorman

Only a fraction of U.S. schools offer computer science, and most lack the ability to teach students the core principles of the subject, according to a new report from the Information Technology & Innovation Foundation (ITIF). The report says curriculum and standards focus on using, rather than understanding, technology. The study argues there should be significant changes to how computer science is taught in grades K-12, given current and projected demand for computer science graduates. “In 2011, Code.org projected that the economy would add 1.4 million computing jobs by 2020, but educate just 400,000 computer science students by then,” the study says. ITIF says a curriculum overhaul would better prepare students for success. “To maintain the field’s current momentum, the perception of computer science needs to shift from its being considered a fringe, elective offering or a skills-based course designed to teach basic computer literacy or coding alone,” the report says. First and foremost, the report recommends the U.S. train and develop 10,000 additional teachers to teach computer science. There also needs to be a focus on creating innovative education policy that favors teaching computer science principles in both K-12 and university classrooms.


U.S. Gets Warnings and Advice About the Internet of Things

Computerworld (06/07/16) Patrick Thibodeau

In response to a U.S. Department of Commerce request for comments on the Internet of Things’ (IoT) potential, industry groups, including ACM’s U.S. Public Policy Council, submitted more than 130 responses detailing positive and negative aspects, which will form the basis of a green paper (a tentative government report). Booz Allen Hamilton stresses that the workforce will need to adapt to a reality in which some jobs become redundant while demand for others grows. Trends the consultancy forecasts include a need for data specialists “to analyze and remove noise from data, and privacy officers will need to analyze vulnerabilities and evolve policies.” Booz Allen also envisions crowdsourcing becoming more popular as a means for corporations “to access top talent on demand,” while emotional intelligence, creativity, “and the ability to deduce meaning from information” will be sought after. Meanwhile, the American Bar Association warns the IoT’s potential scale could become so vast that responding to a “disabling attack” could exceed “the capacity of any application vendor, the largest global device manufacturers, a self-help community within an industrial sector, or even national governments to address.” ACM cautions that although vendors can secure individual devices, once users start building devices into a “composable” infrastructure, there is no certainty that any safeguards will remain effective. The Electronic Privacy Information Center notes businesses could gain valuable commercial insights about customers with the IoT.


Google Moves Closer to a Universal Quantum Computer

Nature (06/08/16) Philip Ball

A research team has built an experimental prototype of a universal quantum computer that can solve a wide range of problems in fields such as chemistry and physics, and that has the potential to be scaled up to larger systems. The Google prototype combines the best of the analog and digital approaches to quantum computing. Google computer scientists and physicists at the University of California, Santa Barbara and the University of the Basque Country in Bilbao, Spain, used a row of nine solid-state quantum bits (qubits) fashioned from cross-shaped films of aluminum about 400 micrometers from tip to tip, deposited onto a sapphire surface and cooled until the metal became a superconductor with no electrical resistance. Information could be encoded into the qubits in their superconducting state, and interactions between neighboring qubits were controlled by logic gates that steer the qubits digitally into a state encoding the solution to a problem. The researchers say their approach should enable a computer with quantum error correction, and they predict devices with more than 40 qubits could be a reality within a couple of years. “At that point, it will become possible to simulate quantum dynamics that is inaccessible on classical hardware, which will mark the advent of ‘quantum supremacy,’” says the University of Southern California’s Daniel Lidar.