April 2013

Future Challenges of Large-Scale Computing

HPC Wire (04/15/13)

NVIDIA chief scientist Bill Dally says in an interview that similar processor requirements in high-performance computing, Web servers, and big data will lead to a convergence on heterogeneous multicore processors, in which each socket will feature a small number of cores optimized for latency and many more cores optimized for throughput. Dally predicts that three-dimensional stacked chip technology will be essential to extending high-bandwidth on-package memory capacity. With budget austerity likely to cut U.S. government investment in exascale computing, Dally projects that industry will continue to move ahead in this field on its own, although at a much slower pace. He also is hopeful that the challenge of achieving sustained exaflops on a real application within 20 MW will be met, thanks to numerous emerging circuit, architecture, and software technologies that could enhance the energy efficiency of one or more parts of the system. Dally sees energy efficiency and programmability as the two biggest challenges on the path to exascale. He notes that research projects are underway to devise more productive programming systems and the tools that will enable automated mapping and tuning.
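The 20 MW target implies a concrete efficiency figure. A quick back-of-the-envelope check (assuming one sustained exaflop means 10^18 floating-point operations per second, which is the standard definition, not a number from the interview):

```python
# Back-of-the-envelope: energy efficiency required for exascale in 20 MW.
exaflops = 1e18          # FLOP/s, sustained (1 exaflop)
power_budget_w = 20e6    # 20 MW power envelope

flops_per_watt = exaflops / power_budget_w
print(f"{flops_per_watt / 1e9:.0f} GFLOPS per watt required")
# → 50 GFLOPS per watt required
```

Fifty GFLOPS per watt is roughly an order of magnitude beyond the most efficient systems of the early 2010s, which is why Dally singles out energy efficiency as a primary obstacle.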


The Rise of Big Data

Foreign Affairs (06/13) Kenneth Neil Cukier; Viktor Mayer-Schoenberger

Big data is transforming the way people experience the world and enabling them to learn things that in the past would have been impossible, offering a potential that could rival that of the Internet. The phenomenon is relatively new: as recently as 2000 only a quarter of stored information was digital, compared with more than 98 percent today. Big data refers not merely to a quantity of information but also to the ability to turn previously unquantified information into data. This “datafication” of the world, combined with cost-effective computer memory, powerful processors, smart algorithms, and improved software, is driving efforts to give a computer enough data to infer the probability of an event, which is taking the place of trying to teach a computer how to complete a task. Making use of big data requires three major shifts in approach. First, it requires the collection and use of very large amounts of data rather than small samples. Second, imperfect data is acceptable, because a huge volume of data of variable quality yields better results than a small quantity of clean, exact data. Finally, the focus with big data should be on finding correlations rather than causes.


HTC, Big Data and the God Particle

HPC Wire (03/29/13) Miha Ahronovitz

Contributor Miha Ahronovitz traces the history of high-throughput computing (HTC), noting the particularly enthusiastic response from the high-energy physics community and the role of HTC in such important discoveries as the Higgs boson. As one of the biggest generators of data, this community had been dealing with the “big data” deluge long before “big data” assumed its position as the buzzword du jour.