
Your College Gets a Supercomputer! And Yours, and Yours!

Chronicle of Higher Education (08/10/09) Young, Jeffrey R.

There may soon be a supercomputer for every college, thanks to declining assembly costs and the growing power of these systems. Monmouth College, for instance, built a homemade supercomputer out of dozens of old high-end computers purchased on eBay for about $200 per unit. Meanwhile, the U.S. government has constructed a series of shared supercomputers that any college can access online. Supercomputers are critical tools for modeling complex phenomena, and also are instrumental in many projects that display high-resolution images of data. A 2006 report from the National Science Foundation (NSF) contends that increasing supercomputer access is vital to maintaining the competitiveness of U.S. research, and says that “problems of national need” will not be addressed unless more schools and more professors can use supercomputers to tackle their biggest challenges. Last year several colleges initiated a program to spread awareness about supercomputing by designating a professor at each participating campus as a proselytizer and tech-support contact for the NSF-supported TeraGrid supercomputer network. Monmouth professor Christopher G. Fasano, who put together the institution’s supercomputer, is concerned that “a new kind of digital divide” could emerge if small colleges do not make appropriate supercomputing investments and thus fail to draw the best students and researchers. He says exposure to supercomputing is becoming essential for students, especially those headed to graduate school in technical disciplines.


Virtual Worlds May Be the Future Setting of Scientific Collaboration

PhysOrg.com (08/04/09) Zyga, Lisa

The California Institute of Technology, Princeton University, Drexel University, and the Massachusetts Institute of Technology have formed the first professional scientific organization based entirely in a virtual world. The Meta Institute for Computational Astrophysics (MICA) conducts professional seminars, popular lectures, and other events entirely online in virtual worlds. MICA is based in Second Life, where participants use avatars to explore and interact with their surroundings, and the organization plans to expand into other worlds when appropriate. MICA also is working to establish collaborative partnerships with the IT industry, including Microsoft and IBM. In addition to uniting people in an innovative, free, and convenient way, virtual worlds can provide new possibilities for scientific visualization, or visual analytics. As data sets become increasingly large and complicated, visualization can help researchers better understand different phenomena. Also, virtual worlds can allow researchers to immerse themselves in data and simulations, helping them think differently about data and patterns.


Intel Taps Facebook Multitudes for Massive Research Efforts

TechNewsWorld (08/04/09) Morphy, Erika

Intel has launched Progress Thru Processors, a new Facebook application that provides users with an interface for donating their unused computer processing time to worthy causes. Progress Thru Processors runs as a background process on the user’s computer and automatically directs idle processing power to the researchers’ computational efforts. When a computer user needs more processing power, the application automatically switches to idle mode. Intel is working with GridRepublic, a nonprofit organization that channels spare processing power donated by volunteers to projects in need of computing resources. The desktop client and application used in Progress Thru Processors are based on software developed by the BOINC project at the University of California, Berkeley, which was funded by the National Science Foundation. Intel’s John Cooney says the application is unique in that it makes volunteer computing more attainable and user friendly. GridRepublic executive director Matt Blumberg says adding a viral element could help build awareness for volunteer computing. “By providing a widget that makes it easy for people to participate and that lets them update their friends, I would like to think that all or most of Facebook’s users will at least become aware of it,” Blumberg says. Initially, Progress Thru Processors will focus on only three research efforts, but Cooney says more may be added if the program proves popular. “We wanted a good cross section of project areas, but for our launch we needed to start out with a finite number,” he says.
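
To make the donation pattern concrete, here is a minimal sketch in C of the loop the article describes: compute on a research work unit while the machine is idle, and yield as soon as the owner needs the processor. The function names are illustrative placeholders, not the actual BOINC or GridRepublic API.

    /* Hedged sketch of the idle-cycle donation loop described above.
       The helper functions are hypothetical stand-ins, not real BOINC calls. */
    #include <stdbool.h>
    #include <stdio.h>
    #include <unistd.h>

    /* Stand-in for a platform-specific check, e.g., recent keyboard/mouse input. */
    static bool user_is_active(void)
    {
        return false;  /* this sketch pretends the machine is always idle */
    }

    /* Stand-in for one slice of donated computation on a downloaded work unit. */
    static void process_work_unit_chunk(void)
    {
        puts("crunching one chunk of a research work unit...");
        sleep(1);
    }

    int main(void)
    {
        for (;;) {
            if (user_is_active())
                sleep(5);                  /* yield: the owner's tasks come first */
            else
                process_work_unit_chunk(); /* idle: donate the cycles */
        }
    }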


Inexpensive Parallel Processing: Programming Tools Facilitate Use of Video Game Processors for Defense Needs

Georgia Tech researchers have reprogrammed the Vector, Signal and Image Processing Library (VSIPL) to run on graphics processing units (GPUs), such as the one shown here. (Georgia Tech Photo: Gary Meek)

Georgia Tech Research Institute (07/31/09) Englehardt, Kirk J.; Toon, John

Researchers at the Georgia Tech Research Institute (GTRI) are developing a programming tool that will enable defense industry engineers to use graphics processing units (GPUs) without having to learn to program them. “As radar systems and other sensor systems get more complicated, the computational requirements are becoming a bottleneck,” says GTRI researcher Daniel Campbell. “We are capitalizing on the ability of GPUs to process radar, infrared sensor, and video data faster than a typical computer and at a much lower cost and power than a computing cluster.” Georgia Tech professor Mark Richards is working with Campbell and graduate student Andrew Kerr to rewrite common signal-processing commands to run on a GPU. The researchers are writing the functions defined in the Vector, Signal, and Image Processing Library (VSIPL), an open standard. Currently, the researchers are writing the functions in Nvidia’s CUDA language, but the underlying principles can be applied to GPUs developed by other companies, Campbell says. The resulting GPU VSIPL will enable engineers to use high-level functions in C programs to perform linear algebra and signal-processing operations, then recompile against GPU VSIPL to capitalize on the speed of GPUs. In the future, the researchers plan to develop other defense-related GPU function libraries and to design programming tools that support other highly efficient processors, such as the Cell Broadband Engine processor used in the PlayStation 3 game console.
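
The appeal of this approach is that existing VSIPL code should not need to change. The sketch below, assuming the standard VSIPL C header and its core vector functions, shows the style of code involved: the same source can be linked against a conventional CPU implementation or recompiled against GTRI’s GPU-backed library. Treat it as an illustration of the API style, not a tested GPU VSIPL program.

    /* Illustrative VSIPL-style C: elementwise vector math through the
       standard API. Linked against GPU VSIPL, the same calls would run on
       the graphics card; header name and build details are assumptions. */
    #include <vsip.h>

    int main(void)
    {
        vsip_init((void *)0);   /* initialize the library */

        /* Create three 1024-element single-precision vector views. */
        vsip_vview_f *a = vsip_vcreate_f(1024, VSIP_MEM_NONE);
        vsip_vview_f *b = vsip_vcreate_f(1024, VSIP_MEM_NONE);
        vsip_vview_f *c = vsip_vcreate_f(1024, VSIP_MEM_NONE);

        vsip_vfill_f(1.0f, a);  /* a[i] = 1.0 */
        vsip_vfill_f(2.0f, b);  /* b[i] = 2.0 */
        vsip_vadd_f(a, b, c);   /* c = a + b, elementwise */

        vsip_valldestroy_f(c);
        vsip_valldestroy_f(b);
        vsip_valldestroy_f(a);
        vsip_finalize((void *)0);
        return 0;
    }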

To read more:

http://www.gtri.gatech.edu/news/programming-tools-facilitate-use-video-game-proces

Time for Computer Science to Grow Up

Communications of the ACM (Vol. 52, No. 8), August 2009

Engineering and computer science professor Lance Fortnow argues that the computer science field, now more than 50 years old, needs to adopt the conference and journal model that has worked well in every other academic field. Fortnow argues that the current conference system forces researchers to focus too heavily on quick, technical, and safe papers instead of considering broader and newer ideas. Meanwhile, academics devote too much time and money to conferences where they can present papers, leaving them less time to attend events where they can socialize with colleagues. Fortnow first highlights the reasons why conferences have become more prestigious than journals as a way to disseminate computing research results, and then explains why the current system is no longer appropriate for the demands of the computer science discipline. In the final analysis, academic journals should be the place to rate papers and researchers, while conferences should act as a broad forum to bring the computer science community together.

After reviewing how the conference and journal model for computer science came into being, Fortnow describes why it has led to a splintering of the computer science community. Because of limitations of money and time, very few conferences draw many attendees beyond the authors of accepted papers. Conferences now serve the journal role of other fields, leaving nothing to serve the proper role of conferences. Moreover, having different publication procedures discourages proper collaboration between researchers in computer science and other fields. The final result is a situation in which too many papers chase too few conference slots. To maintain their quality reputations, conference program committees have become biased toward safe papers (incremental and technical) versus those that explore new models and research directions outside the established core areas of the conference.

So how do we move a field with a long tradition of conference publications to a more journal-based system? Fortnow suggests that computer science needs organizations like ACM and the CRA to help break the inertia in the system and support the growth of a strong journal publication system. While conferences were preferable to print publications for several reasons during the 1990s, the growth of computer science and advances in Web technology have changed everything. For example, quick dissemination via the Web makes time to print less relevant. In Fortnow’s opinion, leaders and sponsors of major conferences must make the first move, holding their conferences less frequently and de-emphasizing their publication role in favor of more community interaction.

To read more, please follow the link below:

http://cacm.acm.org/magazines/2009/8/34492-time-for-computer-science-to-grow-up/fulltext

U.S. supercomputing lead rings Sputnik-like alarm for Russia

Computerworld – Russia’s launch of Sputnik in 1957 triggered a crisis of confidence in the U.S. that helped drive the creation of its space program. Now, Russia is comparing its own achievements in supercomputing with those of the U.S., and it does not like what it sees.

To read more:

http://www.computerworld.com/s/article/9136005/U.S._supercomputing_lead_rings_Sputnik_like_alarm_for_Russia

Seeking efficiency, scientists run visualizations directly on supercomputers

(PhysOrg.com) — If you wanted to perform a single run of a current model of the explosion of a star on your home computer, it would take more than three years just to download the data. To do cutting-edge astrophysics research, scientists need a way to more quickly compile, execute, and especially visualize these incredibly complex simulations.
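
For a rough sense of scale (our arithmetic, not a figure from the article): a sustained 2009-era broadband link of 10 megabits per second moves about 1.25 megabytes each second, so three years of continuous downloading (roughly 9.5 × 10^7 seconds) comes to about 1.2 × 10^14 bytes, on the order of 100 terabytes. A single simulation run’s output is simply too large to ship to a desktop, which is why the visualization has to happen on the supercomputer, where the data already lives.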


To read more, click the link below:

http://www.physorg.com/news168188994.html