April 2011

Cloud Computing, Data Policy on Track to ‘Democratize’ Satellite Mapping.

South Dakota State University (03/24/11)

New U.S. Geological Survey data policies and advances in cloud computing are democratizing satellite mapping, which could widen access to information about the earth through platforms such as Google Earth Engine. “This is an incredible advantage in terms of generating the value-added products that we create for quantifying deforestation, natural hazards, cropland area, urbanization, you name it,” says South Dakota State University (SDSU) professor Matt Hansen. He says free satellite images, coupled with the cloud computing capability offered by Google and similar organizations, are making it possible for ordinary users to analyze satellite imagery without costly hardware. Hansen and SDSU postdoctoral researcher Peter Potapov collaborated with Google to process more than 50,000 images and generate a detailed map of Mexico, demonstrating the technology’s potential. Enhanced publicly available processing tools will democratize satellite data processing as more people become engaged in working with the data. However, Hansen notes that this will entail greater collaboration between academics, government scientists, and private industry in processing and characterizing the satellite data sets.


Researchers in Taiwan to Use Volunteer Computing to Visualize Earthquakes.

AlphaGalileo : E-Science Talk (03/28/11)

Researchers in Taiwan have set up Shakemovie@home in an attempt to reduce the time it takes to create animations that simulate the ground motion of earthquakes. Shake movies take several hours to create because intensive calculations must be performed on models of both the earthquake and the earth’s structure. With Shakemovie@home, researchers at the Institute of Earth Sciences at Academia Sinica will instead compute, save, and store the most expensive elements, called Green’s functions, in advance, and retrieve them as they are needed; that retrieval will be farmed out to volunteers’ computers. Because the Green’s functions depend only on the earth’s model, they will not have to be recalculated for every event, so researchers will be able to make a new shake movie in just minutes. “By distributing this task to volunteers, to computers at home, we can get a better and faster way of making shake movies,” says Academia Sinica professor Li Zhao. “Now we have shake movies in a few hours, but with volunteer computing we could have it in minutes.”
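The precompute-and-reuse idea above can be illustrated with a minimal sketch. All names here (`green_function`, `simulate_motion`, `GREEN_CACHE`) are invented for illustration and are not from the Shakemovie@home project; the "Green's function" is a toy delayed pulse standing in for the real earth-model computation.

```python
import numpy as np

def green_function(station_distance_km, n=512, dt=0.1):
    """Toy Green's function: a decaying, delayed pulse standing in for
    the expensive earth-model calculation that normally takes hours."""
    t = np.arange(n) * dt
    delay = station_distance_km / 6.0  # rough travel time at ~6 km/s
    return np.exp(-0.5 * (t - delay)) * (t >= delay)

# Precompute once for each station -- the step that Shakemovie@home
# farms out to volunteer computers -- and keep the results cached.
GREEN_CACHE = {d: green_function(d) for d in (30.0, 60.0, 90.0)}

def simulate_motion(source, station_distance_km):
    """Fast path for each new earthquake: convolve the source with a
    cached Green's function instead of recomputing it from the model."""
    g = GREEN_CACHE[station_distance_km]
    return np.convolve(source, g)[: len(g)]

source = np.zeros(512)
source[0] = 1.0  # impulsive earthquake source
motion = simulate_motion(source, 60.0)  # ground motion at 60 km
```

Each new shake movie then costs only a convolution per station, which is why the turnaround drops from hours to minutes once the Green's functions are banked.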


Multicore Coding Standards Aim to Ease Programming.

IDG News Service (03/29/11) Agam Shah

The Multicore Association has established specifications for a programming model designed to make it easier to write software for multicore chips, particularly those used in smartphones, tablets, and embedded systems. The association is developing a set of foundation application programming interfaces (APIs) to standardize communication, resource sharing, and virtualization. It has completed the multicore communication API (MCAPI) and the multicore resource API (MRAPI), and is working to develop more tools and APIs involving virtualization. “The primary goal for all parties is to establish portability,” says Multicore Association president Markus Levy. He says that a consistent programming model will make it easier to reuse applications on different platforms. “By using MCAPI, the embedded applications code does not need to be aware of the inter-core communications method,” says Mentor Graphics’ Colin Walls. MCAPI lets programmers write multicore-enabled application code once and reuse it across a product line and in next-generation devices, says PolyCore Software CEO Sven Brehmer. MCAPI will be used in telecom and data communications infrastructures, in addition to medical devices, high-performance computing, and military and aeronautics equipment, Brehmer says.
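Walls' point about hiding the inter-core transport can be sketched conceptually. MCAPI itself is a C API; the Python below does not reproduce its real signatures — `Endpoint`, `msg_send`, and `msg_recv` are invented here purely to mirror its endpoint-based message-passing model.

```python
import queue

class Endpoint:
    """A named message port on a (simulated) core, loosely analogous
    to an MCAPI endpoint identified by a node and port number."""
    def __init__(self, node, port):
        self.node, self.port = node, port
        self._inbox = queue.Queue()

def msg_send(src, dest, payload):
    # The application only names endpoints; whether delivery happens over
    # shared memory, a hardware mailbox, or a network is the transport's
    # concern -- which is exactly the portability point Walls makes.
    dest._inbox.put(payload)

def msg_recv(endpoint):
    return endpoint._inbox.get()

core0 = Endpoint(node=0, port=1)
core1 = Endpoint(node=1, port=1)
msg_send(core0, core1, b"sensor frame")
data = msg_recv(core1)
```

Because application code touches only the endpoint abstraction, the same logic can be recompiled against a different MCAPI transport when the code moves to a new chip — the reuse Brehmer describes.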