June 2013

MOOC Provider edX More Than Doubles Its University Partners

Chronicle of Higher Education (05/21/13) Jeffrey R. Young

EdX announced that 15 additional universities have agreed to offer free massive open online courses, bringing the total membership to 27 institutions. The new partners include five universities in the United States, six in Asia, three in Europe, and one in Australia. EdX, a nonprofit provider of MOOCs founded by Harvard University and the Massachusetts Institute of Technology, aims to help colleges use technology to rethink campus education and deliver online courses. “What we hope to get out of our partnership with edX is actively learning from and building upon each other’s educational innovations,” says Kyoto University professor Toru Iiyoshi. Several professors recently have raised questions about the implications of free online courses, especially as colleges run pilot projects in which they ask students to watch edX video lectures and use edX professors. “It’s a good thing that people are debating and discussing all the issues of this transformational technology,” says edX president Anant Agarwal. “The way we look at it is this is increasing choice.” Agarwal notes there currently are more than 900,000 people enrolled in edX programs.


A master’s-level computer science degree, delivered via MOOCs

Summary: Massive open online courses will soon deliver an advanced comp-sci degree at a very, very low price, courtesy of Georgia Tech, Udacity and AT&T.

The disruption of the economics of higher education is providing new opportunities to refresh and expand IT skills at little or no cost. It couldn’t come at a better time for professionals worried about falling behind, or for organizations scrambling to find skills for a deeper move into the digital realm. The Georgia Institute of Technology’s College of Computing has said that it will offer the first Online Master of Science degree in computer science (OMS CS) that can be earned entirely through the massive open online course (MOOC) format. The degree will be delivered via the Udacity MOOC platform, with support from AT&T.


Running Stochastic Models on HTCondor

HPC Wire (05/30/13) Ian Armas Foster

Research by Brigham Young University’s Spencer Taylor applied the open source HTCondor software to a water resource model called Gridded Surface Subsurface Hydrologic Analyst (GSSHA), which requires the kind of computationally intensive stochastic simulation found in many scientific fields. HTCondor executes jobs on a local network by harnessing computing power from idle systems. The resulting tests demonstrate that HTCondor is a viable alternative to acquiring extra high-performance computing (HPC) resources for mid-level research institutions. The purpose of the project was to enable such institutions to combine their computing base with existing HPC resources both on site and in the cloud. “We found that performing stochastic simulations with GSSHA using the HTCondor system significantly reduces overall computational time for simulations involving multiple model runs and improves modeling efficiency,” Taylor contends. Because HTCondor opportunistically draws on whatever machines are idle, he says, each stochastic model ran on a different number of processors, ranging from roughly 80 to 140. “As expected, with about 100 times the computational power of normal circumstances, I was able to essentially reduce the runtime by a factor of 100,” Taylor reports. He also cites the possibility of utilizing commercial cloud resources as part of the HTCondor base.
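In practice, a batch of stochastic runs like this is handed to HTCondor through a submit description file, which tells the scheduler what to execute and how many independent jobs to queue on idle machines. The sketch below is illustrative only: the executable name, argument convention, and run count are assumptions for this example, not details from Taylor's actual GSSHA setup.

```
# Hypothetical HTCondor submit file queuing 100 independent
# stochastic model runs; $(Process) is 0..99, so each job
# gets its own parameter index and output files.
executable = run_gssha_realization.sh
arguments  = $(Process)
output     = gssha_run_$(Process).out
error      = gssha_run_$(Process).err
log        = gssha_batch.log
request_cpus = 1
queue 100
```

Submitting this with `condor_submit` lets HTCondor place each realization on whatever idle workstation is available, which is how a "factor of 100" speedup from roughly 100 machines arises without any changes to the model code itself.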


Hip-Hip-Hadoop: Data Mining for Science

Texas Advanced Computing Center (05/24/13) Aaron Dubrow

In 2010, the Texas Advanced Computing Center (TACC) at the University of Texas at Austin began experimenting with Hadoop to test the technology’s applicability to scientific problems. TACC researchers won a Longhorn Innovation for Technology Fund (LIFT) grant to build a Hadoop-optimized cluster on Longhorn. “The LIFT grant let us add local drives and storage to enable researchers to do experimental Hadoop-style studies on a current production system,” says TACC’s Weijia Xu. The system enables researchers to run Hadoop on 48 eight-processor nodes of TACC’s Longhorn cluster in a coordinated way, with accompanying large-memory processors. Intel also has been working with TACC to assess how the company’s newly developed hardware affects the performance of Hadoop applications. For example, researchers at Intel and TACC recently described experiments running Hadoop over Intel’s 10GBASE-T network adapters. Xu currently is applying data-mining and machine-learning techniques to study health communication. “While connecting users to those whom they may never be able to connect to otherwise, online communities present a new information environment that does not operate under the old publishing paradigm,” says University of Texas researcher Yan Zhang. “This creates new challenges for users to access and evaluate information.”
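The "Hadoop-style" studies mentioned above rest on the MapReduce pattern: a map step emits key–value pairs from each input record, and a reduce step aggregates the pairs by key, with Hadoop handling the parallel distribution across nodes. As a minimal local sketch of that pattern (not TACC's actual workload; the tiny corpus and function names here are invented for illustration), counting term frequencies across documents looks like this:

```python
# Local illustration of the MapReduce pattern that Hadoop
# parallelizes across a cluster; here both phases run in-process.
from collections import defaultdict

def map_phase(doc):
    # map: emit a (term, 1) pair for every token in one document
    for token in doc.lower().split():
        yield token, 1

def reduce_phase(pairs):
    # reduce: sum the emitted counts for each distinct term
    counts = defaultdict(int)
    for term, n in pairs:
        counts[term] += n
    return dict(counts)

# Hypothetical mini-corpus standing in for forum posts.
docs = ["coping with a new diagnosis", "questions about my diagnosis"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
term_counts = reduce_phase(pairs)
print(term_counts["diagnosis"])  # 2
```

On a real Hadoop cluster the same two functions would run as separate mapper and reducer tasks (e.g., via Hadoop Streaming), with the framework shuffling pairs so that all values for one key reach the same reducer.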