
Report Says Computer Science Should Be Treated Like Core Science Subjects in K-12 Classrooms

Education World (06/01/16) Nicole Gorman

Only a fraction of U.S. schools offer computer science, and most lack the ability to teach students the core principles of the subject, according to a new report from the Information Technology & Innovation Foundation (ITIF). The report says current curricula and standards focus on using technology rather than understanding it. The study argues for significant changes to how computer science is taught in grades K-12, given how in demand computer science majors are now and will be in the future. “In 2011, Code.org projected that the economy would add 1.4 million computing jobs by 2020, but educate just 400,000 computer science students by then,” the study says. ITIF says a curriculum overhaul would improve students’ chances of success. “To maintain the field’s current momentum, the perception of computer science needs to shift from its being considered a fringe, elective offering or a skills-based course designed to teach basic computer literacy or coding alone,” the report says. First and foremost, the report recommends that the U.S. train and develop 10,000 additional computer science teachers. It also calls for innovative education policy that favors teaching computer science principles in both K-12 and university classrooms.

MORE

Chameleon: Why Computer Scientists Need a Cloud of Their Own

HPC Wire (05/05/16) Tiffany Trader

In less than a year of operation, the U.S. National Science Foundation-funded Chameleon cloud testbed has contributed to innovative research in high-performance computing (HPC) containerization, exascale operating systems, and cybersecurity. Chameleon principal investigator Kate Keahey, a Computation Institute fellow at the University of Chicago, describes the tool as “a scientific instrument for computer science where computer scientists can prove or disprove hypotheses.” Co-principal investigator Dan Stanzione, executive director of the Texas Advanced Computing Center at the University of Texas at Austin, says Chameleon can meet the oft-denied request from the software and computer science research community to make fundamental changes to the way the machine operates. With Chameleon, users can configure and test distinct cloud architectures on a variety of problems, including machine learning, adaptive operating systems, climate modeling, and flood prediction. Keahey says support for research at multiple scales was a key design element of the instrument. One project using Chameleon compares the performance of containerization and virtualization for HPC applications; Keahey calls it “a good example of a project that really needs access to scale.” Another major Chameleon user is the Argo Project, an initiative for designing and prototyping an exascale operating system and runtime.
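The article includes no code, but as a rough sketch of the kind of measurement a containerization-versus-virtualization comparison involves, one might time an identical compute kernel under each environment and compare wall-clock results. The script below is a hypothetical illustration, not Chameleon's or the project's actual tooling; the kernel, sizes, and environment labels are placeholders.

```python
# Hypothetical sketch of a container-vs-VM timing comparison: run the
# same compute kernel in each environment and compare wall-clock time.
# This is not Chameleon's tooling; the environment label is supplied by
# whoever launches the script in a given setting.
import sys
import time

import numpy as np

def hpc_kernel(n=2048, reps=5):
    """Stand-in compute kernel: repeated dense matrix multiplication."""
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    start = time.perf_counter()
    for _ in range(reps):
        a @ b
    return time.perf_counter() - start

if __name__ == "__main__":
    # e.g. run once inside a container and once inside a VM guest:
    #   python bench.py container   /   python bench.py vm
    label = sys.argv[1] if len(sys.argv) > 1 else "bare-metal"
    print(f"{label}: {hpc_kernel():.2f} s")
```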

MORE


The Internet Was Delivered to the Masses; Parallel Computing Is Not Far Behind

Virginia Tech News (08/21/14)

Virginia Polytechnic Institute and State University (Virginia Tech) professor Wu Feng has been a leader in the field of parallel computing, especially in relation to biomedicine. In the mid-2000s, Feng worked on a multi-institutional project that combined the capabilities of supercomputers at six U.S. institutions, an early example of parallel computing in the cloud. Feng recently has brought together funding from a wide range of sources, including the U.S. National Science Foundation and the National Institutes of Health, to pursue research into making the power of parallel computing more readily available to everyone. He says this has become increasingly important as the rate at which data is generated dramatically outstrips the pace of advances in computing power. Feng says handling the vast and ever-growing amounts of data generated across many fields and industries requires access to the power of parallel computing. To that end, Virginia Tech is establishing a new research center, Synergistic Environments for Experimental Computing, which will focus on multidisciplinary efforts to design new algorithms, software, and hardware in five areas of synergistic computing: the intersection of computer and physical systems, the health and life sciences, business and financial analytics, cybersecurity, and scientific simulation.

MORE

High-Performance Data Replication Across Cloud Servers

Phys.Org (06/24/14)

Computer scientists in China have developed a system that provides high-performance data replication across cloud servers. The approach keeps data synchronized across servers, a major problem facing widespread adoption of cloud services. Earlier attempts to limit data conflicts for users accessing many different servers around the globe lack scalability and are slow, according to Anirban Kundu, Lin Luan, and Ruopeng Liu of the Kuang-Chi Institute of Advanced Technology in Shenzhen. They say their approach to cloud synchronization is so efficient that only low-category cloud servers are needed to implement it. The cloud server-side environment uses a network structure with distinct levels that control data transfer in different segments, based on user and system requirements, and ferries information packets through dedicated channels used in parallel as required. Parallel execution reduces the time needed to finish a task: a distributor/scheduler divides the task into several asynchronous activities on different machines and then brings the individual parts back together in the appropriate order for the end user or output.
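The researchers' own implementation is not shown in the article. Purely as a minimal illustration of the distributor/scheduler pattern described above (split a task into asynchronous pieces on different workers, then reassemble the partial results in order), a Python sketch might look like the following; the workload and function names are stand-ins, not the Kuang-Chi system.

```python
# Minimal sketch of a distributor/scheduler: split the input into chunks,
# process the chunks asynchronously on separate worker processes, and
# reassemble the partial results for the end user. The word-counting
# workload is a placeholder for real per-segment server work.
from concurrent.futures import ProcessPoolExecutor

def process_chunk(chunk):
    """Stand-in for the work done on one server for one data segment."""
    return sum(len(line.split()) for line in chunk)

def distribute(task_data, n_workers=4):
    # Distributor: stripe the input into one chunk per worker.
    chunks = [task_data[i::n_workers] for i in range(n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        # map() yields results in submission order, so the partial
        # results come back in the order the chunks were handed out.
        partials = list(pool.map(process_chunk, chunks))
    # Reassembly step: combine the partial results into the final answer.
    return sum(partials)

if __name__ == "__main__":
    lines = ["the quick brown fox"] * 1000
    print(distribute(lines))  # total word count computed in parallel
```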

MORE

12 Predictions for the Future of Programming

InfoWorld (02/03/14) Peter Wayner 

Forecasting the next hit technology is challenging due to the rapid pace of change, but InfoWorld offers 12 predictions for the future of programming over the next five years. One prediction is that the graphics processing unit (GPU) will take center stage from the central processing unit, as applications are increasingly written to exploit the parallel architecture of GPUs. In addition, databases are likely to handle increasingly advanced analysis, running more sophisticated algorithms on tables, searching more efficiently for data patterns, and carrying out big data tasks. JavaScript and Android will grow increasingly dominant in programming. The Internet of Things will multiply the number of new programming platforms, with automobiles being the most significant. Open source code will continue to face the challenge of finding revenue to support development. WordPress Web applications will proliferate, as dominant frameworks offer functionality that makes it unnecessary to create apps from scratch. Plug-ins will increasingly supplant full-fledged programs. The command line will remain relevant because it is too flexible and universal to be replaced, and it will continue to work alongside modern tools. Both outsourcing and insourcing will continue, as work is sent out to reduce costs but also brought back in-house via new automated tools.

MORE

We the Internet: Bitcoin Developers Seed Idea for Bitcloud

Phys.Org (01/27/14) Nancy Owano 

Bitcoin developers want to decentralize the current Internet and replace it with a new one. The Bitcloud developer group has proposed a peer-to-peer system for sharing bandwidth, enabling individual users to perform tasks such as routing, storage, or computation in exchange for payment. The Bitcloud approach would enable nodes on a brand-new mesh network to be rewarded financially for routing traffic, and the researchers note their cash model would eliminate the need for Internet service providers. The proposal is based on the ideas of Bitcoin, Mediaglobin, and Tor, and the researchers are looking for more developers to assist with the project. They point out that the initiative demands a “massive amount of thought and development in many different parts of the protocol, so we need as many people helping as possible.” Cloudcoins would be the system’s currency and would operate similarly to bitcoins, the currency of the Bitcoin protocol. “If you’re interested in privacy, security, ending Internet censorship, decentralizing the Internet, and creating a new mesh network to replace the Internet, then you should join or support this project,” according to a recent appeal on Reddit.
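The protocol itself is only sketched in the article. Purely as a toy illustration of the incentive idea (nodes accumulating credit for traffic they relay), and not the Bitcloud protocol or its data structures, a minimal Python sketch might look like this; the credit rate and names are invented.

```python
# Toy illustration of the incentive idea behind Bitcloud: nodes earn
# credit ("cloudcoins") in proportion to the traffic they relay for the
# mesh. This is not the Bitcloud protocol; the rate and ledger structure
# are hypothetical stand-ins.
from collections import defaultdict

RATE_PER_GB = 0.01  # hypothetical cloudcoins credited per gigabyte relayed

class ToyLedger:
    def __init__(self):
        self.balances = defaultdict(float)

    def record_relay(self, node_id: str, gigabytes: float) -> None:
        """Credit a node for traffic it routed on behalf of the network."""
        self.balances[node_id] += gigabytes * RATE_PER_GB

ledger = ToyLedger()
ledger.record_relay("node-a", 120.0)   # node-a relayed 120 GB
ledger.record_relay("node-b", 35.5)    # node-b relayed 35.5 GB
print(dict(ledger.balances))           # accumulated cloudcoin credit per node
```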

MORE

Researchers Implement HPC-First Cloud Approach

HPC Wire (01/29/14) Tiffany Trader 

North Carolina State University researchers have demonstrated a proof-of-concept for a novel high-performance cloud computing platform by merging a cloud computing environment with a supercomputer. The implementation shows that a fully functioning production cloud computing environment can be completely embedded within a supercomputer, allowing users to benefit from the underlying high-performance computing hardware infrastructure. The supercomputer’s hardware provided the foundation for a software-defined system capable of supporting a cloud computing environment. This “novel methodology has the potential to be applied toward complex mixed-load workflow implementations, data-flow oriented workloads, as well as experimentation with new schedulers and operating systems within an HPC environment,” the researchers say. The software utility package Kittyhawk serves as a provisioning engine and offers basic low-level computing services within a supercomputing system; it is what allowed the researchers to construct an embedded elastic cloud computing infrastructure within the supercomputer. The HPC-first design approach to cloud computing leverages the “localized homogeneous and uniform HPC supercomputer architecture usually not found in generic cloud computing clusters,” according to the researchers. This type of system has the potential to support multiple workloads, including traditional HPC simulation jobs, workflows that combine HPC and non-HPC analytics, and data-flow oriented work.
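As a conceptual sketch only (not Kittyhawk's actual interface or the researchers' code), the HPC-first idea amounts to carving nodes out of the supercomputer's pool, booting a cloud worker image on each, and registering the result with a cloud controller. Every name below is a hypothetical stand-in for that provisioning layer.

```python
# Conceptual sketch of an HPC-first embedded cloud: allocate nodes from
# the supercomputer, boot a cloud worker image on each, and register it
# with a cloud controller so the cloud can grow elastically inside the
# machine. All names are hypothetical; this is not Kittyhawk's API.
from dataclasses import dataclass, field
from itertools import count

_node_ids = count(1)  # stand-in for the supercomputer's node pool

@dataclass
class CloudController:
    workers: list = field(default_factory=list)

    def register(self, node_id: int, image: str) -> None:
        self.workers.append((node_id, image))

def allocate_hpc_node() -> int:
    """Stand-in for requesting one node from the provisioning engine."""
    return next(_node_ids)

def grow_cloud(controller: CloudController, n_nodes: int, image: str) -> None:
    # Elastic growth: claim n_nodes from the HPC machine for the cloud.
    for _ in range(n_nodes):
        controller.register(allocate_hpc_node(), image)

cloud = CloudController()
grow_cloud(cloud, n_nodes=4, image="cloud-compute-image")
print(cloud.workers)  # four supercomputer nodes now acting as cloud workers
```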

MORE

MIT Debuts Online Big Data Course for Tech Pros

Network World (01/09/14) Ann Bednarz

The Massachusetts Institute of Technology (MIT) will offer an online big data course for technology professionals as part of its new lineup of Online X professional programs. The course, “Tackling the Challenges of Big Data,” will run from March 4 to April 1, and will cover data collection from smartphones, sensors, and the Web. The course also will address data storage and processing, including scalable relational databases, Hadoop, and Spark; analytics such as machine learning, data compression, and efficient algorithms; visualization; and a range of applications. MIT will use the Open edX platform to deliver the course, which will include learning assessments, case studies, discussion forums, and a community wiki as part of the experience. Faculty members from the Computer Science and Artificial Intelligence Laboratory will teach the course. Participants will receive an MIT Professional Education certificate for successfully completing the course, and will gain access to the group’s professional alumni network.

MORE

Data Scientists: IT’s New Rock Stars

Network World (01/03/14) Colin Neagle

Data scientists are emerging as some of the most sought-after professionals in today’s technology job market.  Technology and media firms are particularly interested in hiring data scientists, and the field is garnering increasing media attention.  Industry observers advise students planning to enter the IT field to pursue data science.  The Harvard Business Review in October 2012 labeled data scientist as “the sexiest job of the 21st century.”  In addition, data science was spotlighted in a recent American Journalism Review profile of Buzzfeed data science director Ky Harlin, who is responsible for the company’s viral content insights and developed his own algorithms to determine when and why specific pieces of Web content go viral.  Harlin learned his skills at a medical-imaging company, and was recruited by Buzzfeed founder Jonah Peretti.  He notes that both the medical-imaging and content-publishing fields look for patterns in vast data sets.  “This is where you add real business value,” Peretti says, “where an IT person is not just running machines anymore, but fundamentally taking good information and helping the business make true business decisions so that they can adjust the business in real time based on this information.”

MORE

Supercomputers: New Software Needed

InformationWeek (12/31/13) Patience Wait

Software now presents the greatest challenge in supercomputing due to the need for code that matches processing capability. The top spot in the most recent ranking of supercomputers, released in November, went to China’s National University of Defense Technology’s Tianhe-2 supercomputer, which reached a benchmark speed of 33.86 petaflops. The next goal is exascale computing, with speeds of a quintillion (a million trillion) calculations per second, which Argonne National Laboratory’s Mike Papka says is possible by 2020. One hurdle to building faster supercomputers is creating an operating system that can manage that many calculations per second. Argonne and two other national laboratories are addressing this challenge with the Argo project. High-performance computing also increasingly is focused on applicability to other technology developments such as big data and analytics, on open systems as opposed to proprietary systems, and on energy efficiency as a requirement, says Brocade’s Tony Celeste. “What matters is not acquiring the iron, but being able to run code that matters,” says DRC’s Rajiv Bendale. “Rather than increasing the push to parallelize codes, the effort is on efficient use of codes.”
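For scale, a brief worked comparison using the figures above (the arithmetic is added here, not taken from the article):

$$
33.86~\text{petaflops} = 3.386 \times 10^{16}~\text{FLOP/s},
\qquad
1~\text{exaflops} = 10^{18}~\text{FLOP/s} \approx 30 \times \text{Tianhe-2's benchmark result}.
$$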

MORE