
Report Says Computer Science Should Be Treated Like Core Science Subjects in K-12 Classrooms

Education World (06/01/16) Nicole Gorman

Only a fraction of U.S. schools offer computer science, and most lack the ability to teach students the core principles of the subject, according to a new report from the Information Technology & Innovation Foundation (ITIF). The report says current curricula and standards focus on using technology rather than understanding it. The study argues that how computer science is taught in grades K-12 needs significant change, given how strong demand for computer science graduates is now and is projected to remain. “In 2011, Code.org projected that the economy would add 1.4 million computing jobs by 2020, but educate just 400,000 computer science students by then,” the study says. ITIF contends a curriculum overhaul would improve students’ chances of success. “To maintain the field’s current momentum, the perception of computer science needs to shift from its being considered a fringe, elective offering or a skills-based course designed to teach basic computer literacy or coding alone,” the report says. First and foremost, the report recommends the U.S. train and develop 10,000 additional computer science teachers. It also calls for innovative education policy that favors teaching computer science principles in both K-12 and university classrooms.

MORE


Chameleon: Why Computer Scientists Need a Cloud of Their Own

HPC Wire (05/05/16) Tiffany Trader

In less than a year of operation, the U.S. National Science Foundation-funded Chameleon cloud testbed has contributed to innovative research in high-performance computing (HPC) containerization, exascale operating systems, and cybersecurity. Chameleon principal investigator Kate Keahey, a Computation Institute fellow at the University of Chicago, describes the tool as “a scientific instrument for computer science where computer scientists can prove or disprove hypotheses.” Co-principal investigator Dan Stanzione, executive director of the Texas Advanced Computing Center at the University of Texas at Austin, says Chameleon can meet a request the software and computer science research community is usually denied: making fundamental changes to the way the machine operates. With Chameleon, users can configure and test distinct cloud architectures on various problems, such as machine learning, adaptive operating systems, climate modeling, and flood prediction. Keahey says support for research at multiple scales was a key design element of the instrument. One project using Chameleon compares the performance of containerization and virtualization for HPC applications; Keahey calls it “a good example of a project that really needs access to scale.” Another major Chameleon user is the Argo Project, an initiative for designing and prototyping an exascale operating system and runtime.

MORE

 

The Internet Was Delivered to the Masses; Parallel Computing Is Not Far Behind

Virginia Tech News (08/21/14)

Virginia Polytechnic Institute and State University (Virginia Tech) professor Wu Feng has been a leader in the field of parallel computing, especially as it relates to biomedicine. In the mid-2000s, Feng worked on a multi-institutional project that combined the capabilities of supercomputers at six U.S. institutions in an early example of parallel computing in the cloud. Feng recently has brought together funding from a wide range of sources, including the U.S. National Science Foundation and the National Institutes of Health, to pursue research into making the power of parallel computing more readily available to everyone. He says this has become increasingly important as the rate at which data is generated dramatically outstrips the pace of advances in computing power. Feng says handling the vast and ever-growing amounts of data generated across many fields and industries requires access to the power of parallel computing. To that end, Virginia Tech is establishing a new research center, Synergistic Environments for Experimental Computing, which will focus on multidisciplinary efforts to design new algorithms, software, and hardware in five areas of synergistic computing: the intersection of computer and physical systems, the health and life sciences, business and financial analytics, cybersecurity, and scientific simulation.
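
As a rough illustration of the data-parallel pattern Feng advocates, the sketch below splits a large dataset across worker processes and combines the partial results. It uses only the Python standard library, and all names in it are illustrative rather than drawn from Feng’s work.

```python
# Minimal data-parallel sketch: split a large dataset across worker
# processes, compute on each chunk independently, combine the results.
from multiprocessing import Pool

def analyze(record):
    # Stand-in for a real per-record computation, e.g., scoring one
    # sequence in a biomedical pipeline.
    return record * record

if __name__ == "__main__":
    dataset = range(1_000_000)            # stand-in for a large data stream
    with Pool() as pool:                  # one worker per available core
        partials = pool.map(analyze, dataset, chunksize=10_000)
    print(sum(partials))                  # combine the partial results
```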

MORE

High-Performance Data Replication Across Cloud Servers

Phys.Org (06/24/14)

Computer scientists in China have developed a system that provides high-performance data replication across cloud servers. The approach keeps data in sync, a major problem standing in the way of widespread adoption of cloud services. Earlier attempts to limit data conflicts for users accessing many different servers across the globe are slow and lack scalability, according to Anirban Kundu, Lin Luan, and Ruopeng Liu of the Kuang-Chi Institute of Advanced Technology in Shenzhen. They say their approach to cloud synchronization is so efficient that only low-category cloud servers are needed to implement it. The server-side environment uses a network structure with distinct levels that control data transfer in different segments, based on user and system requirements, and ferries information packets through dedicated channels used in parallel as required. Parallel execution finishes a task sooner by dividing it into several asynchronous activities on different machines via a distributor/scheduler and then reassembling the individual parts in the appropriate order for the end user or output.
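
The scatter/gather pattern described above can be sketched in a few lines. The following is a generic illustration, not the Kuang-Chi system: a scheduler submits asynchronous pieces of work, collects them as they finish in any order, and then restores the order the end user expects.

```python
# Scatter/gather sketch: divide a task into asynchronous activities,
# run them concurrently (threads stand in for separate machines),
# then reassemble the results in the appropriate order.
from concurrent.futures import ThreadPoolExecutor, as_completed

def replicate_chunk(chunk_id, payload):
    # Stand-in for pushing one data segment to a replica server.
    return chunk_id, f"replicated:{payload}"

chunks = list(enumerate(["alpha", "beta", "gamma", "delta"]))

with ThreadPoolExecutor(max_workers=4) as scheduler:
    futures = [scheduler.submit(replicate_chunk, i, p) for i, p in chunks]
    results = [f.result() for f in as_completed(futures)]  # finish in any order

results.sort(key=lambda pair: pair[0])        # restore the user-facing order
print([payload for _, payload in results])
```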

MORE

12 Predictions for the Future of Programming

InfoWorld (02/03/14) Peter Wayner 

Forecasting the next hit technology is challenging given the rapid pace of change, but InfoWorld offers 12 predictions for the future of programming over the next five years. One prediction is that the graphics processing unit (GPU) will take center stage from the central processing unit, as applications are increasingly written to exploit the parallel architecture of GPUs. In addition, databases are likely to handle increasingly advanced analysis, running more sophisticated algorithms on tables, searching more efficiently for data patterns, and carrying out big data tasks. JavaScript and Android will become increasingly dominant in programming. The Internet of Things will multiply the number of new programming platforms, with automobiles being the most significant. Open source projects will continue to face the challenge of finding revenue to support development. WordPress Web applications will proliferate, as dominant frameworks offer functionality that makes it unnecessary to create apps from scratch. Plug-ins will increasingly supplant full-fledged programs. The command line will remain relevant because it is too flexible and universal to be replaced, and it will continue to work with modern tools. Both outsourcing and insourcing will continue, as work is farmed out to reduce costs but also performed in-house by new automated tools.
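
As a hedged illustration of the GPU prediction, the sketch below uses the CuPy library, which is not mentioned in the article and is assumed here along with a CUDA-capable GPU; it shows NumPy-style array code running on a GPU’s parallel architecture instead of the CPU.

```python
# GPU offload sketch: move an array to GPU memory, run element-wise
# work on the GPU's parallel architecture, copy the result back.
import numpy as np
import cupy as cp   # assumption: CuPy installed, CUDA-capable GPU present

cpu_data = np.random.rand(1_000_000).astype(np.float32)

gpu_data = cp.asarray(cpu_data)         # copy the array to GPU memory
gpu_result = cp.sqrt(gpu_data) * 2.0    # element-wise work runs on the GPU

result = cp.asnumpy(gpu_result)         # copy the result back to the host
print(result[:5])
```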

MORE

We the Internet: Bitcoin Developers Seed Idea for Bitcloud

Phys.Org (01/27/14) Nancy Owano 

Bitcoin developers want to decentralize the current Internet and replace it with a new one. The Bitcloud developer group has proposed a peer-to-peer system for sharing bandwidth, in which individual users are paid for completing computing tasks such as routing, storage, or computation. Under the Bitcloud approach, nodes on a brand-new mesh network would be rewarded financially for routing traffic. The researchers note their cash model would eliminate the need for Internet service providers. The proposal builds on the ideas of Bitcoin, MediaGoblin, and Tor, and the researchers are looking for more developers to assist with the project. They point out that the initiative demands a “massive amount of thought and development in many different parts of the protocol, so we need as many people helping as possible.” Cloudcoins would be the system’s currency, operating much as bitcoins do within the Bitcoin protocol. “If you’re interested in privacy, security, ending Internet censorship, decentralizing the Internet, and creating a new mesh network to replace the Internet, then you should join or support this project,” according to a recent appeal on Reddit.
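
The incentive mechanism can be imagined as a simple ledger. The toy sketch below is entirely hypothetical, not the Bitcloud protocol; the reward rate and function names are invented for illustration only.

```python
# Toy incentive ledger (hypothetical, NOT the Bitcloud protocol):
# nodes that route traffic for the mesh accrue a currency balance.
from collections import defaultdict

REWARD_PER_MB = 0.001            # invented rate, for illustration only

ledger = defaultdict(float)      # node id -> cloudcoin balance

def record_routing(node_id, megabytes_routed):
    """Credit a node for bandwidth it contributed to the mesh."""
    ledger[node_id] += megabytes_routed * REWARD_PER_MB

record_routing("node-a", 500)
record_routing("node-b", 1200)
print(dict(ledger))              # e.g., {'node-a': 0.5, 'node-b': 1.2}
```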

MORE

Researchers Implement HPC-First Cloud Approach

HPC Wire (01/29/14) Tiffany Trader 

North Carolina State University researchers have demonstrated a proof-of-concept for a novel high-performance cloud computing platform that merges a cloud computing environment with a supercomputer. The implementation shows that a fully functioning production cloud computing environment can be embedded entirely within a supercomputer, allowing users to benefit from the underlying high-performance computing hardware infrastructure. The supercomputer’s hardware provided the foundation for a software-defined system capable of supporting a cloud computing environment. This “novel methodology has the potential to be applied toward complex mixed-load workflow implementations, data-flow oriented workloads, as well as experimentation with new schedulers and operating systems within an HPC environment,” the researchers say. The software utility package Kittyhawk serves as a provisioning engine, offering basic low-level computing services within a supercomputing system; it is what allowed the researchers to construct an embedded, elastic cloud computing infrastructure within the supercomputer. The HPC-first design approach to cloud computing leverages the “localized homogeneous and uniform HPC supercomputer architecture usually not found in generic cloud computing clusters,” according to the researchers. A system of this type could support multiple workloads, including traditional HPC simulation jobs, workflows that combine HPC and non-HPC analytics, and data-flow oriented work.
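
To make the embedded-cloud idea concrete, here is a purely hypothetical sketch of provisioning elastic cloud instances out of a block of supercomputer nodes. It does not reflect Kittyhawk’s actual interface; every name in it is invented for illustration.

```python
# Hypothetical model of an HPC-first cloud: "instances" are carved out
# of a node allocation granted by the supercomputer's scheduler.
class EmbeddedCloud:
    def __init__(self, allocated_nodes):
        self.free = list(allocated_nodes)   # nodes granted by the HPC scheduler
        self.instances = {}

    def provision(self, instance_id, count):
        # Hand `count` nodes from the allocation to one cloud instance.
        if count > len(self.free):
            raise RuntimeError("allocation exhausted")
        self.instances[instance_id] = [self.free.pop() for _ in range(count)]
        return self.instances[instance_id]

    def release(self, instance_id):
        # Return an instance's nodes to the pool for the next workload.
        self.free.extend(self.instances.pop(instance_id))

cloud = EmbeddedCloud([f"node{i}" for i in range(8)])
print(cloud.provision("analytics-vm", 2))   # e.g., hosts a non-HPC analytics job
cloud.release("analytics-vm")
```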

MORE