Compiler 1/25/13: A Bumpy Path to Exascale


Last October, we helped celebrate Petascale Day with a panel on the scientific potential of new supercomputers capable of running more than a thousand trillion floating point operations per second. But the ever-restless high performance computing field is already focused on the next landmark in supercomputing speed, the exascale, more than fifty times faster than the current record holder (Titan, at Oak Ridge National Laboratory). As with the speed of personal computers, supercomputers have been gaining speed and power at a steady rate for decades. But a new article in Science this week suggested that the path to the exascale may not be as smooth as the field has come to expect.

The article, illustrated with an image of a simulated exploding supernova (seen above) by the CI-affiliated Flash Center for Computational Science, details the various barriers facing the transition from petascale to exascale in the United States and abroad. Government funding agencies have yet to throw their full weight behind an exascale development program. Private computer companies are turning their attention away from high performance computing in favor of commercial chips and Big Data. And many experts agree that supercomputers must be made far more energy-efficient before leveling up to the exascale -- under current technology, an exascale computer would use enough electricity to power half a million homes.

If these factors prevent the United States from reaching the exascale first, China or Japan may be the country that breaks the tape. But in a live chat about the article, Jack Dongarra from the University of Tennessee, Knoxville suggested that an international effort to reach the exascale might be the most beneficial to advancing science worldwide.

Supercomputing is often presented as a race, with nations vying for leadership to preserve industrial/economic/research competitiveness. With respect to software, it seems clear that the scope of the effort to develop software for exascale must be truly international...It serves global scientific communities who need to work together on problems of global significance and leverage distributed resources in transnational configurations.



When people think about computers playing chess, many remember the 1990s battles between IBM's Deep Blue and the grandmaster Garry Kasparov. But before human fought machine over the chessboard, the machines battled each other in an annual chess competition organized by the Association for Computing Machinery from 1971 to 1994. Those tournaments are the subject of a new film, Computer Chess, that premiered at the Sundance Film Festival this week, and among the stars of the film is the CI's own Gordon Kindlmann. At the Huffington Post, Kindlmann discussed moonlighting from his day job as assistant professor of computer science to act in an independent film, and put the computer chess competitions into the context of the continuing quest to create artificial intelligence.

Along the way, the AI research community created foundational methods of efficiently navigating the space of moves and countermoves in games like chess. They also learned that there is much more to mimicking human thought than winning games. But then, what tasks should we use as indicators of successful artificial intelligence? What have we learned about ourselves when computers start matching our abilities? How do we manage our relationship to computing technology as it shifts and grows around us? Bujalski's film offers a glimpse into a time when we were learning how to ask those questions.

The film won the Alfred P. Sloan Feature Film Prize at Sundance. You can watch the trailer for Computer Chess here.



After magnetic tape, floppy disks, CD-ROMs and flash drives, will the next medium for storing data be DNA? Ed Yong at the National Geographic and John Timmer at Ars Technica write about the early days of DNA storage.

Many pizza chains now offer the option of placing an order online. But scientists were already using computers to order pizza way back in 1974.

Can you write an algorithm to efficiently sort socks from a pile of clean laundry?
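One classic answer, sketched here as a minimal illustration (the sock keys and example inputs are made up), is to pair socks in a single pass with a hash map rather than comparing every sock against every other: each unmatched sock waits in a bucket keyed by its color or pattern, and the next sock with the same key completes a pair.

```python
from collections import defaultdict

def pair_socks(socks):
    """Pair socks in one pass using a hash map.

    Each sock is identified by a hashable key (e.g., its color).
    Runs in O(n) expected time, versus O(n^2) for the naive
    compare-every-pair approach.
    """
    singles = defaultdict(list)  # key -> waiting, unmatched socks
    pairs = []
    for sock in socks:
        if singles[sock]:
            # A matching sock is already waiting: complete the pair.
            pairs.append((singles[sock].pop(), sock))
        else:
            # No match yet: leave this sock waiting in its bucket.
            singles[sock].append(sock)
    leftovers = [s for bucket in singles.values() for s in bucket]
    return pairs, leftovers

pairs, leftovers = pair_socks(["red", "blue", "red", "green", "blue"])
# pairs: [("red", "red"), ("blue", "blue")]; leftovers: ["green"]
```

The same trick generalizes to any "find matching items" problem where a good hash key exists; when socks can only be compared (not hashed), the problem gets genuinely harder.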
