
Finding a better way to fight cancer doesn't always mean discovering a new drug or surgical technique. Sometimes just defining the disease in greater detail can make a big difference. A more specific diagnosis may allow a physician to better tailor a patient's treatment, using available therapies proven to work better on a specific subtype of disease or avoiding unnecessary complications for less aggressive cases.

"Finding better ways to stratify kids when they present and decide who needs more therapy and who needs less therapy is one of the ways in which we've gotten much better at treating pediatric cancer," said Samuel Volchenboum, Computation Institute Fellow, Assistant Professor of Pediatrics at Comer Children's Hospital and Director of the UChicago Center for Research Informatics. "For example, kids can be put in one of several different groups for leukemia, and each group has its own treatment course."


Big science projects can afford big cyberinfrastructure. The Large Hadron Collider at CERN in Geneva, for example, generates 15 petabytes of data a year, but it also boasts a sophisticated data management infrastructure for moving, sharing and analyzing that gargantuan data flow. Big data, however, is no longer a problem exclusive to massive collaborations in particle physics, astronomy and climate modeling. Individual researchers, faced with new laboratory equipment and methods that can generate their own torrents of data, increasingly need their own data management tools, but lack the hefty budget large projects can dedicate to such tasks. What can the 99% of researchers doing big science in small labs do with their data?

That was how Computation Institute director Ian Foster framed the mission at hand for the Research Data Management Implementations Workshop, happening today and tomorrow in Arlington, VA. The workshop was designed to help researchers, collaborations and campuses deal with the growing need for high-performance data transfer, storage, curation and analysis -- while avoiding wasteful redundancy.

"The lack of a broader solution or methodology has led basically to a culture of one-off implementation solutions, where each institution is trying to solve their problem their way, where we don't even talk to each other, where we are basically reinventing the wheel every day," said H. Birali Runesha, director of the University of Chicago Research Computing Center, in his opening remarks.


A CITY PROJECT BATTLE ROYALE

As the keynote speaker at the Urban Sciences Research Coordination Network kickoff last Friday, the City of Chicago's Brett Goldstein presented a blizzard of exciting city projects at various stages of development. One slightly-under-wraps project Goldstein touched upon was the SmartData platform, an ambitious plan to craft a new tool for decision-making and city services out of the abundant raw material of city data. In collaboration with the Computation Institute and the Urban Center for Computation and Data, the city's Innovation and Technology team hopes to create a tool that will analyze the city's many large datasets in real time, helping the city respond to challenges more quickly and efficiently while providing frequently updated, useful information to its citizens.

On Wednesday, that exciting new effort was announced as a finalist in the Bloomberg Philanthropies Mayors Challenge, a competition among ideas proposed by cities across the United States. As part of the judging, the public is invited to vote for their favorite project among the 20 finalists at the Huffington Post. We're biased, of course, but to help make the case for Chicago's project, you can read more about the SmartData platform here, or watch a video about the concept featuring Mayor Rahm Emanuel below.



Computation is now an essential tool for researchers, as data analytics and complex simulations fuel ambitious new studies in the sciences and humanities. But the path from a spreadsheet on a laptop to using the world's most powerful supercomputers can be intimidating for researchers unfamiliar with computational methods.

To help researchers along this journey, the University created the Research Computing Center (RCC), which provides faculty and students with access to both hardware and expertise. At an opening reception on November 8th at the Crerar Library, scientists from Argonne National Laboratory and IBM joined RCC director H. Birali Runesha in welcoming UChicago researchers to this valuable new resource.

"The mission of the Research Computing Center is to advance research and scholarship at the University," Runesha said. "What we are trying to do here is not just to provide access to hardware, but to work with you to understand your research and integrate high-performance computing into it to achieve our major goal, which is to help you literally transform your research by performing computational analysis that would otherwise not be possible."