As of today, the Data Science for Social Good fellowship is one-third of the way done. In their downtown Chicago headquarters, 48 fellows from around the world are hard at work with project partners from the non-profit and government sectors, helping them solve problems and get the most value out of their data. On the Data Science for Social Good website, the first round of projects has been announced, each with a short blurb describing the team's summer goals. Topics this year range from education to energy efficiency, worldwide corruption to nature conservation, reducing lead exposure to reducing homelessness, and enrolling those without health insurance to connecting patients with services outside of the health care system. Here are three samples of the Data Science for Social Good projects; visit the projects page or the DSSG blog to read about the rest.
When people talk about the most serious effects of climate change, they typically mention hotter temperatures, severe storms, rising sea levels, economic instability, and threats to food security. Typically, that last threat is measured in terms of the number of calories produced by world agriculture, quantifying projected changes in maize, soy, wheat, and rice production under new climate conditions. But in a new editorial for Nature Climate Change, CI fellow Joshua Elliott and two co-authors explain how recent research on changes in the nutritional value of food grown under high-CO2 conditions suggests that calories alone may not fully capture the threat to food security.
The open release of city data has given residents exciting new ways of interacting with and benefiting from the information collected by city agencies. But what if there were a way to collect even broader, higher-resolution data on the daily life of the city, providing a massive stream of open data for research and the development of new applications to improve urban life? The Array of Things is a project of the CI's Urban Center for Computation and Data (UrbanCCD) to deploy interactive, modular sensor boxes around Chicago to collect real-time data on the city's environment, infrastructure, and motion for research and public use. While the first nodes won't be installed along Michigan Avenue until later this summer, the media has seized upon the idea as an exciting new way of "instrumenting" a city for the greater good.
On the third anniversary of President Barack Obama's establishment of the Materials Genome Initiative (MGI), a multi-agency effort to transform materials science research in the United States through a national infrastructure, a consortium of research universities, national laboratories, and academic publishers today announced the Materials Data Facility.
Last year, in an ornate downtown Chicago ballroom, the seeds were planted for a new multidisciplinary research network with an ambitious purpose: to understand and improve cities. By bringing together experts in computer science, public health, education, architecture, urban planning, art, and social science, the Urban Sciences Research Coordination Network (USRCN) hoped to create versatile and knowledgeable teams that could find new approaches to studying cities in a rapidly urbanizing world. Sixteen months later, the early fruits of those collaborations helped inspire a new wave of discipline-crossing partnerships at the 2nd USRCN meeting, organized by the Urban Center for Computation and Data and held inside the world-famous Art Institute of Chicago.
Computer simulations that reveal a key mechanism in the replication process of influenza A may help defend against future deadly pandemics.
Treating influenza relies on drugs such as amantadine that are becoming less and less effective due to viral evolution. But University of Chicago and Computation Institute scientists have published computational results that may give drug designers the insight they need to develop the next generation of effective influenza treatments.
There's a new debate heating up in the world of climate modeling: not the fictitious "debate" that plays out in the media over climate change and its causes, but a contest over the best methods for forecasting how climate change will affect the planet. Until now, the dominant approach has been deterministic modeling, which uses environmental variables and equations replicating physical laws to run numerical simulations of the climate. But as these models seek higher and higher resolution, they become extremely expensive computationally, without much improvement in forecasting accuracy.
For software geeks, the breakout star of Apple's Worldwide Developers Conference (WWDC) this year wasn't the next MacBook or iPhone, but a new language called Swift for programming Apple devices. But since 2007, Computation Institute computer scientists have been developing and supporting a completely different Swift: a high-level programming language that makes fast parallel computing on any system easier for scientists, engineers, and data analysts.
The movement towards open data from city governments has inspired the development of new methods for data analysis. But what about new methods for the collection of data? Beginning this summer, the CI's Urban Center for Computation and Data will work with the City of Chicago to install 30 to 50 "sensor nodes" on light poles in the downtown area, giving researchers and the community new streams of information on climate, traffic, city infrastructure, and other facets of city life.
Under the specter of a warmer future, scientists must study the downstream effects of climate change on humans, including the impact on agriculture, the economy, and society. But the scale of global climate models and regional models of agriculture, hydrology, and other sectors may be orders of magnitude apart, forcing researchers to find novel methods of closing that gap.