Personal computers and smartphones have democratized computing around the world, giving regular people access to technology capable of conducting billions of calculations per second. But a significant computing gap remains between consumer technology and the high-performance computers used in national laboratories and universities, and not just because only these institutions can afford to run state-of-the-art supercomputers.
Most people don't want to imagine an earthquake or deadly disease outbreak hitting their city, but for a team at Argonne National Laboratory, envisioning the worst is their job. Chicago Magazine recently profiled Argonne's Global Security Sciences division, including the Social and Behavioral Systems group led by CI Senior Fellow Chick Macal, which uses agent-based modeling techniques to simulate various types of disaster scenarios.
Since the 2016 election, there has been much discussion of "fake news" -- false stories propagated over social media, usually with a political slant. But climate researchers have been all too familiar with this phenomenon for much longer, pushing back against media coverage that promotes unscientific claims and distorted portrayals of the climate change "debate." So it's no surprise that this same scientific community is leading the charge against unreliable science articles, with a new initiative that recruits researchers as volunteer fact-checkers.
Agent-based modeling can be used to simulate any number of complex scenarios, from the evacuation of a city after a natural disaster to the immune system's response to a gunshot wound. The Complex Adaptive Systems group at Argonne National Laboratory, led by CI Senior Fellow Chick Macal, is a leader in developing these models, including creating a simulation of the entire city of Chicago to test the spread of MRSA, Ebola, and other diseases.
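The core idea behind agent-based disease models like these can be illustrated with a toy sketch (a hypothetical simplification for illustration only, not the Argonne group's actual model or parameters): each agent carries a susceptible/infected/recovered state, and at every time step infected agents contact a few random agents, possibly transmitting the disease.

```python
import random

def simulate(n_agents=1000, n_infected=5, contacts=4,
             p_transmit=0.1, p_recover=0.05, steps=100, seed=42):
    """Toy agent-based SIR simulation with random mixing.

    All parameter names and values here are illustrative assumptions,
    not taken from any published model.
    """
    rng = random.Random(seed)
    # Each agent is "S" (susceptible), "I" (infected), or "R" (recovered).
    states = ["I"] * n_infected + ["S"] * (n_agents - n_infected)
    for _ in range(steps):
        newly_infected = set()
        for i, state in enumerate(states):
            if state != "I":
                continue
            # Each infected agent meets a handful of random agents
            # and may transmit to the susceptible ones.
            for j in rng.sample(range(n_agents), contacts):
                if states[j] == "S" and rng.random() < p_transmit:
                    newly_infected.add(j)
            # Infected agents recover with a fixed per-step probability.
            if rng.random() < p_recover:
                states[i] = "R"
        for j in newly_infected:
            states[j] = "I"
    return {s: states.count(s) for s in "SIR"}

counts = simulate()
```

Production models replace the random mixing here with realistic contact structure -- households, schools, hospitals, commuting patterns -- which is what lets a city-scale simulation capture where and how an outbreak actually spreads.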
Given its expeditionary namesake, it's only appropriate that Beagle -- the University of Chicago's supercomputer for biomedical research -- works with data from all around the world. But a recent project may qualify as the farthest-traveling data yet, as the HPC resource was used in a new genomic study of populations living in the Himalayan mountain range.
It's been almost a year since Chameleon, the experimental cloud computing testbed co-run by the Computation Institute and Texas Advanced Computing Center, went into full production for research use. Already, 600 users and 150 projects have used the system to test new uses and technologies for cloud computing, from finding unknown exoplanets to preventing cyberattacks. Last week, HPCwire spoke to CI Senior Fellow Kate Keahey and other members of the Chameleon team, surveying its early successes and previewing the innovations still to come.
UrbanCCD Director and Computation Institute Senior Fellow Charlie Catlett was named one of 25 “Doers, Dreamers & Drivers” of 2016 by Government Technology. The honor celebrates his work creating partnerships between Argonne National Laboratory, the University of Chicago, and the City of Chicago on innovative projects such as the Array of Things, Plenario, and OpenGrid.
While Data Science for Social Good sorts through nearly 900 applications for its 2016 summer fellowship, its 2015 projects continue to attract interest. Last week, the Charlotte Observer profiled DSSG's collaboration with the Charlotte-Mecklenburg Police Department, which uses data on officers, arrests, and dispatches, among other sources, to help predict negative interactions between police officers and the public.
In two recent studies, CI Senior Fellows James Evans and Andrey Rzhetsky built a network of millions of papers to ask an important question: is scientific research living up to its potential? Their analysis, conducted with UCLA's Jacob Foster and CI Director Ian Foster, found that science increasingly explores more incremental and conservative questions, avoiding the riskier, unexplored directions that more often lead to breakthroughs.