Blog

The ATLAS experiment at CERN is one of the largest scientific projects in history, with thousands of scientists from around the world working together to analyze the torrents of data flowing from its detectors. A new analytics platform built from open source tools by CI scientists at the ATLAS Midwest Tier 2 Center will make those experiments more efficient.

Blog

The hot buzzword in the tech world right now is “disruption,” the concept that one clever idea can completely shake up a stale industry, leading to new practices and big profits. Companies and services such as Amazon, Skype, and iTunes have dramatically changed how bookstores, phone companies, and music retailers operate, with sometimes controversial results. But for many reasons, health care has largely resisted major tech-driven revolutions so far, its massive bulk and entrenched interests providing disruption-proof armor few other industries can boast. Yet at the Big Data & Health conference, co-organized by the Computation Institute and the UChicago Center for Health and the Social Sciences (CHeSS), many speakers signaled that data-based change was on the way for health care and research.

Blog

In the era of “Big Data”-based science, access to and sharing of data play a key role in scientific collaboration and research. The San Diego Supercomputer Center (SDSC) at the University of California, San Diego, has implemented a new feature of the Globus software that will allow researchers using the Center’s computational and storage resources to easily and securely access and share large data sets with colleagues. SDSC is the first supercomputer center in the National Science Foundation’s XSEDE (eXtreme Science and Engineering Discovery Environment) program to offer the new Globus sharing service.

Blog

Imagine a time when your car is constantly sending data about your driving habits to your insurer.