26 Feb 2015

An article from Benjamin Recchie of the University of Chicago Research Computing Center looks at how CI Senior Fellow John Goldsmith and graduate student Jackson Lee use high-performance computing to better understand how computers -- and by extension, humans -- learn the rules of language.

24 Feb 2015

Since its announcement last summer, the Array of Things (AoT) urban sensing project has been gradually refining its technology and strategy for its expected pilot launch this spring.

23 Feb 2015

With a new model of electric shock injury run on the CI’s Beagle supercomputer, the research group of University of Chicago Medicine’s Raphael Lee hopes to better understand the harmful effects of electricity and inspire new treatments and protective gear.

16 Feb 2015

When something goes wrong while you're running a program on your personal computer, the worst outcome is typically a reboot and the loss of any unsaved work. But when an application crashes on a supercomputer, the consequences can be much more dramatic. In his talk at the Computation Institute, Argonne's Franck Cappello discussed new resilience strategies for the next era of supercomputing.

12 Feb 2015

To encourage building owners to assess and reduce their energy usage, the City of Chicago passed the Building Energy Use Benchmarking Ordinance in 2013, requiring certain properties to report energy data. Late last year, that mandate produced the first Building Energy Benchmarking Report, containing insight and visualizations produced in part with the CI’s Urban Center for Computation and Data (UrbanCCD).

11 Feb 2015

Science appears to be slowing down. In his research into how scientific publications influence one another, Knowledge Lab Postdoc Aaron Gerow found that as fields develop larger bodies of research, their citations tend to lag several years behind the present day.

09 Feb 2015

When people talk about the current tech boom, it usually conjures up images of phone apps, social media networks, and startups with one-word names. But inside the public sector, a quieter tech revolution stirs, as governments increasingly recognize the power of data to help them serve their constituents more effectively. This trend creates a new kind of skills gap, as governments look for people with both the technical skills and civic motivation to analyze data and build tools for internal and external use. 

06 Feb 2015

More and more industries now use modeling and simulation as critical tools for engineering and design. But as the detail and scale of these simulations grow, many companies hit a computational ceiling, unable to perform these advanced calculations as quickly as needed. With a boost from the Chicago Innovation Exchange, Parallel.Works, a new startup company from CI scientists, hopes to provide industries with parallel computing solutions that break through this barrier with a minimum of fuss.

05 Feb 2015

When the metagenomics platform MG-RAST was launched in 2007, data was scarce. Created as a public resource for annotating microbial genomes from environmental samples, MG-RAST originally served a small community of scientists with relatively small datasets, due to the great expense of gene sequencing.

03 Feb 2015

The greatest scientific challenges of our time aren't contained by national borders. Subjects such as cancer and climate change affect billions of people globally, inspiring international efforts to better understand these phenomena and find more effective ways of reducing their harmful effects. Computation has the power to facilitate and accelerate these worldwide collaborations, providing the tools and analytic power for scientists in the developed and developing world alike to better understand these problems both locally and globally.