Langbort Earns CAREER Award; Co-directs New IT-based Center

4/2/2013 Written by Susan Mumm

AE Assistant Prof. Cedric Langbort's research efforts can be compared to a very high-tech game of hide and seek.

On the one hand, Langbort recently earned a National Science Foundation Faculty Early Career Development Program (CAREER) Award for a project that studies the effects of information cloaking as a means of defense against potential cyber-attacks. On the other hand, as Co-Director of the new Center for People and Infrastructures (CPI), Langbort wants to help consumers become more aware of the underlying workings of electronic systems used in everyday life.

“In one case, I’m making information visible, and in another, I’m hiding it,” he said.

Professor Cedric Langbort

 
Langbort’s CAREER Award for his project, “A Dynamic Game Theoretic Approach to Cyber-security of Controlled Systems,” involves designing control algorithms that manage the flow of information made available to a potential hacker.
 
“I try to mitigate in the best possible way any attacks on the system. It’s a game theory approach in which I look at a number of different scenarios and strategic intent,” Langbort said. “Even if (a hacker) modifies the algorithm, the control logic itself should be resilient enough to not go out of balance.”
 
The CAREER Award provides Langbort with $400,000 over five years to pursue his research.
 
The information a power system gains about a consumer as data travels from a meter to a control center may not be transparent to that consumer. That's where Langbort's work for the People and Infrastructures center comes in.
 
“More systems are becoming smart, such as smart meters that monitor consumption in the power grid. In some ways, the model that those systems end up building for their users based on this information can be useful, but it also has them pigeonholed, trying to make the users behave according to the systems’ assumptions.”
 
For example, he said, when an individual decides to google a topic, Google will suggest possible sites to examine based on trends the system has gathered about that person from past searches. "The suggestions made to you might not be the same suggestions made to another person searching for the same topic," Langbort maintained.
 
By doing this, he said, systems “put us in our own little bubble; they limit our choices and potentially isolate us from other points of view.”
 
Langbort and his colleagues in the center aim to help consumers become aware that their information is being collected and possibly being used to manipulate them. "We feel that users need to know that this is going on and should have a decision in whether they want it to go on or not," he said. "We would like you to know, at least, who the system thinks you are, so to speak."

Christian Sandvig (media and cinema studies) is CPI's co-director and founder. In addition to Langbort, other CPI co-directors include Sally Jackson (communication); Kevin Hamilton (art and design), who specializes in arts and media/new media; and Karrie Karahalios (computer science), who concentrates on human/computer interaction.

The CPI is currently funded by the Department of Aerospace Engineering, the Department of Computer Science, the Coordinated Science Laboratory, and the College of Media at Illinois.

This story was published April 2, 2013.