FORT COLLINS — A team of Colorado State University researchers recently received a $2.1 million grant from the Defense Advanced Research Projects Agency to develop technology that would enable computers to recognize non-verbal commands such as gestures, body language and facial expressions.
The goal is to someday allow people to communicate more easily with computers in noisy settings, or when a person is hearing-impaired or speaks another language.
“Current human-computer interfaces are still severely limited,” CSU professor of computer science Bruce Draper, who is leading the project, said in a release from the school. “First, they provide essentially one-way communication: Users tell the computer what to do. This was fine when computers were crude tools, but more and more, computers are becoming our partners and assistants in complex tasks. Communication with computers needs to become a two-way dialogue.”
Researchers have set up an interface where a subject sits down with blocks, pictures and other stimuli while the researchers interact with the person and record his or her natural gestures for various concepts such as “stop” or “huh?”
The team hopes to use the information gathered to build a library of information “packets” about various gestures and expressions that computers can recognize, constraining how those visual cues are interpreted.
“We don’t want to say what gestures you should use,” Draper said. “We want people to come in and tell us what gestures are natural. Then, we take those gestures and say, ‘Okay, if that’s a natural gesture, how do we recognize it in real time, and what are its semantics? What role does it play in the conversation? When do you use it? When do you not use it?’”
Computer-science professors Ross Beveridge and Jaime Ruiz and math professors Michael Kirby and Chris Peterson are co-principal investigators on the project.