Smartphone owners know using their phones isn’t always easy: They have to look at the screen and swipe it with their fingers to answer a call, then touch it again to hang up.
With help from Google, Ruiz, an assistant professor at Colorado State University, is developing smartphone software that works better for people who are visually impaired and can’t see the screen. The idea is to program a phone so that blind users can give it commands through motion gestures.
Ruiz, who teaches in CSU’s Computer Science Department, landed the university’s first-ever Google Faculty Research Award earlier this year. Google is giving him $50,000 to hire a graduate student for one year. The grant covers tuition and travel for the student, and provides faculty and students the opportunity to work directly with Google scientists and engineers.
Plenty of expensive assistive devices for the blind already exist, but consultants say there is a need for less-expensive technology, such as the software envisioned by Ruiz.
“Anytime we can do something with mainstream technology, we’re just that much farther ahead,” said Tanni Anthony, a consultant on visual impairment for the state Department of Education.
Ruiz’s work on the software began when he interned for Google at the Mountain View, Calif., Internet company’s research division while studying for his doctorate in computer science at the University of Waterloo in Ontario. Hired at CSU in August, Ruiz believes it will take four to five years to complete the software project.
Ruiz will make use of technology that already exists in smartphones but that is underutilized.
Smartphones come with sensors that detect movement. Turning a smartphone will reorient content on the screen. Shaking an iPhone will create a prompt to undo typing.
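Turning that raw sensor data into a recognized gesture can be surprisingly simple. As a rough illustration of the idea (not Ruiz’s actual code — the class name, threshold, and peak count below are all invented for this sketch), a “shake” can be flagged when the phone’s acceleration magnitude spikes past a threshold several times:

```java
import java.util.List;

// Illustrative shake detector: flags a shake when the acceleration
// magnitude crosses a threshold several times in a sample window.
// The threshold and peak count are made-up values for this sketch.
public class ShakeDetector {
    private static final double THRESHOLD_G = 2.5; // acceleration, in g
    private static final int MIN_PEAKS = 3;

    // Each sample is an {x, y, z} acceleration reading in g.
    public static boolean isShake(List<double[]> samples) {
        int peaks = 0;
        for (double[] s : samples) {
            double magnitude = Math.sqrt(s[0] * s[0] + s[1] * s[1] + s[2] * s[2]);
            if (magnitude > THRESHOLD_G) {
                peaks++;
            }
        }
        return peaks >= MIN_PEAKS;
    }
}
```

On a real phone, the samples would come from the device’s accelerometer rather than being passed in by hand.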
Ruiz hopes to tap this unused potential for people who cannot use a touch screen, or when it doesn’t make sense to do so. Maybe they’re wearing gloves. Or maybe they want to skip a step and more safely answer the phone while driving.
“We imagine this is beneficial to everybody,” he said.
He has surveyed people to find out what kinds of gestures they would make with a phone for certain functions. Most people have told him that they would want to answer the phone simply by lifting it to their ears.
He has learned, however, that teaching people to correctly use more complex gestures is difficult. So, he has set out to create voice prompts that can tell users how to properly execute gestures if they aren’t doing them right.
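A voice prompt of that kind amounts to comparing what the user did with what the gesture requires and speaking a corrective hint. A minimal sketch of the idea, with an invented class name, tolerance, and phrasing (Ruiz’s system would measure gestures far more richly than a single tilt angle):

```java
// Illustrative gesture-feedback helper: compares the tilt a user
// produced with the tilt a gesture requires and returns a hint that
// could be spoken aloud. Tolerance and wording are invented here.
public class GestureCoach {
    private static final double TOLERANCE_DEGREES = 10.0;

    public static String hint(double userTiltDegrees, double targetTiltDegrees) {
        double diff = userTiltDegrees - targetTiltDegrees;
        if (Math.abs(diff) <= TOLERANCE_DEGREES) {
            return "Good, that's the right motion.";
        }
        return diff < 0 ? "Tilt the phone farther." : "Tilt the phone less sharply.";
    }
}
```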
Visually impaired students now go through a cumbersome process of trial and error as a voice tells them whether they are touching the right part of the screen.
Ruiz hopes to make everything easier by programming different gestures into the phone that correspond to specific apps. Waving the phone through the air in a check-mark motion could bring up the calendar app. Shaking the phone could bring it back to its home menu.
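Conceptually, that mapping is a lookup table from a recognized gesture to an action. A minimal sketch in that spirit — the class and gesture names are hypothetical, not taken from Ruiz’s software:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical mapping from a recognized gesture to an action, in the
// spirit of the examples above (check-mark -> calendar, shake -> home).
public class GestureDispatcher {
    private final Map<String, Runnable> bindings = new HashMap<>();

    public void bind(String gesture, Runnable action) {
        bindings.put(gesture, action);
    }

    // Returns true if the gesture was recognized and its action run.
    public boolean dispatch(String gesture) {
        Runnable action = bindings.get(gesture);
        if (action == null) {
            return false;
        }
        action.run();
        return true;
    }
}
```

An app would bind each gesture once, then hand every recognized gesture to the dispatcher.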
Ruiz is developing the software on the Android operating system, though he intends to make it available on all mobile platforms. It will run in the background as users cycle between apps, answer or ignore phone calls, or surf the Internet.
Anthony, the state education department consultant, said devices tailored exclusively to blind people are, by necessity, expensive because the user pool is so limited.
“There’s a smaller demand in the market, so you’re going to see a higher cost,” she said.
The path Ruiz has chosen should counter that trend: because the software relies on sensors already built into mainstream devices, it can deliver significant advances for visually impaired people at a much lower cost.
Bonnie Snyder, a technology consultant for the state education department, has tested Google products for accessibility. She said there’s little computer technology that caters to visually impaired people.
“At least they’re trying,” she said. “But they have a ways to go.”
When Ruiz finishes the new app, he plans to release it to everyone as free, open-source software: the source code will be publicly available, and other developers will be able to modify it and improve its functionality as long as they release the end product back to the community.
“What we are hoping to do is develop an application people can download and use on their phones, as well as provide a toolkit that will allow developers to incorporate motion gestures into their applications without having to be an expert in motion gestures and recognition techniques,” he said.
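A toolkit like the one the quote describes would let an app register the gestures it cares about and receive callbacks, while the library owns the recognition details. A hedged sketch of what such a developer-facing surface might look like — every name here is hypothetical, and recognition is stubbed out so the flow can be exercised directly:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the kind of developer-facing toolkit described in the
// quote: an app registers gesture names and a handler, and the
// library hides the recognition machinery. All names are invented.
public class MotionGestureToolkit {
    public interface Handler {
        void onGesture(String name);
    }

    private final List<String> registered = new ArrayList<>();
    private Handler handler;

    public void register(String gestureName) {
        registered.add(gestureName);
    }

    public void setHandler(Handler h) {
        handler = h;
    }

    // In a real library this would be driven by the sensor pipeline;
    // it is exposed here only so the callback flow can be exercised.
    public void simulateRecognition(String gestureName) {
        if (registered.contains(gestureName) && handler != null) {
            handler.onGesture(gestureName);
        }
    }
}
```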
For now, Ruiz needs to make the software nearly perfect because users have a low tolerance for poor functionality, he said.
“There are still a lot of steps between releasing something to everybody and where we’re at right now,” he said.