The 2016 winners of the Lemelson-MIT student prizes include a graduate student using Google Glass technology to help people with autism and two undergrads who invented gloves that instantly translate American Sign Language (ASL) into spoken words.
Catalin Voss, a graduate student at Stanford University, developed software that analyzes facial expressions and emotions using data from the Google Glass wearable computer. The user then receives feedback on a smartphone. Prachi Patel, writing for Scientific American, interviewed Voss and learned more about the invention and its applications. An in-home trial with 10 children began in January and already shows promise, according to the article. Voss is refining the feedback mechanism and working out other bugs during the trial.
Behaviors noted by the families and teachers of the 10 trial users include increased eye contact and social engagement. In the article, Patel quotes other experts who believe the product offers benefits over other technology because it is used in real-time situations.
“It’s getting them to look up and focus on the face … a behavior that’s very difficult to learn,” says Nancy Tarshis, a speech-language pathologist at the Children’s Evaluation and Rehabilitation Center at Albert Einstein College of Medicine.
Another prize-winning invention also focused on communication sciences and disorders. One of the undergraduate teams—Thomas Pryor and Navid Azodi of the University of Washington—won for their ASL-translating gloves.
These gloves contain sensors that collect data from ASL gestures and send the data wirelessly to a central computer. The computer compares the data and matches them to ASL signs. The matching word or phrase is then spoken through speakers on the gloves or appears as text on a smartphone. All of this happens instantaneously.
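The matching step described above can be illustrated with a minimal sketch. The sign names, sensor "fingerprints," and nearest-neighbor matching below are all hypothetical assumptions for illustration—the article does not describe the team's actual algorithm or data format.

```python
import math

# Hypothetical reference library: each sign is mapped to a sensor
# "fingerprint" (e.g., one normalized flex-sensor reading per finger).
SIGN_LIBRARY = {
    "hello": [0.1, 0.1, 0.1, 0.1, 0.1],
    "thank you": [0.9, 0.2, 0.2, 0.2, 0.2],
    "yes": [0.8, 0.8, 0.8, 0.8, 0.8],
}

def match_sign(reading):
    """Return the library sign whose fingerprint is closest (by
    Euclidean distance) to the incoming sensor reading."""
    def distance(fingerprint):
        return math.sqrt(sum((r - f) ** 2 for r, f in zip(reading, fingerprint)))
    return min(SIGN_LIBRARY, key=lambda sign: distance(SIGN_LIBRARY[sign]))

# A reading near the "yes" fingerprint matches "yes".
print(match_sign([0.82, 0.78, 0.85, 0.80, 0.79]))
```

In a real system the matched word would then be routed to a speech synthesizer or displayed as text, as the article describes.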
The gloves are among several recent innovations that use wearable technology to provide ASL translation.