Monday, July 3, 2017

How Deep Learning Can Translate American Sign Language

Deep learning has accelerated machine translation between spoken and written languages, but it has lagged behind when it comes to sign language. Now Syed Ahmed, a computer engineering major at the Rochester Institute of Technology, is unleashing its power to translate between American Sign Language and English.

“You want to talk to your deaf or hard of hearing friend, but you don’t know sign language — what would you do in that case?” says Ahmed, a research assistant at the National Technical Institute for the Deaf, in a conversation with Michael Copeland in this week’s episode of the AI Podcast.

Ahmed fed around 1,700 sign language videos into a deep learning algorithm. The model analyzes a signer’s physical movements and translates them into English. “You point your phone at your friend and, while they sign, automatic captions appear on your phone,” Ahmed said.
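The article does not describe Ahmed's architecture, but the pipeline it sketches — frames of a signed clip go through a learned encoder, and the pooled features are mapped to English words — can be illustrated with a toy NumPy sketch. Everything here (the gloss vocabulary, the random weights, the shapes) is an illustrative assumption, not a detail of the actual system:

```python
import numpy as np

# Toy sketch of a video-to-caption pipeline (illustrative only, not
# Ahmed's model): each frame of a clip is encoded into a feature
# vector, the features are pooled over time, and a linear classifier
# picks an English gloss from a tiny hypothetical vocabulary.

rng = np.random.default_rng(0)

VOCAB = ["hello", "thank-you", "friend"]  # hypothetical gloss vocabulary

def encode_frames(clip, W_enc):
    """Map each flattened frame to a feature vector (stand-in for a CNN)."""
    return np.maximum(clip @ W_enc, 0.0)  # ReLU nonlinearity

def classify_clip(clip, W_enc, W_out):
    feats = encode_frames(clip, W_enc)   # (T, D) per-frame features
    pooled = feats.mean(axis=0)          # average pooling over time
    logits = pooled @ W_out              # scores over the vocabulary
    return VOCAB[int(np.argmax(logits))]

# Fake clip: 16 frames of 8x8 "pixels", flattened to length-64 vectors.
clip = rng.standard_normal((16, 64))
W_enc = rng.standard_normal((64, 32)) * 0.1   # untrained encoder weights
W_out = rng.standard_normal((32, len(VOCAB))) * 0.1

print(classify_clip(clip, W_enc, W_out))
```

In a real system the encoder would be a trained convolutional or recurrent network, and the output would be a sequence of words rather than a single label; the sketch only shows the overall shape of the computation.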