So, how does it work? The MyVoice is composed of a built-in microphone, a speaker, a soundboard, a video camera, and a monitor; the prototype basically looks like your average smartphone. The camera records the hand motions of a person signing in ASL, the device processes the video on the fly, and a spoken translation is served up via an electronic voice. It can also work in the other direction, converting spoken words into sign language that is then displayed on the monitor.
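The article doesn't detail the software inside the device, but the sign-to-speech direction can be pictured as a simple loop: grab a frame from the camera, match it against the sign database, and speak any recognized word. Here is a minimal, hypothetical sketch of that loop in Python, assuming OpenCV for video capture and pyttsx3 for speech output; `classify_sign` is a placeholder standing in for the actual recognition model, which the article doesn't describe.

```python
# Hypothetical sketch of a sign-to-speech loop, NOT the actual MyVoice
# implementation. Assumes OpenCV (cv2) for video capture and pyttsx3
# for text-to-speech; classify_sign is a placeholder for a real model.

import cv2
import pyttsx3


def classify_sign(frame):
    """Placeholder: a real system would match the frame against a
    database of sign images (200-300 per sign, per the article) and
    return the recognized word, or None if no sign is detected."""
    return None


def sign_to_speech():
    camera = cv2.VideoCapture(0)       # the built-in video camera
    speaker = pyttsx3.init()           # drives the built-in speaker
    try:
        while True:
            ok, frame = camera.read()  # grab one video frame
            if not ok:
                break
            word = classify_sign(frame)  # recognize the sign, if any
            if word:
                speaker.say(word)        # speak the translated word
                speaker.runAndWait()
    finally:
        camera.release()


if __name__ == "__main__":
    sign_to_speech()
```

The speech-to-sign direction would run the reverse path: a speech recognizer transcribes audio from the microphone, and the matching sign images are drawn from the database and shown on the monitor.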
The device was developed in cooperation with members of the deaf community, so that the team could get a feel for what functionality would be the most practical and helpful. The team included engineering technology students Anthony Tran, Jeffrey Seto, Omar Gonzalez, and Alan Tran, who collaborated with industrial design students Rick Salinas, Sergio Aleman, and Ya-Han Chen. They were all overseen by Farrokh Attarzadeh, associate professor of engineering technology, and EunSook Kwon, director of UH’s industrial design program.
Engineering student Seto says that the most challenging part of the project was gathering a database of sign-language images, which “involved 200-300 images per sign.” For the students involved, the project soon became about more than just a grade. As design student Aleman explains:
This wasn’t just a project we did for a grade … While designing and developing it, it turned into something very personal. When we got to know members of the deaf community and really understood their challenges, it made this MyVoice very important to all of us.
You can learn more about the MyVoice device on the University of Houston’s website.