Smart technology is being tapped to help deaf people, and scientists and educators are exploring many technologies to assist not only the deaf but people with disabilities in general. One current effort is live subtitling: a tool that lets people who cannot hear see, in text, what is being said.
New technologies and tools
Deaf people may soon find it much easier to follow what is being said around them, if current experiments to adapt the latest technologies to their needs succeed.
The deaf are already among the people with disabilities who use several technologies with ease. They can enjoy TV shows and movies with subtitles, and they can use text phones and the Internet: even though they cannot hear, they can read messages, which connects them to a larger community, including the hearing world. Despite their disability, they are able to communicate easily.
The advent of wearable technology looks set to be another first for the deaf and hearing impaired. Google Glass offers such opportunities through live subtitling and the ability to call an in-vision interpreter.
- In the United Kingdom
In the UK, 121 Captions’ Tina Lannin and Microlink’s Tim Scannell are working together to make Google Glass useful for deaf people. They have developed their own live captioning software, called MiCap, that runs on Google Glass, e-book readers, smartphones, tablets and laptops. The process pairs two devices: a listening device, such as a tablet, and the Google Glass. The listening device transcribes what is being said, and the captions scroll across the Google Glass screen with a delay of only a second or two. While a smartphone can do this as well, Google Glass lets the wearer look at the speaker while reading the subtitles, which makes the conversation feel more natural.
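The two-device flow described above — a listening device transcribing speech and a paired display scrolling the resulting captions — can be sketched in a few lines. This is only an illustrative model, not MiCap's actual software: the class and method names are invented, and the speech-to-text step is replaced by plain strings.

```python
import queue

class CaptionStream:
    """Hypothetical sketch: buffers transcribed fragments from a
    'listening device' and delivers them to a paired display
    (standing in for the Google Glass screen)."""

    def __init__(self):
        self._pending = queue.Queue()  # fragments awaiting display
        self.displayed = []            # lines already "shown" on the glasses

    def transcribe(self, fragment):
        # On a real device this fragment would come from a
        # speech-to-text engine, arriving a second or two behind speech.
        self._pending.put(fragment)

    def run_display(self):
        # Scroll queued captions onto the display in arrival order.
        while not self._pending.empty():
            self.displayed.append(self._pending.get())

stream = CaptionStream()
for part in ["Hello,", "welcome to", "the demo."]:
    stream.transcribe(part)
stream.run_display()
print(" ".join(stream.displayed))  # Hello, welcome to the demo.
```

The queue decouples the transcriber from the display, which is the essential point of the pairing: the listening device can keep transcribing while the glasses render captions at their own pace.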
- In the United States
A group of students at Brigham Young University is pursuing a similar idea. Their project, called “Signglasses,” is an app that projects American Sign Language narration onto the lens of Google Glass during presentations. Associate Professor Mike Jones heads the project together with student researchers from the university, some of whom are hearing impaired themselves. The team initially received funding from the National Science Foundation and has recently received additional funding from the Sorenson Impact Foundation. The research group and the university also benefit from having several students who are fluent in American Sign Language.
The BYU research team is working with researchers at the Georgia Institute of Technology to expand the technology into a literacy tool, and Associate Professor Jones hopes to publish their findings soon. Industry analysts are watching Google Glass and related technologies closely for their potential in education, especially in instructional video recording.