A team of researchers at the Rochester Institute of Technology's (RIT) Linguistic and Assistive Technologies Laboratory is working on computer animation techniques that will help translate written and spoken language into American Sign Language (ASL), according to an article in Slate.
Using motion-capture technology, the researchers collect data from sign language movements and feed it into mathematical models that help solve linguistic challenges, according to the article. They hope eventually to use the models in apps and games that help deaf children learn language skills.
To read the article in its entirety, visit the Slate website.
Source: Slate