Google says its new project allows smartphones to interpret and “read aloud” sign language. There is no app; instead, Google has released the underlying algorithms so that developers can build their own apps.
Until now, this type of software has worked only on PCs, so this is a significant step. The hearing-impaired community welcomed the project, but also noted that the technology might struggle to fully translate some conversations. In an AI blog post, Google research engineers Valentin Bazarevsky and Fan Zhang say the project will serve as “the basis for sign language understanding”. It is built on MediaPipe, Google’s open-source framework for processing video and other media.
“We’re excited to see what people come up with. For our part, we will continue our research to make the technology more robust and to stabilize tracking, increasing the number of gestures we can reliably detect,” a spokeswoman told the BBC. This is only a first step: the approach currently misses facial expressions and the speed of signing, both of which can change the meaning of what is being signed.
You can learn more here.