Google is working hard on its artificial intelligence to develop an application that will help us understand sign language.
Indeed, the American company says it is working on a project to ease communication: a solution that would let us use a smartphone to understand sign language.
Even at this development stage, the solution reduces the hardware needed for such a service to a simple smartphone. Solutions capable of translating sign language already exist, such as SignAll or Kintrans, but they require powerful computers.
Google researchers explain that developing such a tool is very complicated. Unsurprisingly, they rely on machine learning, and they have released the project as open source so that several communities can participate.
Google has simplified the detection process to 21 key points on the hand, and the work is now focused on translating facial expressions.
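To give an idea of how such key-point detection can feed gesture recognition, here is a minimal sketch in Python. The 21 landmark indices follow the convention Google's open-source hand-tracking model uses (0 = wrist, 4 = thumb tip, 8/12/16/20 = fingertips), but the "open palm" rule below is a toy heuristic for illustration only, not Google's actual model.

```python
# Illustrative sketch: classifying a simple hand pose from 21 landmarks.
# Indices follow the published hand-landmark convention:
# 0 = wrist, 8/12/16/20 = finger tips, 6/10/14/18 = the joints below them.
# The classification rule is a toy heuristic, not Google's real pipeline.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky tips
FINGER_PIPS = [6, 10, 14, 18]   # corresponding middle joints

def is_open_palm(landmarks):
    """landmarks: list of 21 (x, y) tuples, y growing downward (image coords)."""
    if len(landmarks) != 21:
        raise ValueError("expected 21 hand landmarks")
    # A finger counts as extended when its tip sits above its middle joint.
    return all(landmarks[tip][1] < landmarks[pip][1]
               for tip, pip in zip(FINGER_TIPS, FINGER_PIPS))

# Toy data: fingertips placed above their joints -> open palm.
open_hand = [(0.5, 0.9)] * 21
for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
    open_hand[tip] = (0.5, 0.2)
    open_hand[pip] = (0.5, 0.4)

print(is_open_palm(open_hand))  # True for this toy data
```

In a real system, the coordinates would come from a model running on the phone's camera feed, and a far richer classifier would map landmark sequences to signs.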