Since people have been confined to their homes during the quarantine period, video-calling applications have been used for school lessons and business meetings. As a result, video chat applications became some of the most used applications of this period. Google is now extending these applications to sign language, focusing on detecting sign language during video conversations.
Not all people who use video chat applications can communicate through speech. Sign language is therefore essential for people with hearing or speech impairments. Until now, however, video-calling algorithms had no way to detect it.
Google started looking for ways to detect sign language during video calls, so that people who use sign language can be highlighted just as speakers are when they begin to explain something.
The importance of this becomes clear in the following point: in video chat applications, the active speaker is brought to the main screen. This feature, added to prevent voices from interfering with each other in crowded calls, makes people stand out and speak in turn. Since sign language produces no audio, signers are never promoted to the main screen in this way.
Google has also published a research article on the machine learning behind this. As the article explains, this step will allow sign language to be detected in video calls, enabling people with hearing impairments to express themselves more comfortably in conversations. The article also shows how the algorithm can distinguish sign language motion from ordinary motion without compromising video quality.
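To make the idea concrete, here is a minimal sketch of how such a detector might work, assuming it operates on per-frame body-pose landmarks: frame-to-frame movement of the keypoints is converted into a motion-energy signal, and signing is flagged when the smoothed energy exceeds a threshold. The function names and the threshold value are hypothetical illustrations, not Google's actual implementation.

```python
from statistics import mean

def motion_energy(keypoints, fps):
    """Per-frame motion energy: mean keypoint displacement between
    consecutive frames, normalized by the frame rate.

    keypoints: list of frames, each a list of (x, y) pose landmarks.
    Returns one energy value per frame transition (len(keypoints) - 1).
    """
    energies = []
    for prev, curr in zip(keypoints, keypoints[1:]):
        disp = mean(abs(cx - px) + abs(cy - py)
                    for (px, py), (cx, cy) in zip(prev, curr))
        energies.append(disp * fps)
    return energies

def detect_signing(energies, threshold=5.0, window=3):
    """Flag a frame as 'signing' when the motion energy, smoothed over a
    small sliding window, exceeds a (hypothetical) threshold."""
    flags = []
    for i in range(len(energies)):
        lo = max(0, i - window + 1)
        flags.append(mean(energies[lo:i + 1]) > threshold)
    return flags

# Synthetic example: five still frames, then five frames where both
# "wrist" keypoints move steadily, as they would while signing.
static = [[(0.0, 0.0), (10.0, 0.0)]] * 5
moving = [[(float(i), 0.0), (10.0 + i, 0.0)] for i in range(1, 6)]
flags = detect_signing(motion_energy(static + moving, fps=30))
```

In the synthetic example, the still frames produce zero energy and are not flagged, while the moving frames are flagged as signing. A production system would replace the threshold heuristic with a learned classifier and feed the detection result to the conferencing app so the signer is treated as the active speaker.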