As the pandemic drags on, video conferencing has become increasingly important. Yet for those who primarily use sign language to communicate, video conferencing presents accessibility challenges. (It can be hard for a person to fully participate in a discussion using only the chat function.) SLAIT, a German startup, has created a potential solution: a real-time sign-language translation engine. At this point, the engine can recognize about 200 words in American Sign Language (ASL) and translate short sentences.
For a long time, real-time ASL translation has been a goal without a solution. SLAIT combined an algorithm called MediaPipe from Google AI labs with its own neural networks to create ASL translation software. According to SLAIT, “MediaPipe…enables [SLAIT] to track key features from the hands, arms and even facial expressions.”
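SLAIT has not published its model, but the division of labor the quote describes can be sketched: a tracker such as MediaPipe emits per-frame hand landmarks, and a learned classifier maps landmark features to vocabulary words. The Python toy below illustrates only the general idea; the two-word vocabulary, the template hand shapes, and the nearest-template classifier are all hypothetical stand-ins for SLAIT's trained neural networks.

```python
import math

# MediaPipe's hand tracker emits 21 (x, y, z) landmarks per hand per frame.
NUM_LANDMARKS = 21

def landmarks_to_features(landmarks):
    """Flatten landmarks into a translation-invariant feature vector.

    `landmarks` is a list of 21 (x, y, z) tuples. Subtracting the wrist
    (landmark 0) removes the hand's position in the frame, so only the
    hand's shape remains.
    """
    wx, wy, wz = landmarks[0]
    feats = []
    for x, y, z in landmarks:
        feats.extend((x - wx, y - wy, z - wz))
    return feats

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def classify(features, templates):
    """Return the vocabulary word whose stored template is most similar."""
    return max(templates, key=lambda word: cosine_similarity(features, templates[word]))

# Two made-up template hand shapes for a toy two-word vocabulary.
flat_hand = [(0.1 * i, 0.0, 0.0) for i in range(NUM_LANDMARKS)]
fist = [(0.01 * i, 0.02 * i, 0.0) for i in range(NUM_LANDMARKS)]
templates = {
    "HELLO": landmarks_to_features(flat_hand),
    "YES": landmarks_to_features(fist),
}

# A flat hand observed elsewhere in the frame still matches, because the
# wrist-relative features ignore where the hand appears on screen.
observed = [(0.1 * i + 0.5, 0.0, 0.0) for i in range(NUM_LANDMARKS)]
print(classify(landmarks_to_features(observed), templates))  # → HELLO
```

A real system would replace the template matcher with a neural network trained on many signers, and would consume sequences of frames rather than single hand shapes, since most signs involve motion.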
SLAIT’s prototype is not ready for public use just yet. It needs training on more videos of sign language to increase its knowledge base and accuracy. The SLAIT team also still needs investment funds to scale its operations, preferably from investors who share its ideals. As SLAIT works out these kinks, real-time sign-language translation will get closer and closer to reaching a larger market.