It makes sense to use AI to recognize visual interaction such as sign language. But it is quite another thing to use it to assess sign language, because here we're trying to recognize something even when it is being performed incorrectly. This study (18-page PDF) looks at such a system being piloted with 100 words (out of more than 3,000) in Swiss German Sign Language (Deutschschweizerische Gebärdensprache, DSGS). "The assessment system that we present here," write the authors, "provides adult sign language as a second language (L2) learners of DSGS with feedback on the correctness of the manual parameters (hand-shape, hand position, location, movement) of isolated signs they produce." The productions deemed incorrect "were then further linguistically analyzed to inform future sign language instruction practice." A future iteration of the system will be offered as an online assessment that people can use at home.
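To make the kind of feedback described above a little more concrete, here is a minimal sketch (not the authors' system; the gloss, parameter labels, and reference values are all invented for illustration) of how per-parameter correctness feedback on an isolated sign might be structured. In a real system the learner's parameter labels would come from a video recognition model rather than being typed in by hand.

```python
# Hypothetical sketch of per-parameter feedback for one isolated sign.
# The four manual parameters match those named in the study; everything
# else (gloss, labels, reference values) is invented for illustration.
from dataclasses import dataclass


@dataclass
class SignProduction:
    handshape: str
    hand_position: str
    location: str
    movement: str


# Invented reference entry for a single vocabulary item.
REFERENCE = {
    "MUTTER": SignProduction(handshape="flat", hand_position="palm-in",
                             location="chin", movement="tap"),
}


def assess(gloss: str, produced: SignProduction) -> dict:
    """Return correctness feedback for each manual parameter."""
    target = REFERENCE[gloss]
    return {
        "handshape": produced.handshape == target.handshape,
        "hand_position": produced.hand_position == target.hand_position,
        "location": produced.location == target.location,
        "movement": produced.movement == target.movement,
    }


if __name__ == "__main__":
    # A learner's attempt with one incorrect parameter (location).
    attempt = SignProduction("flat", "palm-in", "cheek", "tap")
    for parameter, correct in assess("MUTTER", attempt).items():
        print(f"{parameter}: {'correct' if correct else 'check this'}")
```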