So Patrick Buehler and Andrew Zisserman at the University of Oxford, along with Mark Everingham at the University of Leeds, started by designing an algorithm that could let an artificially intelligent computer system identify individual signs.
Then they let the system watch TV shows broadcast with both text subtitles and British Sign Language interpretation. After about ten hours of viewing, the software had correctly learned about 65% of the signs it was exposed to - but watch the video and see for yourself.
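The clever part is that nobody labels individual signs: the subtitles only say that a word occurs *somewhere* in a clip. The researchers used a multiple-instance learning approach to find, among all candidate windows of video, the one that recurs in clips whose subtitles contain a given word but not in other clips. A toy sketch of that idea (hypothetical code, not the authors' actual system; the feature vectors and `similar` function here are stand-ins for real video features):

```python
# Toy sketch of weakly supervised sign learning: each clip is a "bag" of
# candidate sign windows, labeled only by whether the target word appears
# in its subtitles. We pick the window that best separates the two sets.

def best_candidate(positive_bags, negative_bags, similar):
    """Return the candidate window that recurs in positive bags but not negative ones.

    positive_bags / negative_bags: lists of bags; each bag is a list of
    candidate feature vectors (plain tuples here, for illustration).
    similar: function deciding whether two windows depict the same sign.
    """
    best, best_score = None, float("-inf")
    for bag in positive_bags:
        for candidate in bag:
            # reward matches in positive bags, penalize matches in negatives
            pos = sum(any(similar(candidate, w) for w in b) for b in positive_bags)
            neg = sum(any(similar(candidate, w) for w in b) for b in negative_bags)
            if pos - neg > best_score:
                best, best_score = candidate, pos - neg
    return best

# Pretend (hand_x, hand_y) positions summarize each candidate window.
sim = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1]) < 0.2
pos_bags = [[(0.5, 0.5), (0.9, 0.1)], [(0.52, 0.48), (0.1, 0.9)]]
neg_bags = [[(0.9, 0.1)], [(0.1, 0.9)]]
print(best_candidate(pos_bags, neg_bags, sim))  # -> (0.5, 0.5), the recurring window
```

In the toy data, the window near (0.5, 0.5) shows up in both "positive" clips and neither "negative" one, so it wins; the real system does the analogous search over thousands of hand-shape and motion descriptors.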
Would this have been enough to betray Bowman and Poole in the famous HAL 9000 lip-reading incident in 2001: A Space Odyssey? Hopefully, we'll never know.