So Patrick Buehler and Andrew Zisserman at the University of Oxford, along with Mark Everingham at the University of Leeds, started by designing an algorithm that lets an artificially intelligent computer system identify individual signs.
Then they let the system watch TV programs that carried both text subtitles and British Sign Language. After about ten hours of viewing (watch the video and see for yourself), the software had correctly learned about 65% of the signs it was exposed to.
Would this have been enough to betray Bowman and Poole in the famous HAL 9000 lip-reading incident in 2001: A Space Odyssey? Hopefully, we'll never know.
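The story doesn't spell out how the matching is actually done, but the setup of weakly aligned subtitles and signing video suggests a simple way to sketch the idea: score each candidate sign by how much more often it shows up in clips whose subtitles contain a given word than in clips that don't. The function, data layout, and cluster names below are hypothetical illustrations of that intuition, not the researchers' published algorithm.

```python
# Toy illustration (not the researchers' actual method): a weakly supervised
# scorer that guesses which candidate sign cluster corresponds to a subtitle
# word by comparing how often the cluster appears when the word is, and isn't,
# in the subtitles. All names and data structures here are hypothetical.

from collections import Counter

def score_sign_candidates(segments, target_word):
    """segments: list of (subtitle_words, sign_cluster_ids) pairs, where
    subtitle_words is the set of subtitle words for a video segment and
    sign_cluster_ids is the set of candidate sign clusters detected in it.
    Returns candidate clusters ranked by co-occurrence with target_word."""
    pos = Counter()  # cluster counts in segments whose subtitles contain the word
    neg = Counter()  # cluster counts in all other segments
    n_pos = n_neg = 0
    for words, clusters in segments:
        if target_word in words:
            n_pos += 1
            pos.update(clusters)
        else:
            n_neg += 1
            neg.update(clusters)
    scores = {}
    for cluster in set(pos) | set(neg):
        # Difference of occurrence rates: high if the cluster appears mostly
        # when the word is subtitled, low if it appears everywhere.
        p = pos[cluster] / n_pos if n_pos else 0.0
        q = neg[cluster] / n_neg if n_neg else 0.0
        scores[cluster] = p - q
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example: cluster "sign_7" co-occurs with the subtitle word "rain" far more
# often than with anything else, so it is ranked first.
segments = [
    ({"rain", "later"}, {"sign_7", "sign_2"}),
    ({"sunny", "warm"}, {"sign_2", "sign_5"}),
    ({"heavy", "rain"}, {"sign_7"}),
    ({"wind"}, {"sign_5"}),
]
print(score_sign_candidates(segments, "rain"))
```

A real system would also need hand tracking and far richer video features to isolate the candidate signs in the first place, but the co-occurrence intuition sketched here is the same.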