Published: 26 May 2020

Sign-language technology named best R&D project by National Association of Broadcasters

CONTENT4ALL, an EU-funded project which has successfully used animation technology and AI algorithms to create live sign-language programming with 3D virtual humans, has been awarded the prestigious NAB Technology Innovation Award

Image: Deaf man using tablet (Getty Images)

Given by the National Association of Broadcasters (NAB) in the US, the award recognises the best research and development projects in the field of communications technologies that have not yet been commercialised.

Surrey’s Centre for Vision, Speech and Signal Processing (CVSSP) was part of the CONTENT4ALL project consortium, which included the Fraunhofer HHI Research Centre (Germany), broadcasting companies VRT (Belgium) and Swiss TXT (Switzerland), and Human Factors Consult (Germany). The consortium was led by international system integration consultancy Fincons Group.

CONTENT4ALL will result in the development of new services targeting the production of more sign-interpreted content for the Deaf. Its main goal is to reduce production costs for TV broadcasters so they can satisfy upcoming legislation, while significantly increasing the media content available to Deaf users in a way that avoids disturbing hearing viewers with sign interpretation.

The technology reproduces a sign language interpreter as a realistic virtual human in real time, allowing the interpreter to work from a remote broadcasting studio. It is based on real-time video on video technology and advanced deep learning algorithms which track body, face and hand movements and derive the parameters used to animate the virtual human.
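To make the general idea concrete, the sketch below is purely illustrative and is not the consortium's own pipeline: it uses the open-source MediaPipe Holistic model as a stand-in tracker to extract body, face and hand landmarks from a live camera feed and flattens them into the kind of parameter vector that could drive a virtual signer.

# Illustrative sketch only: MediaPipe Holistic stands in for the project's
# own tracking algorithms; the "parameters" here are raw landmark coordinates,
# whereas a real system would map them onto a skeletal rig and blendshapes.
import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic

def landmarks_to_parameters(results):
    """Flatten tracked body, face and hand landmarks into a parameter vector."""
    params = []
    for group in (results.pose_landmarks,
                  results.face_landmarks,
                  results.left_hand_landmarks,
                  results.right_hand_landmarks):
        if group is not None:
            for lm in group.landmark:
                params.extend((lm.x, lm.y, lm.z))
    return params

cap = cv2.VideoCapture(0)  # live camera feed of the interpreter (device 0 assumed)
with mp_holistic.Holistic(min_detection_confidence=0.5,
                          min_tracking_confidence=0.5) as holistic:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR
        results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        params = landmarks_to_parameters(results)
        # In a broadcast pipeline these parameters would be streamed to a
        # renderer that animates the 3D virtual signer in real time.
        print(f"{len(params)} animation parameters extracted")
cap.release()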

The work has been led at Surrey by Dr Anil Fernando and Professor Richard Bowden within CVSSP.

Dr Fernando said: “I’m pleased that CVSSP’s expertise in video coding and communication technologies has helped to create a significant social benefit, enabling greater accessibility to content for the Deaf community and moving a step closer to removing media poverty in real-time broadcasting. I am confident that the video on video technology developed in CVSSP will have a significant impact on developing other services such as interactive social television broadcasting in the near future.”

Professor Bowden said: “My team and I have been working on sign language recognition for over 20 years, but it’s only recently, with the move to deep learning, that we have been able to tackle the full translation pipeline. The last couple of years have seen some exciting developments from our lab with two world firsts: the first end-to-end sign to spoken language translation system and the first translation system capable of generating sign from written or spoken text. It’s this technology that underpins this award. This is an exciting time.”

The NAB awards were announced during NAB Show Express, which took place online from 13 to 14 May as a result of the Covid-19 outbreak.
