How can AI systems understand, interact and communicate naturally with people?
Natural interaction between people and AI is essential for future AI-enabled systems across all domains. Building on Surrey's cross-disciplinary strength in AI for audio-visual machine perception of people, language translation, human perception and interaction design, there is potential to lead future research in natural human-machine communication.
This will underpin the realisation of assistive systems in healthcare and in hospitality.
The Future of Robotics and AI
AI systems that mimic nature
Other fundamental research at Surrey, within the NICE group, explores how AI systems can mimic the natural and biological world, taking inspiration from the ways that living organisms adjust to new situations, make complex decisions and even heal themselves. Researchers are developing evolutionary and Bayesian algorithms to optimise complex data-driven systems and improve neural architecture search. This work could open the door to machines, such as hybrid cars, which learn, interact and make decisions based on evolutionary principles.
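The core idea behind evolutionary optimisation can be illustrated with a minimal (1+1) evolution strategy: mutate a candidate solution and keep the child only if it scores better. This is a generic sketch, not Surrey's actual algorithms; the objective function and settings are toy placeholders.

```python
import random

def evolve(fitness, genome, generations=2000, sigma=0.1):
    """(1+1) evolution strategy: mutate, keep the child only if it is fitter."""
    best = genome[:]
    best_fit = fitness(best)
    for _ in range(generations):
        # Mutation: add small Gaussian noise to every parameter.
        child = [g + random.gauss(0, sigma) for g in best]
        child_fit = fitness(child)
        if child_fit < best_fit:  # lower fitness is better here
            best, best_fit = child, child_fit
    return best, best_fit

# Toy objective: tune two parameters towards a hypothetical optimum (3, -1).
target = [3.0, -1.0]
fitness = lambda g: sum((x - t) ** 2 for x, t in zip(g, target))

random.seed(0)
solution, score = evolve(fitness, [0.0, 0.0])
```

Because the parent is only ever replaced by a fitter child, the score improves monotonically, which is the elitist selection principle real evolutionary optimisers build on.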
Developing algorithms that think like humans
In order for machines to ‘think’ independently, or simply to support us in making challenging decisions, we first need to understand how humans extract knowledge, so that we can develop novel machine learning algorithms and architectures which mimic (and ultimately improve on) the process.
Combining core research in this area with expertise in visual recognition, CVSSP is breaking new ground in image and video retrieval. It has recently developed a novel deep convolutional neural network architecture, REMAP, which enables instantaneous recognition of large volumes (potentially billions) of objects in images and videos. The unique feature of this approach is that the objects of interest are not previously known: the system learns to recognise any object instantly having been shown a single example – just like a human would. CVSSP’s excellence in this field was recently demonstrated when it won first place among 218 teams globally in the Google Landmark Retrieval Challenge.
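Recognising an object from a single example is, at heart, a retrieval problem: embed every image as a descriptor vector, then rank the database by similarity to the one query descriptor. The sketch below uses cosine similarity over toy hand-written vectors; in a real system the descriptors would come from a trained network such as REMAP, whose actual representation is not described here.

```python
import math

def cosine(a, b):
    """Cosine similarity between two descriptor vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy database: image name -> descriptor (placeholders, not real CNN outputs).
database = {
    "eiffel_tower": [0.9, 0.1, 0.2],
    "golden_gate":  [0.1, 0.8, 0.3],
    "colosseum":    [0.2, 0.2, 0.9],
}

def retrieve(query, db):
    """Rank database images by descriptor similarity to one query example."""
    return sorted(db, key=lambda name: cosine(query, db[name]), reverse=True)

query = [0.85, 0.15, 0.25]           # descriptor of a single query image
ranking = retrieve(query, database)  # best match first
```

Because recognition reduces to a nearest-neighbour search in descriptor space, no retraining is needed when a new, previously unseen object is added – only its descriptor.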
Language that means something
One concern as we become more automated is that certain groups in society will be marginalised because they are less able to access information and public services. This includes people who are deaf or blind (and those with hearing or visual impairments), as well as speakers of minority languages.
Surrey’s academics are working to make information more accessible in a range of ways. In the Centre for Translation Studies, Professor Sabine Braun and her team have recently explored the potential of AI to provide audio descriptions for visual content which is currently not accessible to visually impaired people – such as on social media. They are also investigating how AI can be used to support, rather than replace, human interpreters by developing intelligent semi-automated systems which reduce their cognitive burden.
Professor Braun explains: “In settings such as healthcare and the law courts, the risks related to misinformation are high. We need to customise systems so that, in a healthcare environment for example, they can understand medical terminology, and cater for people speaking minority or ‘community’ languages. To achieve this, we need human-machine interaction. Otherwise, the technology will make things more unfair.”
Meanwhile experts within Surrey’s Centre for Vision, Speech and Signal Processing are improving accessibility for the deaf community, having successfully pioneered the world’s first machine capable of interpreting sign language in real time.
AI for robotics and autonomy
Surrey’s Centre for Vision, Speech and Signal Processing (CVSSP) and the Space Technology for Autonomous systems and Robotics lab (STAR LAB), part of the School of Mechanical Engineering Sciences, are focused on developing novel autonomous systems which can not only see, but also learn and understand.
In CVSSP’s robotics lab, academics are working to create humanoid robots which use computer vision and AI to interact with the world. Just as human babies learn by exploring the relationship between what they perceive and their actions, these machines follow a similar method of learning. Using the lab’s ‘Baxter’ robot, researchers are currently working with a major retailer to explore how the online shopping process could be optimised if robots were used in fulfilment centres to pick and bag products.
The Centre’s robot lab also houses the Surrey Autonomous Testbed, based on a Renault Twizy: a road-legal, fully sensor-enabled car with autonomous control and on-board processing. This testbed enables researchers to design and validate a novel intelligent control platform for autonomous vehicles that not only drive, but also think safely for themselves, in complex and uncertain driving scenarios.