Privacy and security
Addressing challenges in cybersecurity and biometrics. Protecting the privacy of individuals and organisations by fusing blockchain with AI.
Does AI make us safer, or less safe?
AI poses a huge challenge in terms of cybersecurity because it involves vast amounts of data which may be highly sensitive (such as medical records). However, AI is also improving our security online through blockchain, which enables data to be stored in a tamper-proof way without relying on a single person or organisation.
The University’s Surrey Centre for Cyber Security, an Academic Centre of Excellence in Cyber Security Research recognised by the UK government’s National Cyber Security Centre, is finding innovative ways to protect the privacy of individuals and organisations.
One example is its use of AI techniques to tackle online safety issues such as predatory grooming or production of first-generation indecent images of children. The Centre’s spin-out company Securium Ltd has developed ways of picking up nuances in chatroom and livestreamed communications, separating in real time what would otherwise appear to be a normal online conversation from one in which a child is being groomed.
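The idea of separating an ordinary conversation from a harmful one in real time can be illustrated with a toy message-scoring loop. This is a hypothetical sketch, not Securium's actual model: the cue phrases, weights, sliding window and threshold below are all illustrative assumptions standing in for a trained classifier.

```python
from collections import deque

# Illustrative cue phrases with risk weights; a real system would use a
# trained language model, not a hand-written keyword list.
RISK_CUES = {"secret": 2, "don't tell": 3, "how old": 1, "alone": 2, "photo": 2}

def message_risk(text: str) -> int:
    """Score a single message against the cue list."""
    t = text.lower()
    return sum(weight for cue, weight in RISK_CUES.items() if cue in t)

class ConversationMonitor:
    """Keeps a sliding window of recent message scores and flags the
    conversation when the accumulated risk crosses a threshold."""

    def __init__(self, window: int = 10, threshold: int = 5):
        self.scores = deque(maxlen=window)  # oldest scores drop off automatically
        self.threshold = threshold

    def observe(self, text: str) -> bool:
        """Ingest one message; return True if the conversation should be flagged."""
        self.scores.append(message_risk(text))
        return sum(self.scores) >= self.threshold
```

In use, each incoming chat message is passed to `observe`, so an alert can be raised while the conversation is still in progress rather than after the fact.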
Fusing blockchain with AI for the public good
While much blockchain research focuses on cryptocurrencies, Surrey is taking a unique approach in this area by fusing blockchain (trusted data) with AI (making sense of that data) for the public good. Surrey Blockchain is at the forefront of Distributed Ledger Technology (DLT), with a cross-disciplinary research portfolio that is transforming domains as diverse as digital records, healthcare, electronic voting, trust and identity.
Within this network researchers are exploring novel AI techniques for fingerprinting or ‘hashing’ data, and entrusting these hashes to DLTs to create a time-stamped, immutable record of the data. Applications include proving the provenance of images and video (for example to tamper-proof public archives and fight against deep fakes and dissemination of fabricated news), and proving an individual’s identity through behavioural fingerprinting of their social media activity.
As well as protecting us online, AI is helping to keep society safe in the physical world. Facial recognition technology is already widely used, for example at UK airports, but to date it has been effective only when subjects look directly into a camera and stand completely still.
Biometrics experts within Surrey’s Centre for Vision, Speech and Signal Processing are addressing this challenge by developing solutions that exploit information such as face shape and texture, lip dynamics and soft biometrics to provide a more advanced solution. The Centre’s vision of developing ‘unconstrained’ facial recognition – which works regardless of image quality, lighting, environment, pose and other factors – is being brought to fruition under the EPSRC FACER2VN research programme in collaboration with the Home Office’s Centre for Applied Science and Technology.