Crime, law and regulation
Interested in technology meeting crime, law and regulation?
Crime, law and regulation research
Surrey Centre for Cyber Security is conducting cryptography research. Offering an arsenal of techniques for effective protection against the growing cyber threat, cryptography is integral to many of the Centre's research projects. With outstanding expertise in this field, its researchers focus on advanced techniques and applications such as high-function encryption schemes and digital signatures, authentication and key exchange protocols, and cryptographic solutions for privacy-preserving identity management, secure data sharing and information exchange.
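The key exchange protocols mentioned above can be illustrated with the classic Diffie-Hellman construction. The sketch below is not part of the Centre's work; it is a minimal Python demonstration with deliberately tiny, insecure parameters, included only to show how two parties can agree a shared secret over a public channel.

```python
import hashlib
import secrets

# Toy public parameters for illustration only. A real deployment would
# use a 2048-bit-plus group (e.g. RFC 3526) or an elliptic curve.
P = 23  # small prime modulus -- NOT secure
G = 5   # generator of the group

def keypair():
    """Pick a private exponent and publish g^priv mod p."""
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

a_priv, a_pub = keypair()   # Alice
b_priv, b_pub = keypair()   # Bob

# Each side combines its own private key with the other's public value;
# both arrive at g^(a*b) mod p without ever sending a secret.
a_shared = pow(b_pub, a_priv, P)
b_shared = pow(a_pub, b_priv, P)
assert a_shared == b_shared

# Derive a symmetric key from the shared secret with a hash function.
key = hashlib.sha256(str(a_shared).encode()).hexdigest()
```

In practice the raw shared secret is never used directly; it is fed through a key-derivation function, as the final hashing step gestures at here.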
Michael McGuire has published Cybercrime 4.0, which responds to the claim that criminology is becoming socially and politically irrelevant despite its exponential expansion as an academic sub-discipline.
AI at Surrey is conducting research on ethics, law and regulation in AI, developing ethical, legal and regulatory frameworks to protect people, businesses and society from the negative impacts of AI.
Michael McGuire has researched technology, crime and justice, considering how the 'rights' and 'wrongs' of technological use can be adequately categorised. To date, the scope of such questions has been limited, focused on specific technologies such as the internet.
In the court
The Surrey Law and Technology Hub is building algorithmic tools to better understand the law in action by predicting court case outcomes with machine learning. The United Kingdom lags far behind the leading jurisdictions in access to the necessary data (especially the texts of court decisions and the written arguments filed by litigants), and the Hub's planned activities involve both the digitisation and publication of some types of court judgments.
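To give a flavour of outcome prediction from judgment text, here is a self-contained sketch of a multinomial Naive Bayes classifier in pure Python. The corpus, labels and vocabulary are entirely hypothetical toy data, not the Hub's dataset or method; real work of this kind uses large corpora of court decisions and far richer models.

```python
import math
from collections import Counter, defaultdict

# Hypothetical toy corpus: judgment summaries labelled by winning party.
train = [
    ("claimant evidence accepted damages awarded", "claimant"),
    ("appeal allowed damages awarded to claimant", "claimant"),
    ("claim dismissed insufficient evidence", "defendant"),
    ("appeal dismissed costs awarded to defendant", "defendant"),
]

# Count word frequencies per class (multinomial Naive Bayes).
word_counts = defaultdict(Counter)
class_counts = Counter()
vocab = set()
for text, label in train:
    words = text.split()
    word_counts[label].update(words)
    class_counts[label] += 1
    vocab.update(words)

def predict(text):
    """Return the most probable class under Naive Bayes."""
    scores = {}
    for label in class_counts:
        # Log prior for the class.
        score = math.log(class_counts[label] / sum(class_counts.values()))
        total = sum(word_counts[label].values())
        for w in text.split():
            # Laplace smoothing over the shared vocabulary.
            score += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(predict("evidence accepted damages awarded"))  # prints "claimant"
```

The design point is that even a bag-of-words model can pick up outcome-correlated vocabulary ("dismissed", "awarded"), which is why access to judgment texts matters so much.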
University of Surrey’s Centre for Translation Studies has been commissioned by the Office of the Sussex Police and Crime Commissioner to undertake an independent evaluation of the Video Enabled Justice (VEJ) programme.
How online technologies are transforming transnational organised crime (Cyber-TNOC) was researched by Giulia Berlusconi. The project was led by Cardiff University and funded by the Economic and Social Research Council (ESRC). Project leads were Professor Mike Levi, Dr Luca Giommoni, Professor Matthew Williams and Professor Pete Burnap from Cardiff University. Other contributors included Dr Giulia Berlusconi at the University of Surrey, Dr Alberto Aziani at the Università Cattolica del Sacro Cuore and Dr David Décary-Hétu at the University of Montreal.
Melissa Hamilton is researching police robots and the law. On 7 July 2016, Dallas officials used a robot to kill the suspected shooter of five police officers, believed to be the first time in domestic policing that a robot carried out an act of lethal force. After a prolonged gun battle and hours of negotiations with police, the suspect, then holed up in a confined area of a college building, continued to threaten to kill more and claimed to have planted bombs nearby. Investigating officers decided the time had come to incapacitate him. The police secured a one-pound chunk of C4 plastic explosive to the extendable arm of a militarised robot, then sent the robot into the confined area where the suspect was concealing himself. The police officer operating the robot remotely detonated the explosive and, as intended, the suspect was immediately killed.
Privacy and security in AI addresses challenges in cybersecurity and biometrics, protecting the privacy of individuals and organisations by fusing blockchain with AI.
Centre for Cyber Security is conducting privacy and authentication research. Mobile communications are driving improved benefits for consumers across many sectors, but these services can pose challenges in terms of preserving privacy. Their work on privacy and authentication aims to build protection mechanisms into emerging technology in the rail and automotive transport sectors.
Security research by the Centre for Vision, Speech and Signal Processing (CVSSP) examines biometrics-related technologies, specialising in facial recognition and natural language interfaces for human-AI collaboration. Building on their expertise in face analysis techniques, they are developing biometric solutions that exploit information such as face shape and texture, lip dynamics and soft biometrics.
Surrey Blockchain’s TAPESTRY project asks: what if we could harness the complex longitudinal and multi-modal signals within the digital personhood (DP) to support safe online interaction by citizens and consumers? The signals created through our digital interactions - photos shared, comments left, posts 'liked' etc. - weave a complex 'tapestry' reflecting our relationships, personality and identity. Yet, this 'digital personhood' is more than a basis for communication; commodification of the DP now fuels a billion-dollar industry in which AI is increasingly utilised to help make sense of, and extract value from the deluge of DP data.
Security verification – obtaining mathematical proofs about the behaviour of systems and pieces of hardware – ensures that formal guarantees are built into emerging technologies. Researchers at Surrey Centre for Cyber Security are experts in correctness, liveness (ensuring that the technology is always ready to take the next step) and the verification of security properties (ensuring that the system never behaves in an insecure way).
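The safety and liveness properties described above can be illustrated with a toy explicit-state model checker. This sketch is not the Centre's tooling (formal work uses provers and industrial model checkers); it exhaustively explores the states of a hypothetical two-process mutex that alternates on a shared `turn` variable, checking that both processes are never in the critical section at once and that no reachable state is deadlocked.

```python
from collections import deque

# State of the toy protocol: (pc0, pc1, turn), where each program
# counter is "idle" or "crit" and `turn` says whose go it is.

def successors(state):
    """All states reachable in one step of the alternating mutex."""
    pc0, pc1, turn = state
    nxt = []
    if pc0 == "idle" and turn == 0:          # process 0 may enter
        nxt.append(("crit", pc1, turn))
    if pc0 == "crit":                        # leave and pass the turn
        nxt.append(("idle", pc1, 1))
    if pc1 == "idle" and turn == 1:          # process 1 may enter
        nxt.append((pc0, "crit", turn))
    if pc1 == "crit":
        nxt.append((pc0, "idle", 0))
    return nxt

def check():
    """Breadth-first search over all reachable states."""
    init = ("idle", "idle", 0)
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        pc0, pc1, _ = s
        # Safety: never both processes in the critical section.
        assert not (pc0 == "crit" and pc1 == "crit"), f"violation in {s}"
        # Crude liveness check: every reachable state has a successor.
        succs = successors(s)
        assert succs, f"deadlock in {s}"
        for t in succs:
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return len(seen)

print(check())  # prints 4: the protocol has four reachable states
```

Real liveness verification is stronger than this deadlock check (it must rule out infinite executions in which a process is starved), but the exhaustive-exploration idea is the same.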
Interaction on Twitter has been part of Professor Nishanth Sastry's research on digital citizen engagement. Working with the Director of Research at the House of Commons Library, Professor Sastry and his student have identified patterns in how MPs and citizens talk to each other on Twitter.
‘Sexting’ – the production and distribution of sexually suggestive images and messages using mobile phones and other technological devices – has been the focus of Emily Setty's research on youth sexting: replacing abstinence with ethics. Sexting involving young people, specifically those aged under 18, has attracted extensive attention from the media and, in wider public discourse, has come to be defined as a worrying social problem (or 'new norm') that urgently needs to be addressed.
Paul Hodkinson has conducted research titled 'Bedrooms and Beyond: Youth, Identity and Privacy on Social Network Sites'. The study considers young people's identities and privacy on social network sites, reflecting on the analogy of the teenage bedroom as a means of understanding such spaces.