Fighting online hate crime
Online hate crime is a growing phenomenon. Recent reports by two charities in Wales revealed that disabled people were seeing a marked increase in online abuse. It was a similar story when Galop showed the frightening rise in online abuse suffered by members of the LGBTQ+ community.
These reports were written against the backdrop of a 60 per cent increase in hate crime offences reported to the police between 2013 and 2020. And that doesn’t take into account acts that weren’t reported – online or otherwise – or touch on the racist abuse suffered by several England footballers in the wake of the team's Euro 2020 final defeat.
One Surrey academic, however, is conducting research to analyse and combat this.
Professor Nishanth Sastry arrived in Surrey after studying and working in India and the USA.
“I investigate different kinds of networks,” says Nishanth. “I started off by looking at computer networks. More recently, I’ve been examining social networks as well.
“Both are constructed in similar ways. Single entities, such as people in social networks or nodes in computer networks, are linked together by mechanisms, like friendships in the former or wired and wireless connections in the latter.
“My approach to analysing either is to take a large data set and look for patterns, then see how we can improve how it operates or learn about its underlying mechanics.”
And it’s this approach that led to his work on hate crime.
Examining online hate crime
“I’m currently working on two projects involving online hate crime,” reveals Nishanth. “The first is with The Alan Turing Institute.
“There are many different platforms and tools for detecting hate speech online, but they don’t get benchmarked against each other. So, if an activist wants to combat hatred or if an academic wants to investigate it, which platforms or tools should they use?
“We’ve created a meta tool capable of applying multiple tools to obtain a combined output, which can be tuned to the requirements of different users or application scenarios.”
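The article doesn't describe the internals of the meta tool, but the idea of combining several detectors into one tunable output can be sketched as a weighted ensemble. The two toy detectors and the weights below are purely illustrative stand-ins, not the Turing Institute tool itself:

```python
# Illustrative sketch of a "meta tool": several hate-speech detectors are
# combined into one score, with weights tunable per user or application.
# The two detectors below are keyword stubs standing in for real tools.

def detector_slurs(text: str) -> float:
    """Toy detector: flags texts containing words from a tiny blocklist."""
    blocklist = {"hateword"}  # placeholder vocabulary
    return 1.0 if any(w in text.lower().split() for w in blocklist) else 0.0

def detector_shouting(text: str) -> float:
    """Toy detector: treats all-caps messages as aggressive."""
    letters = [c for c in text if c.isalpha()]
    return 1.0 if letters and all(c.isupper() for c in letters) else 0.0

def meta_score(text: str, weights: dict) -> float:
    """Weighted average of all detector outputs, giving a score in [0, 1]."""
    detectors = {"slurs": detector_slurs, "shouting": detector_shouting}
    total = sum(weights.values())
    return sum(weights[name] * fn(text) for name, fn in detectors.items()) / total

# Different users tune the weights: a moderator might weight slurs heavily,
# while a researcher studying tone could emphasise the shouting signal.
moderator_weights = {"slurs": 0.9, "shouting": 0.1}
print(meta_score("this contains a HATEWORD", moderator_weights))
```

Tuning here is just a weight dictionary; a real benchmarking tool would also compare the individual detectors against labelled test data.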
MPs and online hate crime
“The second project is an offshoot of that: we use the meta tool to analyse the type of hate speech MPs receive.
“There are many kinds of hate speech. There’s the overt and criminal type that uses abusive and offensive language. It can be directed at female MPs because of their sex and it can involve racial slurs if MPs are from BAME backgrounds.
“But there’s more subtle hate speech that isn’t legislated for. And we’re interested in that, too.
“We have a large dataset of tweets by and directed towards the MPs who are on Twitter, and we’re investigating that. The size of the dataset allows us to place comments in the context of longer exchanges, which may affect how we view them.
“Ultimately, it will help us understand what types of situations or exchanges create hate speech. It may also help us draft guidelines for diminishing the likelihood of hate speech occurring more widely.”
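The article doesn't detail how the team places a tweet in its wider exchange, but one simple way to recover that context is to walk the reply chain back to the root of the conversation. The data layout and field names below are assumptions for illustration, not Twitter's actual schema:

```python
# Sketch: reconstruct the exchange a tweet sits in by following reply
# links back to the root. Field names are illustrative placeholders.

tweets = {
    1: {"text": "Announcing our new policy.", "reply_to": None},
    2: {"text": "This will hurt my community.", "reply_to": 1},
    3: {"text": "Nobody cares what you think.", "reply_to": 2},
}

def thread_context(tweet_id: int, tweets: dict) -> list:
    """Return the chain of texts from the root tweet down to tweet_id."""
    chain = []
    current = tweet_id
    while current is not None:
        chain.append(tweets[current]["text"])
        current = tweets[current]["reply_to"]
    return list(reversed(chain))

# A classifier seeing tweet 3 on its own misses that it dismisses a
# concern; with the reconstructed thread, the full exchange is visible.
print(thread_context(3, tweets))
```

Whether a comment reads as hostile can depend entirely on the exchange above it, which is why the dataset's size and connectedness matter.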
Nishanth admits his career in computer science happened by accident.
He explains: “I did well in an end-of-school exam and I didn’t know what to do next. I took a computer science degree because it was the preferred subject to take at the time. I’d never even used a computer before and I didn’t have one in my house until my second year of undergraduate study.”
He also says he’s not sure where this journey will take him next. But he wouldn’t be surprised if the location was cold!
“I was born in Bangalore, which is essentially the Silicon Valley of India. After my degree, I joined US company Cisco Systems, who’d set up offices in Bangalore.
“A year later, I went to the University of Texas in Austin to study for a masters. A job at IBM in Boston, Massachusetts, followed. Then I went to Cambridge in the UK to study for my PhD.
“Apart from a stint in London and my current role at Surrey, I’ve been heading increasingly northwards. Who knows? I may end up teaching at the North Pole at some point!”
Find out more about studying in our Department of Computer Science.