A start-up aiming to combat gender-based violence is close to completing work on an app, backed by artificial intelligence (AI), to identify ‘victim blaming’ in the justice system.
Tamara Polajnar, chief executive of herEthical AI, said the company’s overall aim was to “improve the culture around crime, so that things are done more efficiently and in a way that helps survivors”.
Ms Polajnar, whose background is in machine learning and natural language processing, said that 18 months ago she worked on a project evaluating a machine-learning algorithm created by West Midlands Police to predict when stalking behaviour would escalate into violent assault. The algorithm has yet to be used.
While working on the project she met Anthony Joslin, a former inspector and innovation lead at Devon and Cornwall Police.
They founded herEthical AI in May this year, with Ms Polajnar as chief executive and Mr Joslin as chief innovation officer. The other founders are psychologist Ruth Spence, chief research officer, and Hazel Sayer, chief communications officer.
The start-up aims to combat gender-based violence by providing machine learning solutions for police forces and third-sector organisations and by providing a consultancy service to the public sector on the use of AI.
It is developing an app to “identify and extract victim-blaming and misogynistic language” from court judgments and transcripts, working with Riverlight, a non-profit organisation based in London which campaigns for domestic abuse survivors.
Riverlight launched the campaign In the Judge’s Words in February to expose the “dehumanising language and attitudes that victims and survivors of abuse have endured from judges and magistrates in family court proceedings”.
Riverlight has described the findings as “deeply disturbing”, with examples including “judges minimising or dismissing abuse as not being ‘that bad’, insinuating that domestic violence is a ‘50/50 thing’, and even stating that a victim had ‘goaded’ the perpetrator into strangling her”.
HerEthical AI and Riverlight are now running a crowdfunding campaign, which has raised £1,200 so far, to enable domestic abuse survivors to buy their court transcripts to feed into “a large language model solution that can identify victim-blaming language and misogyny”.
Mr Joslin said the app was “80-90% complete” and already generating results. It will be accompanied by a “victim-blaming taxonomy” and an academic paper.
He said AI could be used to sift through, summarise and understand routine police intelligence, an area in which every force had a “massive backlog” because it lacked the officers to process it.
He said he was in discussions with a police force about using the app in the context of domestic abuse, to codify text and give it a “score”, enabling recommendations on “how well they are faring” when writing up statements.
The app could highlight where there were problems and where people were “not changing the underlying culture” and were “internalising bias” in the way they wrote investigation reports.
Ms Polajnar said that, as with all AI, the results would need to be checked and staff would need training on “how to be more mindful with victims”.
On the consultancy side, Mr Joslin was keen to talk to senior leaders in the public sector, including the police, Crown Prosecution Service and local authorities, to “articulate the case for why AI is inevitable and they need people with AI skills”.
Ms Polajnar added that herEthical AI was self-funded and there were no immediate plans to raise external funding.
“People are looking to change the culture on domestic abuse and gender violence,” she said. “We would love to make a difference in this area.”