
Abusers are exploiting all manner of smart tech and software to extend their capacity for coercive control


Perpetrators of domestic abuse are increasingly exploiting digital tools to coerce and control their victims. Where there is abuse in a relationship, technology will also feature in how that abuse is conducted. Police forces now expect as much when responding to cases of domestic abuse.

Such technological abuse relies on everyday tools, from smart devices to online platforms and mobile phone apps. And information on where to find these tools and how to use them is easily accessible online, often via a simple Google search.

To understand the extent of this problem, we conducted a wide-ranging study for the UK government. We reviewed 146 domestic abuse cases reported in British and international media, and conducted in-depth interviews with support charity workers and frontline police officers in England.

We found that abusers often have physical access to their partners’ devices and use them to monitor, harass and humiliate. Abusers can force their victims to disclose passwords, PIN codes or swipe patterns to get into their devices so they can install spyware – all without sophisticated tech knowledge.

Geolocation software and other surveillance spyware provide new possibilities for abusers to monitor and track victims’ movements. In our study, we found hundreds of tools online that could be used for these purposes.

Surveillance

Some apps hint at the possibility of hidden surveillance. One survey found a 93% increase in the use of spyware and “stalkerware” apps since the beginning of the pandemic.

We also found tracking apps designed for legitimate purposes, such as child protection or anti-theft protection, that are widely available on equally legitimate sites and app stores. Research shows these have been exploited to spy on, or reportedly to stalk, a partner or ex-partner. Studies now refer to them as dual-use apps.

Similar concerns have been voiced about covert monitoring devices and smart tech such as Apple’s AirTags. These small Bluetooth devices are designed to be paired with tracking apps for finding lost belongings, such as car keys. But stalkers have reportedly exploited them too.

It’s not just phones and tracking devices that are being used to access personal information. Smart locks, thermostats, networked TV and sound systems, and security monitoring equipment are also being exploited to control and terrify victims – to monitor their movements and any visits they get.

Further, where an abuser has access to cloud-based voice services, they will be able to access past conversations, order information and other data that might give them insights into a victim’s plans, potentially including any plans to leave.

Harassment

We found that fake accounts on online platforms and social media are often set up with abusive intent. They can be used to present the victim in a derogatory manner. A man in Liverpool was jailed after he listed his ex-girlfriend’s workplace in accounts set up in her name on swinger and dating platforms.

Legally, this is a grey area. Hacking a person’s account is a clear criminal offence, while impersonating someone to create a fake account is not. In some but not all instances, it can be argued that doing so constitutes cyber-harassment.

A case in point is the man who, in 2018, reportedly set up a fraudulent Facebook profile of his ex-wife in which he falsely claimed she fantasised about being raped. Because he included contact details in the profile, a random stranger turned up at her workplace to meet her.

Similarly, in 2017, another man allegedly set up fake Grindr accounts in the name of his ex-boyfriend. Over 1,000 men turned up at the victim’s house and workplace, looking for sex.

Elsewhere, perpetrators are engaging in image-based sexual abuse, threatening to release intimate pictures or videos to retain control over their victims.

In other instances, we noted that perpetrators have set up fake social media profiles of their victims and used them to disseminate intimate images. They have also distributed such material by sending it directly to friends, family and employers, or by publishing it openly online.

The term “revenge porn” is widely understood as the sharing or distribution of nude or sexual images by jilted ex-lovers whose primary motivations are revenge or retribution. It does not, however, capture the full range of motivations under which perpetrators might be operating, from blackmail and extortion to control, sexual gratification, voyeurism, social-status building and monetary gain. It also focuses attention on the content of the image, rather than on the abusive actions of perpetrators who misuse nude or sexual images.

Technological abuse does not require IT proficiency. Perpetrators are using everyday, affordable, accessible tech. What we need is a better, more accurate definition of what constitutes domestic abuse, and support services that are equipped to deal with it.

If you or anyone you know has been a victim of any of the abuse discussed above, there is help available. Please reach out to Refuge’s freephone 24-hour National Domestic Abuse Helpline on 0808 2000 247, or visit www.nationaldahelpline.org.uk (live chat available Mon-Fri, 3-10pm).

Dr Lisa Sugiura is Senior Lecturer and Postgraduate Programme Area Leader at the School of Criminology and Criminal Justice in the Faculty of Humanities and Social Sciences.

This article is republished from The Conversation under a Creative Commons Licence. Read the original article.

