The creepiest ad I've seen all year
Hello! Welcome back to the newsletter, which has, coincidentally, been on hiatus since pubs and restaurants re-opened post-lockdown.
Now on to the creepiest ad I’ve seen in quite some time. I won’t leave you in suspense; here it is:
An ad for an AI companion that casually dehumanises women.
Replika is a chatbot created by Series A start-up Luka. You download an app and speak to your Replika, the idea being that it becomes a friend, helpful for people who are lonely or lacking social connection. Rather than depicting a compassionate friend, however, the ad perpetuates the idea of a woman who has no needs of her own and is subservient to the (likely male) user. Some will claim this ad is ‘funny’ and that I should ‘chill out’. But depictions like this are inherently problematic in a world where some feel entitled to specific behaviour from women, often leading to violence when they are rejected.
This year, violence against women and girls (VAWG) has gained increased public attention, with the brutal murder of Sarah Everard by a serving Metropolitan Police officer being one of over 100 killings of women by men so far this year in the UK. On average, a woman is killed by a man every three days. Frustratingly, VAWG is often framed as a women’s issue (‘don’t walk alone after dark’, ‘don’t wear short skirts’, ‘watch out for your drink being spiked’) rather than a men’s issue (‘don’t abuse women’). Solutions to this problem should centre on the latter.
It doesn’t have to be like this.
Technology should be a positive force in the world, empowering us to solve problems and be better humans. However, it can also exacerbate problems and biases in our society. For example, if someone conveys violent thoughts or behaves abusively towards their Replika, the AI-generated response should aim to teach them that this is wrong, rather than reinforce the idea that they can control women. Currently, Replika is a text- and voice-based chatbot, but as augmented reality (AR) becomes more ubiquitous and we move towards the ‘metaverse’, people will be able to interact with their Replika in what feels like reality. What happens when someone physically abuses their Replika in the AR world?
What do we do?
Algorithms have no moral compass. The people working at tech companies are responsible for designing and advertising their products in a way that does not promote VAWG. The above ad suggests that the Replika team will happily dehumanise women for clicks - particularly sad given that the company was founded by a woman.
There is a wider question here about how we can make ethics work within capitalism. This has been front of mind recently, with Facebook being criticised for prioritising revenue over combating harmful content such as hate speech and misinformation. The questionable ad might be performing well, but no conversion rate should be high enough that it takes precedence over ethics. After all, what is the point of technological progress if we aren’t using it to right what we know to be wrong?
Small edit: ‘violence against women and girls’ (VAWG) is a commonly used term, but it would be more accurate to say ‘male violence against women’. Saying ‘violence against women’ implies that the violence just happens and does not attribute responsibility to the perpetrators of these crimes.
Links: