AI can be great, but with Taylor Swift falling victim to deepfakes, how can the rest of us protect ourselves?


Taylor Swift’s globally recognised face was recently superimposed onto pornographic pictures. And as advertising copywriter Lizzie Hutchison rightly warns, if it can happen to her, it can happen to anyone. 

A few months ago a mate and I were mucking about at work, coming up with ads for fake products. Our favourite was a massage oil called ‘Consensual’. The tagline? ‘For when she’s absolutely, definitely said yes!’ 

But in light of recent deepfake stories, this doesn’t seem so funny. A few weeks ago, Taylor Swift had her face superimposed onto pornographic images that were then shared on X, with one photo racking up 47m views before it was removed. Grim.

I went to parliament (surprised they let me in with my Barbie phone case, tbh) to watch a screening of ‘My Blonde GF’, a documentary short film that tells Helen Mort’s experience of deepfake and image-based online abuse. What struck me was how interlinked power and consent are.
