Deepfake porn will soon be so easy to produce that creating the sinister vids will be as simple as using Instagram filters, a tech expert has warned.
The rise in deepfake videos has been swift, with celebrities and politicians alike finding themselves duplicated into disturbingly lifelike clips.
A recent video showing comedian Bill Hader morphing into Tom Cruise has received millions of views online, spooking viewers with its uncanny accuracy.
Now ordinary citizens may find themselves on the receiving end of the digital fakery, with Shamir Allibhai, CEO of video verification company Amber, telling the Daily Star that "it will soon be as easy to create a fake as it is to add an Instagram filter".
Allibhai warned: "Women will be the primary target of the weaponisation of this deepfake technology."
While fake pornography featuring celebrities is commonplace, Allibhai said the evolution of deepfake software will lead to ordinary people being caught up in the trend, with devastating consequences.
It could lead to an increase in revenge porn cases and wreck relationships.
"The havoc is two-fold. At a primary level, relationships will be broken, people will be blackmailed," Allibhai told the Daily Star.
"On a deeper level, society will become cynical if we don't have video veracity solutions in effect."
"We will evolve to become distrusting and view everything with scepticism.
"And when this happens, it will chip at trust among citizens, a foundation of democracies."
The CEO of Instagram was slammed earlier this year for refusing to remove deepfake videos of celebrities, a decision he said would have been "inappropriate".
Adam Mosseri told CBS that he didn't "feel good about it" but wouldn't be taking the clips down because the "damage was done".
"We are trying to evaluate if we wanted to do that and, if so, how you would define deepfakes," he said.
"If a million people see a video like that in the first 24 hours or the first 48 hours, the damage is done. So that conversation, though very important, currently, is moot."