A new study funded by the Law Foundation highlights the risk of "deep fakes" on social media - which co-author Tom Barraclough says "make it look or sound like something happened when it didn't".
Examples range from a synthetic, younger version of Carrie Fisher being inserted into Rogue One: A Star Wars Story to popular Snapchat face-swapping filters to pornographers putting famous Hollywood stars' faces onto adult actors in X-rated scenes.
But Barraclough says we have multiple laws and guidelines that already cater to the risk - primarily the Crimes Act, which covers when deception is used for gain; the Harmful Digital Communications Act, which covers when it's used for malice; and the Privacy Act, because "the wrong personal information is still personal information".
Barraclough and his co-author Curtis Barnes (both lawyers turned researchers at the Brainbox Institute) warn that fake is a slippery concept.
"One person's great satire is another person's deceptive factual record," Barraclough tells the Herald.
Barraclough and Barnes' report emerged, in part, from the fear that a deep fake attack on a political or business leader would lead to what he calls a "kneejerk" reaction from lawmakers.
Specific legislation targeting deep fakes, such as the Malicious Deep Fake Prohibition Act introduced to the US Senate last year, risks violating human rights, Barraclough says.
"While there are legitimate harms that will justify getting the law involved, there are also human rights risks and risks of over-reach certain forms of human expression."
He says most of the time, "deep fake" technology is used playfully, or as satire.
Exhibit A for his human expression argument is local satirist Tom Sainsbury, who uses a face-swapping app for his satirical political clips on Facebook - often targeting Simon Bridges or Paula Bennett.
Deep fake technology can be convincing - check out this clip of "Barack Obama" (warning: language) - or adept - watch this effort by US company DeepTrace, which puts Steve Buscemi's face onto actress Jennifer Lawrence's body to show off its chops.
But its ever-evolving nature makes it difficult for the likes of Facebook to trace and police.
"We have in-built biological trust in the data derived by our eyes and ears," he says.
But he and Barnes also credit the general public with some nous.
"People understand the limits to which what they see and hear through video and audio recording is only a partial representation of reality," he says.
The biggest danger, the researcher says, "is probably over-skepticism".
If something like the Jami-Lee Ross audio recordings of Simon Bridges were released in future, people might not believe it was genuine, he says.
He also fears that any legislation targeting deep fakes would hinder the ability of oppressed groups to find a multimedia voice on social media.
See full report: