Act MP Laura McClure’s Bill to criminalise sexually explicit “deepfake” images has been drawn from the biscuit tin.
The Deepfake Digital Harm and Exploitation Bill would amend existing laws to expand the definition of an “intimate visual recording”.
It would widen what counts as a “recording” to include images or videos that are created, synthesised or altered to depict a person’s likeness in intimate contexts without their consent.
“Since I lodged my bill, I’ve heard from victims who’ve had their lives derailed by deepfake abuse,” McClure said.
Laura McClure, who has herself been the target of a deepfake AI nude image, emphasises the need for action, citing rising complaints and the impact on victims and schools. Photo / Supplied
McClure said Parliament now had an opportunity to empower victims of deepfake abuse with a clear pathway towards the removal of the images and the prosecution of their abusers.
“My bill does not seek to ban a whole subset of technology or create a new regulatory regime. It is a simple tweak to existing laws to address a specific, well-defined and understood problem.