Arizona lawmakers want truth in advertising when it comes to political deep fakes

By Howard Fischer
Capitol Media Services

PHOENIX -- Just in time for campaign season: another new state law designed to help voters figure out which of the images they are seeing are real and which are not.
Gov. Katie Hobbs has signed legislation that requires that anything that purports to show a candidate within 60 days of an election have "clear and conspicuous disclosure'' of whether it is actually a "deepfake.'' But it remains unclear whether SB 1359 is constitutional.
The move comes a week after Hobbs penned her approval to HB 2394, a measure designed to protect candidates.
That bill, crafted by Rep. Alexander Kolodin, R-Scottsdale, allows those who are running for office to go to court to get a judicial declaration that the person in the picture, video or audio is not them.
But unlike the measure signed earlier this week, there is no penalty. And a court cannot order it removed from the web or airwaves.
At the heart of both measures is artificial intelligence and, more specifically, the ability to use computers to create a picture, video or recording of someone else without their knowledge or consent. More to the point, according to Sen. Frank Carroll, is the ability to put words into someone else's mouth.
"Say I wanted to make a deep fake of Chairman (Quang) Nguyen,'' he told members of the House Judiciary Committee, which the Prescott Valley Republican chairs.
He said the technology would be able to find various images and voice samples of Nguyen "because we're all tied to the internet'' through everything from cell phones to social media.
"So I can produce something that looks like Chairman Nguyen, except I can cause it to do something uncharacteristic or untrue about him,'' Caroll testified.
"I can make him the most evil person in the world if I so describe it,'' he said. "But it's not really him that's doing it.''
His SB 1359, signed by Hobbs, goes a step beyond what Kolodin got enacted.
It spells out that anyone who distributes a "synthetic media message'' that the person knows is a "deceptive and fraudulent deepfake'' of a candidate on the ballot in the next 90 days must also provide "clear and conspicuous disclosure'' that it was generated using artificial intelligence.
And it further defines "deceptive and fraudulent'' as something that the producer knows is false, that is made with the intent to injure the candidate's reputation, and that is "intentionally calculated to mislead a reasonable person into concluding that a real individual said or did something that they did not say or do in reality.''
Marilyn Rodriguez told lawmakers she understands the concerns of those pushing new state laws. But the problem, according to the lobbyist for the American Civil Liberties Union of Arizona, is that the law itself is unconstitutional.
"We do not think it's going to pass constitutional muster in court,'' Rodriguez said.
The key, she said, is that the U.S. Supreme Court has concluded that speech -- even false speech -- is protected by the First Amendment. The fact that this "speech'' is being produced through technology, Rodriguez argued, is legally irrelevant.
"As a general matter, the First Amendment means the government has no power to restrict expression because of its message, its ideas, its subject matter or its content,'' she said.
"Speech cannot be censored simply because it is misleading,'' Rodriguez said. "The fact that a deepfake or recording depicts something that is untrue does not rob it of its First Amendment protections.''
She said there are constitutionally permissible restrictions on such speech. But Rodriguez said it has to fall within the parameters of things like harassment, fraud, extortion or "true threats.''
As to election matters, she said the courts have allowed some government regulation.
But here, too, Rodriguez said, there has to be some "compelling governmental interest,'' such as precluding messages that mislead people about voting requirements and procedures and actions designed to disenfranchise voters. Even then, she said, such restrictions kick in only when there is intentional misrepresentation of voter information -- and not about candidates or their positions.
Kolodin, however, said those arguments miss the point.
"False speech is absolutely First Amendment protected,'' he said. But the issue, said Kolodin, is not whether a statement is false but the fact that someone is being impersonated -- that someone is holding them self, through a deep fake, to be someone else.
"It's like if you signed my name on a loan document,'' he said. "That's what we're trying to target and prevent.''
"That is an issue,'' Rodriguez conceded. But she said that there need to be guardrails.
For example, Rodriguez noted, there already is a state law against criminal impersonation. But that law, she said, requires a showing that the impersonation was done with the intent to defraud.
In fact, Rodriguez said the ACLU is supportive of SB 1078, another measure still making its way through the legislative process. That proposal by Sen. John Kavanagh, R-Fountain Hills, builds on that existing law and adds that it also would apply to using a computer-generated voice recording, image or video of another person "with intent to defraud other persons or with intent to harass other persons.''
Rep. Justin Heap, R-Mesa, acknowledged there is a 2002 U.S. Supreme Court case that does protect false speech. But he questioned whether the current justices would reach the same conclusion.
"We have a completely new technology,'' Heap said.
"This is video of someone with their voice speaking, saying something they didn't say,'' he said. "To me, it seems this brings in an entirely different dynamic.''
And, if nothing else, Heap said, a law like this would provide a basis for the Supreme Court to revisit its precedent.
"I actually hope this gets challenged because I think we need the court to give us some guidance on this and on new things,'' he said.
Don't expect to start seeing those disclosure requirements right away. The law Hobbs signed will not take effect until at least the beginning of September, long after the state's July 30 primary election.
The other question is how effective the law will be in ensuring disclosure.
As originally introduced, Carroll's bill carried a criminal penalty. But that was stripped out to gain support, replaced by a fine of $10 a day for the first 15 days and $25 daily after that.
----
On X and Threads: @azcapmedia