Posts

Villasenor on Strategies to Guard Against ‘Deepfakes’

Public Policy Professor John Villasenor spoke to the Brookings Cafeteria podcast about strategies that voters and other consumers of digital media can adopt to guard against “deepfakes” — videos manipulated with artificial intelligence technology to deceive, parody or, sometimes, educate. “Anybody who has a computer and access to the Internet is in a position to produce deepfakes,” Villasenor said, but he added that the technology to detect the doctored videos is also quickly evolving. He urged consumers of digital media to “unlearn what we’ve learned since we were all small, which is usually seeing is believing. … Deepfakes scramble that understanding.” Even if a video is clearly fake, he said, “the visual imagery is very powerful and so I think it’s a big concern.” Villasenor is a professor of management, law and electrical engineering, in addition to public policy. He is a nonresident senior fellow at the Brookings Institution.


Villasenor on Easy Access to Powerful Technology

Public Policy Professor John Villasenor spoke to Business Insider about “deepfakes,” phony videos and digital images manipulated using artificial intelligence. Easy access to both the technology to alter videos and the platforms to distribute them widely has heightened concern about deepfakes, Villasenor said. “Everyone’s a global broadcaster now. So I think it’s those two things together that create a fundamentally different landscape than we had when Photoshop came out,” he said. Altered videos can be used in satire and entertainment, creating complications for legal efforts to crack down on malicious users. Time constraints are also an issue, Villasenor said, citing deepfakes used in political attacks. “Election cycles are influenced over the course of sometimes days or even hours with social media, so if someone wants to take legal action that could take weeks or even months,” he said. “And in many cases, the damage may have already been done.”


Villasenor on ‘Deepfakes,’ Free Speech and the 2020 Race

Public Policy Professor John Villasenor narrated a short Atlantic video on the proliferation of “deepfakes,” videos and audio manipulated using sophisticated technology to convincingly present fiction as fact. Deepfakes are “engineered to further undermine our ability to decide what is true and what is not true,” he said. “We are crossing over into an era where we have to be skeptical of what we see on video.” Villasenor, who studies the intersection of digital technology with public policy and the law, predicted that deepfakes will be used to deceive voters during the 2020 presidential campaign yet cautioned against aggressive laws to rein them in. While the technology could harm targeted individuals, the First Amendment protects free expression, including many forms of parody, he said. “As concerning as this technology is, I think it’s important not to rush a whole raft of new laws into place because we risk overcorrecting,” Villasenor said.


Villasenor on ‘Deepfakes’ and the Uncertainty of Truth

Public Policy Professor John Villasenor wrote a piece for the Brookings Institution on “deepfakes” and the uncertainty they create about what is true. Villasenor defined deepfakes as intentionally manipulated videos that make a person appear to say or do something they, in fact, did not. He suggested three strategies to address the problem: deepfake detection technology, legal and legislative remedies, and increased public awareness. Artificial intelligence could detect image inconsistencies introduced by video manipulation, he said, adding that legal and legislative actions must strike a balance that protects people from deepfakes without overstepping. Viewers themselves can combat deepfakes by refusing to assume questionable videos are real, he said. “That knowledge won’t stop deepfakes, but it can certainly help blunt their impact,” he said. Villasenor is currently a nonresident senior fellow in Governance Studies and the Center for Technology Innovation at the Brookings Institution.