John Villasenor

John Villasenor is a professor of electrical engineering, public policy, law, and management at UCLA, a nonresident senior fellow at the Brookings Institution, and a National Fellow at the Hoover Institution. He is also a member of the Council on Foreign Relations, and an affiliate of the Center for International Security and Cooperation (CISAC) at Stanford. Villasenor’s work considers the technology, policy, and legal issues arising from key technology trends including the growth of artificial intelligence, the increasing complexity and interdependence of today’s networks and systems, and continued advances in computing and communications.

He has written for the Atlantic, Billboard, the Chronicle of Higher Education, Fast Company, Forbes, the Huffington Post, the Los Angeles Times, the New York Times, Scientific American, Slate, and the Washington Post, and for many academic journals. Prior to joining the faculty at UCLA, Villasenor was with the NASA Jet Propulsion Laboratory, where he developed methods of imaging the earth from space. He holds a B.S. from the University of Virginia, and an M.S. and Ph.D. from Stanford University.

For more information, please visit Professor Villasenor’s personal page.

John Villasenor on digital media sales, hardware hacking, and banking for the poor

Research on digital security and risk assessment


By Angel Ibanez
UCLA Luskin Student Writer

Public Policy and Electrical Engineering professor John Villasenor was recently featured in the media on the topics of selling used digital media and the growing danger of hacked hardware. He also co-wrote a blog post for the Brookings Institution on the role of the global financial system in helping the poorest and most vulnerable.

In the article titled “Secondhand Downloads: Will Used E-Books and Digital Games Be for Sale?,” published by Bloomberg News, reporter Joshua Brustein explained that the mechanics of selling used digital media are not clearly settled and possibly not legal. Professor Villasenor offered one potential solution to some of these issues in music: establishing a “short-term online lending library” for songs.

Through this short-term lending library, the owner of the song would “lose access whenever someone else listened to the song he contributed.” When capitalizing on the song, the recording artist would only be able to “sell the number of copies of a song equal to the maximum number of people listening to it at any one time.”
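The lending-library idea described above is essentially a fixed pool of licenses that listeners check out and return. A minimal sketch of that mechanic, with all class and method names hypothetical (nothing here is from Villasenor's actual proposal beyond the access rule):

```python
# Toy model of a "short-term lending library" for songs: the artist
# sells a fixed number of copies, and access is granted only while a
# copy is free. A contributor "loses access" when someone else holds it.

class SongLicensePool:
    def __init__(self, licenses: int):
        self.licenses = licenses       # copies the artist has sold
        self.listeners = set()         # who currently holds a copy

    def check_out(self, listener: str) -> bool:
        """Grant access only if a copy is free (or already held)."""
        if listener in self.listeners:
            return True
        if len(self.listeners) < self.licenses:
            self.listeners.add(listener)
            return True
        return False                   # all copies are in use

    def check_in(self, listener: str) -> None:
        """Return a copy to the pool."""
        self.listeners.discard(listener)

pool = SongLicensePool(licenses=2)
assert pool.check_out("alice")
assert pool.check_out("bob")
assert not pool.check_out("carol")     # no free copy at this moment
pool.check_in("alice")                 # alice gives her copy back
assert pool.check_out("carol")         # now carol can listen
```

The invariant enforced here is the one in the quote: the number of simultaneous listeners never exceeds the number of copies sold.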

Popular Science’s “Nowhere to Hide” piece discusses the growing problems that hacked hardware could cause for security in the future. The article references Villasenor’s research, in which he stresses that attacks are only a matter of time: “the laws of statistics guarantee that there are people with the skills, access, and motivation to intentionally compromise a chip design.” This becomes an even bigger problem because so little is being done to prepare for such a scenario: “defensive strategies have not yet been fully developed, much less put into practice.”

In a blog post co-written for the Brookings Institution last week, Villasenor and Peer Stein discussed how the current rise of “retrenchment by global financial institutions may be undermining years of progress in providing the world’s poor with financial services.”

The problem of retrenchment comes from large fines against banks for failing to comply with international sanctions and anti-money laundering rules. Banks are doing what is known as “de-risking” where they restrict or terminate business with clients to avoid risk. 

This has led to a rise in banks closing remittance accounts and has affected civil society organizations. One NGO involved in helping women’s groups in the Middle East was denied a bank account to avoid the risk of funds indirectly ending up in Syria. 

In order to address this important problem, Villasenor suggested three pillars necessary for finding solutions going forward:

1. Public authorities need to provide more meaningful information on ML/TF risks to the financial industry, clarify their regulatory expectations, and adopt a genuinely risk-based approach in their supervisory and enforcement actions.

2. Financial institutions need to step up their understanding of the risks of their customer base, and direct internal control efforts accordingly. Risk management approaches should focus more on individual clients, and not write off entire sectors.

3. Countries with significant inflows of remittances need to improve the effectiveness of their regulatory regimes to combat ML/TF, and to provide more comfort to global financial institutions with banking relationships with clients in the developing world.

Youth Internet Safety: Risks, Responses, and Research Recommendations

Professor John Villasenor identifies gaps in existing knowledge concerning internet safety.

By Luskin Center for Innovation staff

As Internet use by children and teenagers increases, so do concerns about their online safety. Providing a safe environment requires an in-depth understanding of the types and prevalence of online risks young Internet users face, as well as the potential solutions for mitigating risks.

A team at the Luskin Center led by Public Policy professor John Villasenor conducted a review of existing research on online safety and then identified knowledge gaps and recommendations for specific areas of research to further the policy dialogue regarding online safety. These findings and recommendations are summarized in a paper released today by the Brookings Institution.

This paper is timely because, despite the significant amount of research on these risks, improving youth Internet safety remains a challenge. In part, this is because definitions of terms and categories relevant to online safety, such as “cyberbullying,” often vary, making the comparison of statistics and findings among sources imprecise. In addition, there are complex overlaps among different online safety subtopics.

The paper was authored by John Villasenor, graduate student Adina Farrukh and researcher Rebecca Sadwick. Read the full paper here.

 

Why Blaming Health Care Workers Who Get Ebola Is Wrong

Public policy professor John Villasenor argues that data and statistical methods can protect health workers.

Since the Centers for Disease Control and Prevention confirmed on Oct. 12 that a Texas health care worker tested positive for Ebola, media outlets have reported that health officials are now “scrambling” to find out how she contracted the disease despite wearing protective gear. According to the head of the CDC, the infection was caused by a “breach in protocol” that officials are working to identify.

Public Policy professor John Villasenor argues in an article published on Forbes.com that while working to identify weak links in protocol is important, blaming the health worker for breaching protocol ignores the fact that, statistically, having multiple contacts with Ebola patients will lead to an inevitable “limited number of transmissions to health workers.”

He writes:

“This is because if you do something once that has a very low probability of a very negative consequence, your risks of harm are low. But if you repeat that activity many times, the laws of probability—or more specifically, a formula called the “binomial distribution”—will eventually catch up with you.

For example, consider an activity that, each time you do it, has a 1% chance of exposing you to a highly dangerous chemical. If you do it once, you have a 1% chance of exposure. If you do it twice, your chances of at least one exposure are slightly under 2%. After 20 times, you have an 18% chance of at least one exposure, and after 69 times the exposure probability crosses above 50%. After 250 times, the odds of exposure are about 92%. And the exposure odds top 99% after about 460 times.

In other words, even if the probabilities are strongly stacked in your favor if you do the activity only once, with repetition the probabilities flip against you.”
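The arithmetic in the quote is the standard "at least one occurrence" calculation: with per-trial probability p, the chance of at least one exposure in n independent trials is 1 − (1 − p)^n. A short sketch reproducing the figures Villasenor cites:

```python
# Cumulative risk from repeating a low-probability activity:
# P(at least one exposure in n trials) = 1 - (1 - p)^n

def exposure_probability(p: float, n: int) -> float:
    """Probability of at least one exposure in n independent trials,
    each with per-trial exposure probability p."""
    return 1.0 - (1.0 - p) ** n

# With p = 1%, these reproduce the 18%, ~50%, 92%, and 99% figures
# from the quoted passage.
for n in (1, 2, 20, 69, 250, 460):
    print(n, round(100 * exposure_probability(0.01, n), 1))
```

This is the complement rule applied to the binomial distribution: rather than summing the probabilities of one, two, three, or more exposures, it subtracts the probability of zero exposures from 1.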

Villasenor ends his article by offering three recommendations for how to analyze this situation, including avoiding assumptions. You can read the full article here.

In another piece published in Slate on Oct. 15, Villasenor asserts that big data should be used as a “core component of the strategy” to protect health workers from Ebola exposure. Big data and statistical methods are vital in analyzing how Ebola can spread and shouldn’t be treated as an afterthought, he says.

Villasenor urges health officials to collect data about interactions between health workers and Ebola patients, and develop protocol for simulations so that health workers can practice using and removing protective gear.

He concludes:

“Big data and statistics alone aren’t going to keep health workers safe from Ebola. But they can certainly help. If we are going to ask health workers to repeatedly step into rooms with patients contagious with a virus that now appears to have a fatality rate of about 70 percent, we have the obligation to do everything possible to minimize the chances that they might be exposed. And today, we’re not doing nearly enough.”

John Villasenor Talks Bitcoin with Patt Morrison


Public Policy professor John Villasenor was recently interviewed by Patt Morrison for her Los Angeles Times column on the topic of cryptocurrencies. In the Q&A, Villasenor answers questions about the efficiencies of cryptocurrencies like bitcoin, how bitcoin works, and possible implications on future regulation.

When asked what may be the most difficult for people to grasp about bitcoin, Villasenor said:

“The hardest thing — not unreasonably — is that bitcoin is completely decentralized currency. There’s nobody in charge, no company, no government, no consortium — collectively everybody acts to run it. Most of us grew up in a world where government has oversight over currency like the dollar.” 

To read more from his Q&A, go here.

 

Preventing Technology-Facilitated Exploitation

The UCLA Luskin Center for Innovation hosted a panel discussion on May 6 titled, Public Policy for Innovation in the Digital Age: Preventing Technology-Facilitated Exploitation. Information technology experts, social workers, security administrators, and researchers attended this sixth and final event of the 2013-2014 Public Policy for Innovation in the Digital Age series. The event featured speakers from the Federal Bureau of Investigation, The International Centre for Missing & Exploited Children (ICMEC), Microsoft, and Slate.

Ernie Allen, President and CEO of ICMEC, discussed how technology has changed the face of exploitation. He noted that the Internet has made illegal images of children relatively easy to share on a global scale, multiplying what were once isolated incidents. In addition, sexual abuse of children can occur digitally and remotely, using web cameras and accomplices.

Cody Monk, Special Agent at the FBI, agreed, describing the Internet as a game changer that has removed the barrier between victim and offender. The victim and offender are now sometimes connected without the victim even knowing it, he added. The offender-to-offender industry of trading images and video online has also grown tremendously in the last 5-7 years. To address this rapidly evolving problem, Monk said the FBI is working with public and private partners to respond to current trends and to exchange information on how to innovate in preparation for future trends.

One of the important challenges in addressing these issues involves the role of anonymity. For political dissidents and journalists, tools that enable Internet anonymity can play a vital positive role in fostering change and in gathering and disseminating news. However, in the hands of sex offenders, traffickers and other criminals, these same tools can be used for criminal purposes.

Infiltration is one technique used to identify criminal use of the Internet, but Allen stated that leading law enforcement experts around the world believe it can be time consuming, expensive, and sometimes ineffective. Alternatively, many law enforcement investigators wait for offenders to make a mistake, though, by definition, this means the more careful offenders often escape prosecution.

Adrian Chandley, Principal Program Manager at Microsoft, explained that Microsoft’s PhotoDNA software captures information unique to an image in a fingerprint, which can then be used to find other copies of the same image even if they have been resized or saved in a different format. PhotoDNA is used by industry, law enforcement, and organizations such as ICMEC to identify copies of known illegal images of children, and can assist law enforcement investigators and leading child protection organizations in identifying child pornography victims. Fingerprints of child pornography images are distributed to Microsoft and other PhotoDNA users, who can then find and remove matching images from their systems. Chandley explained that one major problem for industry is that only fingerprints (or hashes) for images with identifiable victims are made available, while the much larger set of hashes for all known child abuse images is not. These hashes, too, should be used to prevent users from encountering these images.
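The hash-list workflow described above can be sketched in a few lines. Note this is a deliberately simplified stand-in: PhotoDNA computes a perceptual fingerprint that survives resizing and re-encoding, whereas the toy version below uses plain SHA-256 digests, which only match bit-identical files. The function and variable names are illustrative, not from any real system.

```python
# Simplified sketch of matching uploads against a distributed hash list.
# Real systems use perceptual fingerprints (e.g., PhotoDNA); SHA-256
# here only demonstrates the lookup mechanics, not the robustness.
import hashlib

def fingerprint(data: bytes) -> str:
    """Digest standing in for a perceptual image fingerprint."""
    return hashlib.sha256(data).hexdigest()

# Hash list distributed to participating services (toy data).
known_hashes = {fingerprint(b"known-bad-image-bytes")}

def scan(files: dict) -> list:
    """Return names of files whose fingerprints appear on the list."""
    return [name for name, data in files.items()
            if fingerprint(data) in known_hashes]

uploads = {"a.jpg": b"known-bad-image-bytes", "b.jpg": b"harmless"}
print(scan(uploads))   # flags only the file on the hash list
```

The design point Chandley raises maps directly onto `known_hashes`: the larger that distributed set, the more copies a service can find and remove.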

Amanda Hess, Staff Writer at Slate, shared a personal story of digital exploitation that exemplified another type of behavior raising complex legal questions. She explained that an anonymous person created a Twitter account dedicated to slandering and making violent threats against her. Hess had to decide whether to treat this situation as a criminal act, or to simply disregard it as offensive. She decided to call the local police, but they did not ultimately pursue the case. Part of the issue was blurry jurisdictional lines: the offender was unlikely to live in the same state as Hess, and police would have needed to subpoena Twitter to find out his identity. Hess suggested that perhaps another reason for the lack of action was uncertainty about what constitutes illegal online exploitation of women. Nearly everyone recognizes that the exploitation of children is wrong, she said, but sexist content and exploitation of women is part of some social norms, which presents an issue that goes beyond the online space.

Cody Monk explained that experiences like Hess’s, which happen far too often, underscore how critical it is that law enforcement at all levels share information on trends, policy, and technology in order to adequately confront technology-enabled threats in strategic and tactical ways.

Preventing exploitation is critically important, and Ernie Allen expressed his view that the biggest challenge to getting more attention and resources to address this issue preemptively is to educate and inform. Trafficking often goes unreported to police and is a problem that the world does not see. An additional challenge, he said, is to convince people to engage in discussions about combating these crimes. Often, corporations and policy makers do not want to think about these unpleasant and horrifying issues. But without addressing the lack of suitable legal repercussions, and without more public education about technology-facilitated exploitation, the problem will only continue to grow.

To learn more about strategies to encourage the growth of the digital economy while preventing the sexual exploitation of children and other criminal activity, see the new report by the Digital Economy Task Force (DETF), which was sponsored by ICMEC and Thomson Reuters and composed of leading experts from government, the private sector, academia, and think tanks. These experts included Task Force Co-chair Ernie Allen, John Villasenor of UCLA, and panelist Cody Monk of the FBI. The DETF worked to identify a regulatory framework that fosters the growth of the digital economy, including digital currencies and alternate payment systems, while addressing anonymizing technology and the growth of “deep web” marketplaces that allow illegal commerce.


Who is at Fault When a Driverless Car Gets in an Accident?

Autonomous vehicles are the talk of the town, especially since Google hosted its first-ever media event showcasing its “self-driving car” earlier this week. But while Google is demonstrating that the technology for autonomous cars exists, the question of liability remains.

Last year, several news outlets, including the San Diego Union-Tribune and the Wall Street Journal, asked whether liability issues could stymie consumer access to autonomous cars.

In his latest paper for the Brookings Institution Center for Technology Innovation published in April 2014, Public Policy professor John Villasenor says this shouldn’t be the case. He argues that existing product liability law is well equipped to adapt to new technology and handle most of the issues that could arise.

Villasenor explains his findings in The Atlantic saying:

“Thanks largely to the tremendous technological change that has occurred since the middle of the last century, products liability has been a dynamic, rapidly evolving area of law. Notably, when confronted with new, often complex, questions involving products liability, courts have generally gotten things right…

…while the specific fact patterns will vary, in products liability terms, manufacturers of autonomous vehicle technologies aren’t really so different from manufacturers in other areas. They have the same basic obligations to offer products that are safe and that work as described during the marketing and sales process, and they have the same set of legal exposures if they fail to do so.”

The Washington Post noted Villasenor’s paper in its own analysis, which concludes that autonomous cars can be a boon to safety rather than a minefield of liability risks.

The New York Times also quoted Villasenor in its article about driverless cars and various issues like law-breaking and liability.

Villasenor concludes his paper by offering guiding principles for legislation that should and should not be enacted. To read his full paper, go here.

 

New Framework Debuts for Preventing Technology-Facilitated Exploitation of Children

The Digital Economy Task Force, sponsored by Thomson Reuters and the International Centre for Missing & Exploited Children (ICMEC), released its report on the emerging digital economy, with recommendations for policy makers, financial institutions, law enforcement, and others to encourage its growth while preventing the sexual exploitation of children and other criminal activity. Task force leaders, including Vice Chair John Villasenor of the Brookings Institution and the UCLA Luskin Center for Innovation, released the report at the National Press Club on March 4 and then presented the findings to the U.S. Senate Committee on Homeland Security and Government Affairs on March 5.

The Digital Economy Task Force (DETF), which includes leading experts from government, the private sector, academia, and think tanks, was formed to help address a vitally important question: How can we foster the many benefits that today’s information technologies can offer, while simultaneously preventing those same technologies from being misused to exploit children?

Co-chaired by Ernie Allen, President/CEO of ICMEC, and Steve Rubley, Managing Director of the Government Segment of Thomson Reuters, the DETF worked to identify a regulatory framework that fosters the growth of the digital economy, including digital currencies and alternate payment systems, while addressing anonymizing technology and the growth of “deep web” marketplaces that allow illegal commerce, including money laundering, narcotics, weapons, stolen goods, human trafficking and sexual exploitation of children, and more.

“The digital economy and anonymizing technology hold great promise and societal value, from offering financial tools to the world’s unbanked, to protecting dissidents and journalists from unjust government reprisal,” said Rubley. “But these benefits are clouded by those who use the digital economy to commit illegal acts. While these are complicated issues, we believe that a regulatory framework can grow the digital economy – and confront those who seek to exploit it for illicit purposes.”

Recommendations from the report include: bolstering research into the intersection of the digital economy and illegal activities; increasing investment in law enforcement training and investigative techniques; enhancing cooperation between governmental agencies; the promotion of a national and global dialogue on policy; and more.

“The central challenge is Internet anonymity. There is an emerging ‘dark web’ that enables users to pay for their illegal transactions using digital currencies,” said Allen. “There is a difference between privacy and anonymity. We simply cannot create an environment in which traffickers and child exploiters can operate online with no risk of being identified unless they make a mistake.”

The DETF aims to educate the public and work collaboratively across stakeholder groups, including government agencies, law enforcement, corporations, academia, public and non-profit agencies, as well as key industry players. Task force members were selected from organizations including, but not limited to:

  • Bill and Melinda Gates Foundation
  • Bitcoin Foundation
  • The Brookings Institution
  • Luskin Center for Innovation at UCLA
  • Mercatus Center at George Mason University
  • The Tor Project, Inc.
  • United States Secret Service

The DETF launched in August 2013 and developed working groups to address the sweeping impact of these technologies from fostering financial inclusion to combating illicit activities. The focus areas for these groups included safeguarding human rights, regulation, inter-agency coordination and law enforcement.

DETF members John Villasenor, Ernie Allen, and other international experts will speak on the topic of “Preventing Technology-Facilitated Exploitation” at the next event in the UCLA Luskin Center’s Digital Tech series.

 

Q&A: John Villasenor, UCLA professor at the intersection of technology and policy

UCLA professor John Villasenor is an electrical engineer who teaches in the Henry Samueli School of Engineering and Applied Science and the Luskin School of Public Affairs. He is also a widely published writer on the intersection of technology and public policy, having written columns about drones, privacy and intellectual property, among other topics.

In an edited Q&A, UCLA Today’s Mike Fricano recently asked Villasenor how he became an expert in the public policy aspects of technology and why engineers should be part of that conversation.

To learn more, click here.