
Algorithms, Privacy, and the Future of Tech Regulation in California

A recent event co-hosted by SIEPR offered insights on the policy intersection of regulation, innovation and engagement.

What is the best approach to regulating a potentially harmful cutting-edge technology like AI while still encouraging innovation?

"We need to think about regulation at the right time in just the right amount," says Jeremy Weinstein, Stanford professor of political science and co-author of the recent book "System Error: Where Big Tech Went Wrong and How We Can Reboot." "We need to understand what regulatory models will get us to that Goldilocks-type outcome and engage more stakeholders in the process."

Weinstein, a senior fellow at the Stanford Institute for Economic Policy Research (SIEPR), shared his thoughts during a virtual conversation on "Algorithms, Privacy, and the Future of Tech Regulation in California." Joining Weinstein at the Jan. 18 event, which was co-hosted by SIEPR, the Stanford Institute for Human-Centered Artificial Intelligence, and California 100, were Jennifer Urban, board chair of the California Privacy Protection Agency and a clinical professor at UC Berkeley Law, and Fu, a California 100 commissioner and venture partner. The panel discussion, moderated by California 100's executive director, covered technology regulation in California and beyond, examining harmful beliefs about regulation and low consumer trust in technology.

Setting the Stage for Broader Regulation

Algorithms have proliferated as decision-making engines in domains from smart cities to bail setting. "But the quality of the data matters," Fu says. Problematic data could lead to racial, gender, or other biases and serious social harms.
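To make that point concrete, here is a toy sketch, not anything presented at the panel: a naive risk scorer fit to skewed historical records simply reproduces the skew in its decisions. The groups, labels, and numbers are all hypothetical.

```python
import random

random.seed(0)

# Hypothetical historical records: (group, prior re-arrest label).
# Group "b" was measured and policed more heavily, inflating its positive labels.
history = [("a", random.random() < 0.20) for _ in range(1000)] + \
          [("b", random.random() < 0.40) for _ in range(1000)]

# Naive "model": score each group by its historical base rate.
rates = {g: sum(y for g2, y in history if g2 == g) / 1000 for g in ("a", "b")}

# Decisions keyed to those rates systematically disadvantage group "b",
# even if the gap reflects biased measurement rather than behavior.
for group, rate in sorted(rates.items()):
    print(f"group {group}: estimated risk {rate:.2f} -> "
          f"{'detain' if rate > 0.30 else 'release'}")
```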

Often, algorithms are optimized for just one end, such as engagement in the case of social media platforms. But focusing on only one goal can lead to harmful side effects, such as the spread of misinformation.
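A minimal sketch of that failure mode, with made-up scores and no real platform's ranking code: sorting purely by predicted engagement puts the least accurate item on top, while blending in a second objective changes what gets promoted.

```python
# Hypothetical feed items with made-up engagement and accuracy scores.
posts = [
    {"id": "sober-report", "engagement": 0.30, "accuracy": 0.95},
    {"id": "hot-take",     "engagement": 0.70, "accuracy": 0.60},
    {"id": "misinfo",      "engagement": 0.90, "accuracy": 0.05},
]

# Optimizing for engagement alone promotes the misinformation item.
by_engagement = sorted(posts, key=lambda p: p["engagement"], reverse=True)

# Weighting in accuracy as a second objective demotes it.
by_blend = sorted(posts,
                  key=lambda p: 0.3 * p["engagement"] + 0.7 * p["accuracy"],
                  reverse=True)

print([p["id"] for p in by_engagement])  # ['misinfo', 'hot-take', 'sober-report']
print([p["id"] for p in by_blend])       # ['sober-report', 'hot-take', 'misinfo']
```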

The California Privacy Protection Agency (CPPA), created through Proposition 24 in 2020 as the first dedicated privacy agency in the U.S., is working on rules to regulate algorithms and other technologies through data. "We're attending to how consumers understand and make decisions about algorithm-based processes," says CPPA chair Urban. The in-the-works rules would govern consumer rights related to opting out of automated decision-making and obtaining information about the logic behind such decisions, among other areas.
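As a rough illustration of what honoring such an opt-out might look like in application code, consider the hedged sketch below. The Sec-GPC header is the real Global Privacy Control browser signal recognized under California privacy law for sale and sharing opt-outs; the handler, preference store, and human-review fallback are hypothetical, not anything the CPPA has specified.

```python
def route_to_human_review() -> str:
    # Hypothetical fallback: the request is queued for a human decision.
    return "queued for human review"

def run_automated_decision() -> str:
    # Hypothetical automated scoring pipeline.
    return "automated decision applied"

def handle_application(headers: dict, prefs: dict) -> str:
    """Check for an opt-out before running automated decision-making."""
    opted_out = headers.get("Sec-GPC") == "1" or prefs.get("opt_out_adm", False)
    return route_to_human_review() if opted_out else run_automated_decision()

# Example: a browser sending the Global Privacy Control signal.
print(handle_application({"Sec-GPC": "1"}, {}))        # queued for human review
print(handle_application({}, {"opt_out_adm": False}))  # automated decision applied
```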

Kill the Regulate-Versus-Innovate Construct

Regulation always underlies markets, Weinstein says. It's why we don't get sick drinking milk, fall ill from a headache medicine, or live in unsafe housing.

However, "we have to do away with binary notions like regulation versus innovation," Weinstein adds. "It's a false narrative that effective functioning of an innovative economy depends on there being zero regulation."

Urban says that well-informed regulation can benefit businesses and consumers: "Regulation aims to provide guardrails, allowing a robust market to develop and businesses to flourish while reflecting the needs of consumers. Regulators need to understand the business models and whether their actions would be 'breaking' something in the industry."

The speakers agreed that companies must do a better job of balancing their own interests with those of the broader public. That is, as regulators work to catch up with technology, businesses should work to cultivate clearer professional ethics around responsible AI and other areas.

Creating Trust with Control

Moving forward, companies must give people more discretion over how their personal data is collected and used.

"There's a lack of trust with regard to companies and the government handling people's personal data," Urban says. "People don't feel they have a real choice."

The CPPA is trying to create more control for citizens, "but that requires allowing people to have access to companies' information about them so they can make that choice," Urban says.
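A minimal sketch of the access half of that equation, built on an entirely hypothetical data store: exporting what a company holds about a person so they can make an informed choice.

```python
import json

# Hypothetical in-memory store standing in for real systems of record.
RECORDS = {
    "user-123": {
        "profile": {"zip": "94305"},
        "inferences": ["frequent traveler"],
        "shared_with": ["ad-partner-a"],
    }
}

def export_personal_data(user_id: str) -> str:
    """Return everything held about user_id as a portable JSON document."""
    return json.dumps(RECORDS.get(user_id, {}), indent=2)

print(export_personal_data("user-123"))
```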

California can be a test lab for how to build a future that balances the interests of corporations and citizens, Weinstein adds, but that future won't come from the state's ballot system, which is too often influenced by a small number of wealthy players. Instead, it should come from companies and government engaging diverse stakeholders in key decisions, and from better educating the people making decisions about their data. "Even if people don't know the technology, they can voice their values and concerns," Urban says.

And technologists need to own the problems arising from these tools and "not just hide from the threat of regulation," Weinstein says. He points to Snapchat's recent move into greater content moderation.

In the end, "our technological future is the responsibility not of CEOs or engineers, but of our democracy," Weinstein concludes. "People have been passive about technology's impact on society. It's time to exercise our democratic muscles more fully."

A version of this was originally published Jan. 31 by the Stanford Institute for Human-Centered Artificial Intelligence.
