
Bringing public values into technical systems

March 11, 2019
By: Kara Manke
A photo of Berkeley's South Hall
UC Berkeley joins a new university consortium dedicated to bringing public values into technical systems. (UC Berkeley photo by Steve McConnell)

From the news on our social media feeds to the GPS systems that get us where we need to go, our lives are increasingly being influenced by data and algorithms – and, as the proliferation of “fake news” can attest, turning our decision-making over to algorithms can be a risky business.

In an exclusive New York Times story today, Berkeley and 20 other universities announced they are forming a new consortium, the Public Interest Technology University Network (PIT-UN), to support not only the development of new technologies, but also the exploration of what policies and safeguards must be put in place to ensure that they work for the benefit of society.

Leaders from PIT-UN universities will meet regularly to trade ideas and best practices on how to foster programs that further research and education into the social impacts of data science and technology. The Ford Foundation and the Hewlett Foundation, in partnership with New America, helped to convene and form this network.

Berkeley News spoke with Deirdre Mulligan, associate professor in Berkeley’s School of Information and faculty director of the Berkeley Center for Law and Technology, about Berkeley’s long-standing commitment to supporting research and teaching that explores the social and political consequences of technology and develops strategies and methods to advance human rights and social welfare.

A headshot of Deirdre Mulligan
Deirdre Mulligan, associate professor in the School of Information at UC Berkeley

Berkeley News: How do you define public interest technology?

The public interest technology field is, by nature, interdisciplinary, and it includes social scientists, computer scientists, humanists and people from professional schools, such as Berkeley Law and the School of Information, who all are interested in research, education and service that aligns the design and use of technical systems with social and political values.

What are some examples of UC Berkeley research in this field?

Right now, I am looking at how lawyers and the legal system are adapting machine learning systems into their work practices. They are being used for things such as bail determinations or predictive policing, but they are also being used in the context of discovery, to make decisions about what documents are relevant for disclosure. We want to know, “Are those tools going to be designed in ways that are sensitive to ethical and professional logics?”

We also have folks who are working on things like media manipulation and trying to figure out what are the right policies, what are the right design choices and what are the right literacy skills, so that we can create a set of interventions that, for example, ensure we don’t lose future elections to Russian trolls.

Berkeley has a long history of supporting programs that fall under the umbrella of public interest technology. Could you describe some of them?

In 2001, Berkeley launched the very first legal clinic in the world to focus on public interest issues in technical systems: the Samuelson Law, Technology and Public Policy Clinic. At a research level, Berkeley faculty have been involved in important research, as well as policy debates, around electronic voting systems and their security, accountability and oversight.

In the past five years, we’ve seen an explosion of Berkeley programs that are even more closely focused on engaging students in technical fields to think about the social and political implications of their work and to give them tools to design for values, such as designing to support privacy, freedom of expression or fairness. This includes the Human Rights Investigation Lab, the first university-based initiative to engage students in real-world open source investigation of human rights abuses and potential war crimes using cutting-edge tools and methods for analyzing and verifying social media content. It also includes the Citizen Clinic, a project of the Center for Long Term Cybersecurity and the world’s first public interest cybersecurity clinic, which brings students from multiple disciplines together to build the capacity of politically vulnerable organizations and communities to defend themselves against online threats.

And then we have schools like the School of Information that take a very interdisciplinary and problem-centered approach to learning, where we have a core technical component, a core social component and a core legal and policy component designed to develop well-rounded professionals who have the knowledge and skills to think about the implications of technical systems for society. The Algorithmic Fairness and Opacity Working Group, the Center for Technology, Society and Policy, the Center for Human Compatible AI, and the Center for Effective Global Action support cutting-edge research to address pressing social problems. Our new Division of Data Science and Information is moving in that direction too: it is not just about the computational tools or the data, it is about critical analysis of their social and political impacts.

How will the new Public Interest Technology University Network support UC Berkeley’s ongoing efforts in this field?

Many years ago, Ford and other foundations invested in the creation of a field of public interest law, and a component of that was funding legal clinics within law schools, because they understood the importance of developing a pipeline to support the education and training of public interest lawyers. If universities didn’t invest in creating curriculum, research opportunities and apprenticeships, law students would have a difficult time envisioning careers as public interest lawyers and seeing themselves as part of a bigger social justice community.

The Public Interest Technology University Network is an effort to build a comparable field of public interest technology – it’s a response to the recognition that technical choices can have significant and long-lasting impacts on people’s rights and opportunities and shape our economic, social and political life. We need people who have technical expertise and the capacity to assess the ethical, legal, policy and social dimensions of technological change. The goal is to design technical systems in a way that actually aligns with our values. The investment in building out a network that links the disparate activities occurring across different universities is essential to establishing the field. It elevates these activities into a broader narrative and starts to explain to people why building an educational pipeline that produces people with this set of knowledge and skills and sensibilities is necessary for society.
