
Disinformation Is Breaking Democracy. Berkeley Is Exploring Solutions.

November 30, 2023
By: Edward Lempinen

In fields from computer science and journalism to public policy and national security, scholars are working to advance online information integrity.

One way to inoculate communities against disinformation, Berkeley scholars say, is to give people a strong voice in governing. That can help to keep them empowered and grounded in reliable information — and it can build trust. Used with permission from Kate Sadowsky at UC Berkeley Possibility Lab

Do a quick review of the top news for any day in the past 10 years and you’ll likely find that disinformation — barely disguised and often overt — has been a constant, powerful driver of political and social conflict in the U.S. and worldwide.

Berkeley News is examining threats to U.S. democracy in a series drawing on the expertise of UC Berkeley scholars.

Consider a span of just a few weeks earlier this year: Fox News agreed to pay nearly $800 million to Dominion Voting Systems after it had falsely accused Dominion of aiding election fraud to swing the 2020 election to Democrat Joe Biden. A manipulated photo that seemed to show a bombing at the Pentagon emerged on Facebook and spread so fast that it sent shudders through the stock market. A study found that Twitter posts linked to the far-right QAnon conspiracy cult had nearly doubled. And fake photos of Pope Francis in a stylish white puffer jacket emerged on Reddit, then went viral.

Apparent photos of Pope Francis wearing a stylish white puffer coat became a sensation after they emerged on the Reddit social media site earlier this year. But the photos were fake, generated by an artist using an artificial intelligence program. Midjourney/Reddit

The pope in a puffer? Naniette H. Coleman, a Ph.D. student in sociology at UC Berkeley, admits she fell for it and shared it with friends. Which is remarkable, because Coleman has spent much of her academic career mentoring students and advancing projects focused on information integrity.

Disinformation can find "an in with all of us — a video, a visual coming from someone we think we can trust," Coleman said in a recent interview. "It's hard to tell the difference between real and fake right now. And that puts us all at risk. There's no one who's immune. We're all going to end up petting a rabid, frothing dog at some point."

Coleman is among an emerging network of Berkeley scholars — in fields from computer science and journalism to public policy and national security — who are exploring solutions that could, in time, reduce disinformation's corrosive impact.

Many of their efforts share a key objective: rebuilding trust among information consumers so that disinformation gets less traction. Among the leaders in this effort is Janet Napolitano, former U.S. secretary of homeland security, former UC president, and founder of Berkeley’s Center for Security in Politics.

"The basic effect of disinformation is to erode trust — trust in the validity and credibility of the of the media, trust in the validity and credibility of government and government leaders, trust in the validity and credibility of the political process," Napolitano said. "Disinformation goes right to the heart of that."

The scholars offer no grand initiative to blunt disinformation, no single path to rebuilding trust. Instead, they describe the importance of education: small but ambitious initiatives to increase understanding and connection and to reduce mistrust and polarization.

1. Help students at every level to build awareness and skills.

Amy E. Lerman. Richard Koci Hernandez

At a university, the most fundamental venues for education are the classroom and the lab.

"The big thing that universities can provide, that makes us valuable — we can teach people how to think critically, how to vet information for its source and its accuracy," said political scientist Amy E. Lerman, director of Berkeley’s Possibility Lab. "We need to really think about how we teach people to be consumers of information, because it isn't just knowing the information anymore that’s going to be required. It’s knowing how to know if the information is reliable."

Timothy Tangherlini, a professor in Scandinavian studies, takes a novel approach to folklore: He uses advanced computational analysis to study how stories circulate in a culture and connect people to ideology. Now he and his students are studying how misinformation and disinformation on the right-friendly social media site Parler led users to a consensus about 2020 voting fraud in advance of the Jan. 6, 2021, attack on the U.S. Capitol.

A dataset visualization developed by Timothy Tangherlini and his students shows how key narratives about fraud in the 2020 presidential election coalesced and converged on the right-wing social media site Parler in the weeks after the election. Courtesy of Timothy Tangherlini
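Tangherlini's actual pipeline relies on far more sophisticated natural language processing, but the underlying idea of narrative-network analysis, in which separate story fragments gradually come to share actors and claims until they merge into one consensus narrative, can be sketched in a few lines. Everything below, from the toy posts to the entity sets, is hypothetical and meant only as an illustration, not a reconstruction of his work:

```python
# Illustrative sketch only: a toy version of narrative-network analysis.
# The posts and entity lists are invented, not drawn from Parler data.
from collections import Counter
from itertools import combinations

import networkx as nx

# Each post is reduced to the actors and claims it mentions (hypothetical data).
posts = [
    {"dominion", "ballots", "fraud"},
    {"dominion", "fraud", "stop the steal"},
    {"ballots", "suitcases", "fraud"},
    {"stop the steal", "capitol", "certification"},
]

graph = nx.Graph()
for entities in posts:
    # Every pair of entities mentioned together strengthens an edge.
    for a, b in combinations(sorted(entities), 2):
        weight = graph.get_edge_data(a, b, {"weight": 0})["weight"]
        graph.add_edge(a, b, weight=weight + 1)

# As story fragments start sharing actors and claims, the graph converges
# toward a single connected component: one dominant narrative.
clusters = list(nx.connected_components(graph))
print(f"{len(clusters)} narrative cluster(s)")

hubs = Counter(dict(graph.degree(weight="weight")))
print("most connected story elements:", hubs.most_common(3))
```

On real data, the telling signal is how quickly those separate clusters collapse into one, which is roughly the convergence Tangherlini's visualization captures.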

Jonathan Stray is a former journalist and now a senior scientist at the Berkeley Center for Human-Compatible AI (CHAI). He has researched and written extensively on social media algorithms that steer users to content featuring outrage, conflict and extremism. He’s advocated for algorithms that instead promote dialogue, peacebuilding and democratic values, and now he’s teaching a course called "Designing Algorithmic Media."
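Stray's research and his course go much deeper, but the design choice he points to, namely what a feed's scoring function rewards, can be illustrated with a toy ranker. The post fields, weights and examples below are hypothetical, intended only to show how a ranking objective can blend signals beyond raw engagement:

```python
# Toy feed ranker, purely illustrative of the design question Stray raises:
# what the scoring function rewards is what the feed amplifies.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    predicted_clicks: float   # engagement signal
    outrage_score: float      # 0..1, e.g. from a hypothetical toxicity model
    bridging_score: float     # 0..1, appeal across political divides


def engagement_only(post: Post) -> float:
    # Rewards whatever draws clicks, even if it is outrage bait.
    return post.predicted_clicks


def blended(post: Post, w_bridge: float = 3.0, w_outrage: float = 3.0) -> float:
    # Hypothetical blended objective: still values engagement, but boosts
    # bridging content and penalizes outrage. The weights are arbitrary.
    return (post.predicted_clicks
            + w_bridge * post.bridging_score
            - w_outrage * post.outrage_score)


posts = [
    Post("They are destroying the country!", 9.0, 0.9, 0.1),
    Post("What both sides actually agree on", 6.0, 0.1, 0.8),
]

print([p.text for p in sorted(posts, key=engagement_only, reverse=True)])
print([p.text for p in sorted(posts, key=blended, reverse=True)])
```

Under the engagement-only score, the outrage post ranks first; under the blended objective, the bridging post overtakes it. Which signals belong in that objective, and at what weight, is the kind of question Stray's work raises.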

Hany Farid, a professor with a joint appointment at the School of Information (I School) and in the Department of Electrical Engineering and Computer Sciences, is a globally influential expert in how advanced technology can create "deepfake" images and other disinformation. He’s testified before the U.S. Congress and given White House presentations, but last spring he explored some of these issues at Berkeley with students in his introduction to computer science course.

The CITRIS Policy Lab and the Goldman School of Public Policy this fall launched the UC Berkeley Tech Policy Fellowship, with a cadre of nine accomplished mid-career policy leaders from public and private enterprise. Reducing disinformation is a key focus of the program.

2. Keep students focused on the real-world impacts of disinformation.

Naniette Coleman (left), a Berkeley Ph.D. student in sociology and leader of the Coleman Lab, and one of her students visited Wiki Education as part of a cooperative project. Working with her guidance, students in the lab research and write pages on privacy issues that are posted on Wikipedia. Jami/Wiki Ed via Wikimedia Commons

In her Berkeley lab, Coleman works with student staff and volunteers, many of them people of color and women, using the tools of computational social science to focus on real-world issues — including the effects of misinformation and disinformation.

She pointed to instances of violence inspired by false online narratives — including a May 2022 mass shooting in Buffalo, N.Y., in which a white, 18-year-old gunman murdered 10 Black people in a grocery market.

"This young man had read about 'replacement theory,' the conspiracy theory that people of color are here to replace white people," she explained. "He made horrific decisions, and he did incredible harm to a beautiful community, based on false information.

"So when I say it's a global pandemic, and it’s deadly, I don't think I'm being hyperbolic."

Payton Gendron (center) was sentenced to life in prison after pleading guilty in the 2022 mass shooting that left 10 Black people dead at a grocery store in Buffalo, N.Y. Now 20, Gendron was moved to commit the crime after reading online about the "great replacement theory," which claims a high-level conspiracy to replace white Americans with people of color and immigrants. Matt Rourke/Associated Press

Working in partnership with Wiki Education, students in the Coleman Lab research and write pages on privacy issues that are posted on Wikipedia. Some of the students presented their work this year at a conference at the University at Buffalo in New York. Coleman is also the founder and organizer of a summer institute in computational social science at Howard University, a historically Black university in Washington, D.C.

3. Update and strengthen the practices of journalism.

The pandemic of disinformation in the U.S. has many causes, but the decline of conventional journalism is a critical one. In many small towns and cities, newspapers and radio stations long essential for local democracy have faded or disappeared following the rise of the internet.

Those dynamics give the Berkeley School of Journalism a central mission in addressing disinformation, said the school’s dean, Geeta Anand, a Pulitzer Prize-winning reporter.

As dean of Berkeley Journalism, Geeta Anand has been a campus leader in developing classes and programs focused on limiting disinformation and encouraging information integrity. Christopher Michel for the UC Berkeley Graduate School of Journalism

For decades, traditional journalism served as "a culture that supported a quest for facts and truth, and an ethos of fairness," Anand explained. "When advertising disappeared from that ecosystem and went to social media, that whole infrastructure tanked, and in its place rose social media. … People began to get their information from unreliable sources that had no pretense of trying to write things or relay information that was accurate or fair."

To address the threat of disinformation directly, Berkeley Journalism has offered classes in recent semesters on regulation of internet platforms, journalistic practices that can guard against disinformation, and the challenges of political reporting in the current environment.

The school recently named 40 early-career journalists and deployed them to rural and small-town newsrooms across California. They’re the first class under the California Local News Fellowship, a three-year, $25 million, state-funded initiative that aims to strengthen democracy by strengthening local news coverage.

In all, 120 early-career journalists will get two-year fellowships.

"Democracy cannot survive if there aren't local journalists holding local school boards and zoning boards and politicians accountable, at the local level, and covering their campaign finance reports and their school bond issues," Anand said. "Local journalism is fundamental to a democracy."

4. Reimagine governing structures to reduce political incentives for disinformation.

The U.S. political system is geared for two parties: It’s binary, us vs. them — and vulnerable to polarization. When cultural stress and economic stress drive political stress, polarization can turn fierce and politics can devolve into uncompromising tribal warfare. Disinformation becomes a tempting weapon.

Berkeley political scientist Charlotte Hill argues for bigger U.S. congressional districts, each with representatives from different parties. That, she says, would encourage cooperation and reduce the temptation to use disinformation to get ahead. Sarah Deragon/Portraits to the People

In the view of political scientist Charlotte Hill, senior researcher for elections and voting policy at Berkeley’s Goldman School, that’s where the Republican Party is today, pulling the whole country into chaos.

Hill and Berkeley Ph.D. graduate Lee Drutman, now a senior fellow at the New America Foundation, founded Fix Our House, an organization to advocate for a new way of doing politics. Their ideas were featured last year in the New York Times.

Their preferred model is called proportional representation. New Zealand, Germany and other healthy democracies use the system, but the U.S. would need a new process for electing the 435 members of the U.S. House of Representatives.

Instead of electing one member from each district, Hill explained, Congress would be restructured to create fewer, but bigger, districts, each with multiple members. A party that won 60% of the votes in a district would hold 60% of the district’s seats. But even the losing party would hold 40% of the seats.

Within that district, most everyone would feel represented in Congress, and alternative parties might also be able to build influence. Hill believes the system would encourage more cooperation.

"You end up with coalitional politics where parties are competing in elections, but then they're having to form coalitions to put together a governing majority," she explained. "That’s an incentive for parties to not be completely demonizing one another and making every other party seem like they are liars and cheaters — because you might need to be governing with them in a few short months."

5. Develop national security tools to help battle disinformation from foreign adversaries.

Andrew Reddie’s project is a game — an advanced strategic wargame that tests how countries might protect alliances against hostile disinformation campaigns.

Andrew Reddie. UC Berkeley School of Information

The game, currently under development, is called (DIS)pute. It builds on the success of an earlier game called SIGNAL, which focused on nuclear deterrence. Like SIGNAL, (DIS)pute is being developed under the aegis of the Project on Nuclear Gaming, a collaboration involving Berkeley, Lawrence Livermore National Laboratory and Sandia National Laboratories.

Reddie is an associate research professor at the Goldman School; he is founder of the Berkeley Risk and Security Lab and serves at Berkeley’s Center for Security in Politics. He is also an influential expert in the development of war games that serve as profoundly serious national security exercises involving players at high levels of U.S. military and diplomatic strategy — and Berkeley undergraduates, too.

Reddie’s current wargaming work with Sandia focuses on a core strategic challenge made even more important by today's conflicts: How might the U.S. protect its alliances and inoculate them against the threats posed by disinformation?

“Particularly before Russia’s invasion of Ukraine,” he said, “one of the scenarios we worried about the most was a country like Russia trying to feed misinformation and disinformation into states like Latvia and Estonia, in order to drive a wedge in existing alliance relationships and drive pro-Russian sentiment.”

Picking up where game theory leaves off, (DIS)pute examines the conditions under which states might embark on disinformation campaigns, their effects, and strategies to build resilience. Initially the game will be played in a seminar setting, Reddie said, but eventually it might be played online or as a board game, with each new play adding to data on how such conflicts might unfold in real life.

6. Reach across divides for deep discussion — and deep listening — about the state of the union.

While Reddie is working to avert war, Napolitano is leading an initiative to heal divisions and build peace.

Before joining the Goldman School faculty, she was president of the University of California, governor of Arizona and secretary of the U.S. Department of Homeland Security under President Barack Obama. She has thought deeply about how our politics divide us, and how such extreme division threatens the security of the nation and its people.

Janet Napolitano, the founder and faculty director of the Center for Security in Politics at UC Berkeley, spoke with students from Northern Virginia Community College, American University and Berkeley at an event earlier this year in Washington, D.C. "Uncommon Table" convened a diverse group of students and others to share insights about tensions in U.S. democracy. Jeffrey Watts/American University

Last spring, Napolitano led a group of 20 Berkeley students to Washington, D.C., for a week-long immersion in the U.S. security apparatus. But the visit also featured an unusual, ambitious exercise in bridge-building.

The event was called Uncommon Table. It brought together students and leaders from Berkeley, American University and Northern Virginia Community College.

The exercise opened with a panel discussion on polarization and disinformation. Then the students, spanning the political spectrum, were assigned to mixed groups for dinner, and each table took up essential questions of democracy and the political health of the nation.

"The goal," Napolitano said before the event, "is to see whether in that conversation we can foster some common understanding and appreciation for questions like, What does being an American mean to you? What does a democracy need from you?"

She wrote about the results in The Hill, a top U.S. political news site, with co-organizer Amy K. Dacey, executive director of the Sine Institute of Policy and Politics at American University and former head of the Democratic National Committee.

"As we walked from group to group and heard the lively back and forth between the students reflecting both agreement and respectful disagreement," they wrote, "it was hard not to feel a renewed sense of hope and faith in the power of open communication and connection. …There was a spirit of optimism and hope that made us realize that what our nation needs is more of what we were seeing and hearing — more Uncommon Tables and more voices being brought together."

Napolitano and other organizers are planning new events and working to expand the project to other universities.

7. Use expertise from a variety of fields to build a stronger, healthier internet.

Many of Berkeley’s disinformation initiatives intersect in Our Better Web, a project formed last year. Anand provided the impetus, reaching out across diverse fields to top leaders: Napolitano and Farid, along with Erwin Chemerinsky, dean of Berkeley Law; and Brandie Nonnecke, director of the CITRIS Policy Lab and a faculty member at the Goldman School.

Our Better Web convenes high-level Berkeley leaders to help guide public policy to support web platforms that are healthy and safe for people, communities and U.S. democracy. Photos by Anandu Vinod/Unsplash and Thomas Koukas/Unsplash; illustration by Neil Freese, UC Berkeley

Disinformation and other problems associated with the internet and social media can’t be solved just by one discipline, Anand said. "I suggested to them that we get together to try and solve this problem by putting our collective expertise to the problem — I thought that's what Berkeley could offer to the world," she said.

Nonnecke, now serving as director, laid out the mission shortly after the project was launched.

"We’re witnessing a sharp rise in disinformation, extremism and harmful content online," she said. "In order to effectively address these challenges, we must support rigorous training and research that can lead to effective technology and policy strategies to support a trustworthy Internet."


In little more than a year, Our Better Web has produced research and analysis on disinformation and deepfakes, internet governance and the design of social media platforms. Nonnecke is exploring similar issues on TecHype, a new video and podcast series.

As part of a journalism class taught under the Our Better Web initiative, students took second place in a national competition for their radio documentary on Section 230, a law that protects internet platforms from liability for content on their sites. Critics say the law has spawned a pandemic of online disinformation.

8. Always look for ways to build trust — and trustworthy information.

"In order to move forward, we need to address trust and belonging in society — trust in each other and in our institutions," Nonnecke said. "And honestly, policymakers should be held more accountable for their purposeful creation of division in society as a means for them to get votes."

At the Possibility Lab, Lerman and her colleagues are piloting projects that bring democracy into local communities, into neighborhoods, down to the street level. On issues such as police reform, public health and affordable housing, the projects are structured to involve people who are usually far from political decision-making and to give them a chance to explore issues, to speak and be heard.

Disinformation isn’t an obvious theme. But Lerman suggested that if people feel empowered, if they have a substantive stake in political decisions, they’re more likely to trust the process — and less vulnerable to disinformation and other sorts of manipulation.

"We've seen declining trust — not just in government, but in all kinds of institutions that used to be our trusted messengers," she said. "Understanding the connection between information and trust is critical, especially in this current moment in history."