
Research Expertise and Interest

psychology, mental health, criminal justice, risk assessment, intervention

Research Description

Jennifer Skeem is a Professor of Public Policy and Social Welfare at the University of California, Berkeley. She is a psychologist who studies the intersection between mental health and risk of involvement in the criminal legal system. Her work provides empirical guidance on efforts to prevent violence, improve legal decision-making, advance forensic and correctional services, and achieve effective and equitable justice reform. Her most recent work focuses on the promise of serious videogames for preventing aggression and promoting positive development among young people in urban elementary schools. 

Skeem has authored over 150 articles and has edited books that include Applying Social Science to Reduce Violent Offending. She is a past president of the American Psychology-Law Society and a member of the MacArthur Research Network on Mandated Community Treatment. Skeem has delivered congressional briefings on her work and consults widely with local, state, and federal agencies on issues related to behavioral health, violence, risk assessment, juvenile justice, and community corrections.

In the News

Featured in the Media

Please note: The views and opinions expressed in these articles are those of the authors and do not necessarily reflect the official policy or positions of UC Berkeley.
July 17, 2020
Will Douglas Heaven
Location-based policing algorithms predict where and when crimes may happen, while others draw on data about individual people, and both unfairly target the Black community. Jennifer Skeem, professor of public policy at the University of California, Berkeley, and Christopher Lowenkamp, a social science analyst at the Administrative Office of the U.S. Courts in Washington, DC, examined three different options for removing bias from algorithms that had assessed the risk of recidivism for around 68,000 participants, half white and half Black. They found that the best balance between races was achieved when the algorithms took race explicitly into account (which existing tools are legally forbidden from doing) and assigned Black people a higher threshold than white people for being deemed high risk; a group-specific threshold rule of this kind is illustrated in the sketch after these items.
February 19, 2020
Sophie Bushwick
Although both can be mistaken and biased, algorithms tend to be more accurate than humans at predicting which defendants are more likely to be rearrested after release from prison, a new study by Berkeley and Stanford researchers has found. The study followed up on a 2018 study that found untrained humans did as well as a commonly used software program at forecasting recidivism. The new study repeated the first, with some changes in methodology. Stanford social scientist Sharad Goel, one of the co-authors [along with public policy professor Jennifer Skeem], said of the new study: "The first interesting thing we notice is that we could, in fact, replicate their experiment. ... But then we altered the experiment in various ways, and we extended it to several other data sets." And with those added tests, the algorithms were more accurate than people. For more on this, see our press release at Berkeley News.
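The threshold rule described in the first item above can be pictured as a simple decision rule: a risk score is compared against a cutoff that may differ by group. The sketch below is purely illustrative; the group labels, scores, and cutoff values are invented for the example and are not taken from Skeem and Lowenkamp's study or data.

```python
# Illustrative sketch of a group-specific risk threshold rule.
# All names, scores, and cutoffs are hypothetical; they are not drawn
# from Skeem and Lowenkamp's study or its data.

from dataclasses import dataclass


@dataclass
class Defendant:
    risk_score: float  # recidivism risk score scaled 0 to 1 (hypothetical)
    group: str         # group label used only to look up the cutoff


# Hypothetical cutoffs: a higher cutoff means a higher score is required
# before someone in that group is labeled "high risk".
THRESHOLDS = {"group_a": 0.55, "group_b": 0.70}


def is_high_risk(d: Defendant) -> bool:
    """Return True if the defendant's score exceeds their group's cutoff."""
    return d.risk_score > THRESHOLDS[d.group]


if __name__ == "__main__":
    sample = [Defendant(0.60, "group_a"), Defendant(0.60, "group_b")]
    for d in sample:
        print(d, "-> high risk:", is_high_risk(d))
```

In this toy example the same score (0.60) is labeled high risk under one group's cutoff but not the other's, which is the kind of effect the article describes when thresholds are allowed to differ by race.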