The U.S. public's trust in scientists reached a new high in 2019 despite the collision of science and politics witnessed by the country. This study examines the cross-decade shift in public trust in scientists by analyzing General Social Survey data (1978–2018) using interpretable machine learning algorithms. The results suggest a polarization of public trust, as political ideology made an increasingly important contribution to predicting trust over time. Compared with previous decades, many conservatives lost trust in scientists entirely between 2008 and 2018. Although the marginal importance of political ideology in contributing to trust was greater than that of party identification, it was secondary to that of education and race in 2018. We discuss the practical implications and lessons learned from using machine learning algorithms to examine public opinion trends.
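The abstract does not specify which interpretable machine learning algorithms were used or how the "marginal importance" of each predictor was computed. As a rough sketch of one common approach, the example below fits a random forest to synthetic stand-ins for the GSS predictors named above (political ideology, party identification, education, race) and ranks them by permutation importance. All variable names, codings, and data are hypothetical and purely illustrative; this is not the authors' pipeline.

```python
# Illustrative sketch only: synthetic data standing in for GSS variables.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical survey features (not real GSS data).
X = np.column_stack([
    rng.integers(1, 8, n),   # political ideology (1 = extremely liberal .. 7 = extremely conservative)
    rng.integers(0, 7, n),   # party identification
    rng.integers(0, 21, n),  # years of education
    rng.integers(0, 3, n),   # race, coded categorically
])
feature_names = ["ideology", "party_id", "education", "race"]

# Synthetic outcome: trust in scientists (1 = high confidence, 0 = otherwise),
# loosely tied to ideology and education purely for demonstration.
logits = -0.4 * (X[:, 0] - 4) + 0.15 * (X[:, 2] - 12)
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Permutation importance estimates each predictor's marginal contribution
# to held-out predictive accuracy.
result = permutation_importance(model, X_test, y_test, n_repeats=20, random_state=0)
for name, imp in sorted(zip(feature_names, result.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```

Running this prints the predictors ordered by their estimated contribution to out-of-sample accuracy; in the actual study, an analogous ranking would be compared across survey decades to assess polarization.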
Polarization of public trust in scientists between 1978 and 2018: Insights from a cross-decade comparison using interpretable machine learning
Nan Li, Yachao Qian

Politics and the Life Sciences
Vol. 41 • No. 1
Spring 2022
Keywords: interpretable machine learning, polarization, public opinion, trust