Arthur Caplan’s important new EMBO Reports essay, “How stupid has science been?,” serves as a strong reminder that for decades, much of science treated public communication as a career liability – the so-called Sagan Effect, named for the famed astronomer whom many colleagues viewed with disdain for explaining science in simple, relatable terms.
That cultural snobbery mattered. It left many citizens unprepared to evaluate claims and ceded ground to the ill-informed, to propagandists, and to those seeking to manipulate others. Snobbery created a vacuum, but if we focus only on scientists’ failures, we’ll miss the larger structural forces now driving mistrust and politicization – forces that must be addressed directly if we want science (and our health and global competitiveness) to recover.
Start with the partisan shift in trust. Pew Research Center finds that confidence in scientists remains higher among Democrats than Republicans, and that the gap grew during and after the pandemic. The pattern shows that party identity shapes whether evidence is trusted – regardless of the facts or how skillfully scientists explain them.
It’s worth recalling that this wasn’t always partisan. America’s main science agencies were created through mid-century consensus, not culture-war reactions. The CDC began in 1946 as a malaria-control effort that became the foundation of the nation’s public health system; the NSF was established in 1950, signed into law by President Truman to support “research driven by curiosity and discovery”; and the NIH traces its history to the 1887 Hygienic Laboratory.
Layer on active political interventions in the federal science apparatus. Recently, the White House fired CDC Director Susan Monarez weeks after she took office, prompting senior resignations and raising alarms about political control of public health decisions. The administration also proposed cutting NIH’s budget by roughly 40 percent (about $18 billion). Even if Congress moderates these proposals, they signal that federal science has become a partisan battleground.
Performance and integrity are vital for building trust. Americans remember serious ethical and governance failures – from the inhumanity of the Tuskegee syphilis study to the retracted and heavily debunked Wakefield paper linking vaccines to autism – because they shattered the expectation that science self-corrects quickly and transparently.
Finally, our news and information ecosystem supercharges falsehoods. A study of millions of tweets published in Science showed that false news spreads “farther, faster, deeper” than truth – mainly because people share novelty and outrage, and platforms amplify it. That’s a systemic problem; clearer language from scientists helps, but it can’t, on its own, beat virality.
Sure, gifted public communicators like Neil deGrasse Tyson, Brian Cox, Michio Kaku, Hank Green, and Jessica Knurick, along with training hubs like Stony Brook University’s Alan Alda Center, have entered the ring. Helpful, but not nearly sufficient. These efforts need to be multiplied a thousand, a million times over, and reach into every corner of the country.
So, yes: some snobbery helped create a vacuum. But now, polarization, policy shocks, long-term under-investment, governance failures, and algorithmic virality cause most of the damage. It’s a monumental problem, and we cannot wait for Congress to take meaningful action. Each of us has a role to play:
1. Scientists: make communication an integral, even required, part of the job, not just a side hobby. In a 2006 Nature Biotechnology article, I urged scientists and doctors to make science more understandable and accessible to the public. It’s not about dumbing things down; it’s about keeping them simple and relevant. If your research is funded by the public, explaining and justifying it is a form of public service. Include a clear, plain-language summary on every paper and preprint; hold regular public Q&A sessions with schools, libraries, and faith groups; and include communication achievements in promotion and tenure dossiers.
2. Media: stop presenting “both sides” equally; focus on the strength of evidence. False equivalence turns established science into a debate show. Use proportional balance: give more coverage to claims backed by strong evidence and minimal attention to fringe assertions unless the story is about misinformation itself. As I’ve argued elsewhere in “False Equivalencies: The Danger of Treating All Information Equally,” giving equal time to unequal sides misinforms by design.
3. Schools: prioritize teaching how to think, not just what to memorize. We don’t need every student to become a scientist, but every citizen should be able to recognize evidence, uncertainty, and tradeoffs. Make the scientific method a regular part of education from middle school onward, pairing it with statistics, probabilistic reasoning, and media and digital literacy. Balance STEM with the humanities so students can understand ethics, history, and policy contexts.
4. Politicians: be the local leaders science needs. Create a standing science advisory group in your office composed of accomplished professionals from nearby universities and health systems. Hold regular evidence briefings on issues constituents face (e.g., climate change, mental health, vaccine recommendations, opioids, and air, water, and soil safety). When guidance shifts, announce it publicly and explain why. Leadership isn’t about having all the answers; it’s about showing your process.
The current politicization is (hopefully) temporary, not inescapable. We can reverse it, but only if we address all the causes – and invest in fixing them – not just the easiest ones or those that offend the fewest political interests. We must be courageous and determined before we become sicker, poorer, and even more vulnerable.