Finding Signal in the Noise: A Trust Crisis in Medical Science

(This article was created in collaboration with Claude AI)

Key Takeaways:

Trust Has Hit Rock Bottom. Healthcare companies saw an 11-point drop in trust in 2023, according to the Edelman Trust Barometer. Americans now trust only themselves and their doctors while rejecting institutional authority.

Science Lost Its Context. During the pandemic, experts often failed to communicate uncertainty, providing recommendations without clearly explaining their level of certainty.

The Psychedelics Case Study Reveals Communication Challenges. The FDA’s caution on MDMA therapy highlights how regulatory conservatism and imprecise scientific messaging complicate legitimate research.

Rebuilding Trust Requires Humility. Scientists must acknowledge uncertainty and avoid “follow the science” messaging that ignores social realities.


At DOC 2024’s Saturday morning session on misinformation, three experts confronted an uncomfortable truth: the biggest threat to medical innovation might be the collapse of trust in legitimate science itself.

The numbers paint a stark picture. Trust in healthcare companies plummeted 11 points in the past year, according to Lynn Hanessian, Edelman's then-Chief Strategist for Health. Americans have retreated into a defensive crouch, placing faith only in themselves and their personal physicians while rejecting broader institutional authority.

“We as individuals now have made a very internal personal decision to trust just ourselves and our doctors and people like me,” she explained.

This creates a paradox that threatens the entire innovation ecosystem. The United States leads the world in developing breakthrough medical technologies, yet Americans are simultaneously the most fearful that these same innovations will harm them. 

Consumers worry not just about new treatments themselves, but about privacy invasions and receiving unwanted medical information. When people distrust the system, Hanessian’s research shows they reject beneficial innovations entirely—refusing to buy, advocate for, or even consider treatments that could help their families.

Dr. James Madara, then-CEO of the American Medical Association, traced much of this skepticism to communication failures during the pandemic. Health officials provided recommendations without clearly explaining their level of certainty, leaving the public confused when guidance inevitably changed as scientists gained a deeper understanding of a novel virus. “We failed to give context to the observations that we were making about the pandemic, for example,” he said. Officials never clarified that early recommendations were based on limited experience with an unprecedented situation.

This communication breakdown had real-world consequences, as Dr. Gul Dölen discovered when the FDA rejected MDMA-assisted therapy for PTSD. Even her educated colleagues immediately assumed corruption rather than considering the complex factors behind regulatory caution. 

Dölen, a University of California, Berkeley professor who holds the Renee & U.S. Marine LCpl Bob Parsons Endowed Chair, argues that the solution begins with abandoning marketing-friendly but scientifically imprecise language. "When we start trying to sell our ideas about these mechanisms around how these drugs are working, and we use imprecise terminology because it's marketable," she said, "this is the kind of bullshit that undermines our credibility."

The experts found themselves grappling with fundamental questions about authority and expertise. One audience member challenged the entire framework, arguing that the real problem wasn’t misinformation but condescension — experts demanding trust while withholding the underlying data that would allow people to make informed decisions. 

This resonated with Dölen, who emphasized that science advances through open debate and willingness to be wrong, not through appeals to authority.

The pandemic revealed another blind spot: the tendency to privilege certain types of evidence over others. Madara described how public health officials followed “the science” while minimizing social science research showing the harms of prolonged school closures on children’s development.

As a model for better communication, the panelists pointed to weather forecasting — a field that successfully conveys uncertainty while helping people make daily decisions. Weather reports acknowledge limitations while providing useful guidance, creating public comfort with probabilistic thinking rather than false certainty.

University of California, Berkeley neuroscientist Jack Gallant, speaking from the audience, noted that science itself bears responsibility for the credibility crisis. Research standards have declined, with journals publishing studies that meet only minimal statistical thresholds rather than demonstrating meaningful real-world effects. Many papers in prestigious journals may reflect statistical noise rather than genuine discoveries, flooding the field with contradictory findings.

The session ended without clear solutions. What is clear is that this isn’t just a messaging problem — it’s a deeper crisis about the role of expertise in a society where people increasingly refuse to be told what to think, even when the consequences might harm them.
