Sean Carroll's Mindscape: Science, Society, Philosophy, Culture, Arts, and Ideas

335 | Andrew Jaffe on Models, Probability, and the Universe

November 10, 2025

Key Takeaways

  • Scientific knowledge is fundamentally provisional and probabilistic, relying on continuously sifting and comparing candidate models rather than achieving absolute certainty.
  • The Bayesian interpretation of probability, where all probabilities are conditional on a model, provides the necessary framework to address the problem of induction and compare competing scientific theories. 
  • Both statistical mechanics and quantum mechanics demonstrate that probability is an essential, perhaps fundamental, tool for describing the world, whether due to the complexity of many-body systems or the intrinsic nature of quantum reality. 
  • The shift in physics towards Bayesian probabilities has helped move the discussion away from the role of consciousness in quantum mechanics towards a deeper understanding of what probabilities themselves mean. 
  • The debate between Bayesian and frequentist interpretations of probability remains relevant, particularly when considering untestable theories like the cosmological multiverse, forcing reliance on prior beliefs for adjudication. 
  • Cosmology connects directly to quantum mechanics, as the standard model for early universe density perturbations relies on quantum mechanics, leading to discussions about the cosmological measure problem and the implications of eternal inflation. 

Segments

Models vs. Certainty in Science
(00:00:41)
  • Key Takeaway: Scientific knowledge is provisional and probabilistic, requiring the use of models to fit data rather than achieving absolute certainty.
  • Summary: The historical search for certainty in knowledge, like geometric proofs, fails when applied to empirical science about the actual world. Scientists must propose many possibilities (models) and sift them against data, acknowledging that all empirical knowledge is provisional. Even children build causal maps of the world, which scientists refine using sophisticated probabilistic tools.
Defining and Using Models
(00:05:27)
  • Key Takeaway: A model is a story about the world, often mathematical for scientists, that helps navigate reality by spelling out specific, useful relationships between different things.
  • Summary: We cannot know anything about the world without a model, which can be a story in words or math, enabling navigation and interaction. The London tube map serves as an analogy: it is isomorphic to the territory in specific ways (connections) but geographically inaccurate. Newton’s laws were highly successful mathematical models for interaction that were later superseded by Einstein’s more complex models in specific circumstances.
LLMs and the Chinese Room
(00:08:22)
  • Key Takeaway: The debate over whether Large Language Models possess a world model hinges on whether their functional output (phenomenology) is sufficient, echoing John Searle’s Chinese Room argument.
  • Summary: The Turing test focuses on observable behavior, but LLMs’ internal workings are complex neural networks, making it unclear if they possess a world model isomorphic to reality. Searle’s Chinese Room suggests a lookup table can process input and output correctly without genuine understanding. If an LLM’s output is functionally indistinguishable from thought, the difference between a lookup table and a model becomes philosophically blurred.
Deduction, Induction, and Probability
(00:14:52)
  • Key Takeaway: Inductive scientific reasoning requires making a leap of probability because knowledge about the world can only ever be known probabilistically, unlike deductive mathematics.
  • Summary: Deduction moves from true premises to true conclusions, which is mathematically rigorous but uninformative about the world’s actual workings. Induction requires making a probabilistic leap, meaning scientific models correspond to the world only probabilistically, dependent on the background information used. The problem of induction, highlighted by David Hume, shows that observing past events does not guarantee future certainty.
Bayesianism vs. Frequentism in Science
(00:30:01)
  • Key Takeaway: Bayesian probability is the answer to the problem of induction because it allows for updating credence based on evidence, whereas frequentist error bars describe hypothetical repeated experiments.
  • Summary: Bayesianism allows for assigning probabilities to unique, one-off events (like an election outcome), which frequentism struggles with. Frequentist error bars describe the performance of a procedure over infinite repetitions, not directly the probability of the true value being within that range. Bayesian probability is conditional on a model, meaning two people with the same data but different prior models will yield different, yet internally consistent, results.
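The prior dependence described above can be sketched in a few lines. This toy coin-flip example (not from the episode; the Beta-prior setup and all numbers are illustrative) shows two analysts who see the same data but, starting from different priors, reach different yet internally consistent conclusions:

```python
# Toy example: two analysts observe the same coin-flip data but hold
# different prior beliefs about the probability p of heads, modeled as
# Beta(a, b) distributions. With a Beta prior and binomial data, the
# posterior is also Beta: Beta(a + heads, b + tails) (conjugacy).

def posterior_mean(a, b, heads, tails):
    """Posterior mean of p under a Beta(a, b) prior after the observed flips."""
    return (a + heads) / (a + b + heads + tails)

heads, tails = 7, 3  # shared data

skeptic = posterior_mean(50, 50, heads, tails)   # strong prior that p is near 0.5
agnostic = posterior_mean(1, 1, heads, tails)    # flat (uniform) prior

print(f"skeptic's posterior mean:  {skeptic:.3f}")   # 0.518
print(f"agnostic's posterior mean: {agnostic:.3f}")  # 0.667
```

Both results are correct Bayesian updates; they differ only because the probabilities are conditional on different prior models, which is exactly the point made in the segment.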
Probability in Cosmology Data Analysis
(00:34:16)
  • Key Takeaway: Analyzing complex, single-instance data sets like the Cosmic Microwave Background (CMB) strongly favors Bayesian methods for linking probabilistic quantum origins to observable patterns.
  • Summary: The CMB analysis requires modeling early-universe quantum fluctuations to predict the observed patterns (the power spectrum), a process best handled by Bayesian inference. The current Hubble tension (a discrepancy between CMB-derived H0 values around 67 km/s/Mpc and direct measurements around 72 km/s/Mpc) highlights how model dependence affects cosmological parameters. The success of Bayesian methods in cosmology stems from their ability to marginalize over unknown instrument parameters while updating beliefs about the universe’s evolution.
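The marginalization idea can be illustrated with a toy grid calculation (a hypothetical sketch, not the actual CMB pipeline; the signal-times-gain model and all numbers are made up): we infer a signal amplitude while integrating out an unknown instrument gain.

```python
import numpy as np

# Toy sketch: data d_i = g * s + noise, where s is the "cosmological"
# quantity of interest and g is an unknown instrument gain (nuisance
# parameter). We compute the posterior for s by marginalizing over g
# on a grid, assuming flat priors and Gaussian noise.

rng = np.random.default_rng(0)
true_s, true_g, noise = 2.0, 1.1, 0.3
data = true_g * true_s + noise * rng.standard_normal(100)

s_grid = np.linspace(0.5, 4.0, 200)
g_grid = np.linspace(0.8, 1.4, 100)

# Log-likelihood on the (s, g) grid under the Gaussian noise model
S, G = np.meshgrid(s_grid, g_grid, indexing="ij")
loglike = -0.5 * ((data[None, None, :] - (G * S)[..., None]) ** 2 / noise**2).sum(-1)

# Marginalize over the gain axis, then normalize the posterior for s
post_s = np.exp(loglike - loglike.max()).sum(axis=1)
ds = s_grid[1] - s_grid[0]
post_s /= post_s.sum() * ds

mean_s = (s_grid * post_s).sum() * ds
print(f"posterior mean of s (gain marginalized out): {mean_s:.2f}")
```

The gain and signal are degenerate (only their product is constrained), so the marginal posterior for s is broadened accordingly; this is the grid-based analogue of what cosmological samplers do over many nuisance parameters at once.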
Quantum Mechanics and Probability Interpretations
(00:58:13)
  • Key Takeaway: Quantum mechanics appears fundamentally probabilistic, leading to competing interpretations such as Many-Worlds or QBism, all of which yield the same calculable predictions.
  • Summary: Quantum mechanics seems intrinsically random, contrasting with the potentially deterministic nature of classical statistical mechanics. The Copenhagen interpretation involves wave function collapse upon measurement, a concept that historically involved consciousness but is now often treated purely mathematically. Despite deep ontological disagreements between Many-Worlds (where probabilities are inherent to the universal wave function) and QBism (where probabilities reflect agent knowledge), all major interpretations yield identical experimental predictions.
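The claim that all interpretations agree on predictions rests on the Born rule; here is a minimal sketch (standard textbook quantum mechanics, with made-up amplitudes) of computing measurement probabilities from a state vector, which every interpretation computes identically:

```python
import numpy as np

# Born rule: measurement probabilities are |amplitude|^2, regardless of
# whether one reads the wave function as a real branching structure
# (Many-Worlds) or as an agent's belief state (QBism).

state = np.array([1 + 1j, 1 - 1j, 0])  # unnormalized amplitudes (illustrative)
state = state / np.linalg.norm(state)  # normalize to unit length

probs = np.abs(state) ** 2             # probabilities: 0.5, 0.5, 0.0
print(probs)
```

The interpretations disagree about what `state` *is*, not about what this calculation returns; that is why experiments alone cannot adjudicate between them.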
Bayesianism and the Role of Consciousness
(01:05:50)
  • Key Takeaway: The adoption of Bayesian probabilities shifted focus from consciousness causing wave function collapse to understanding the nature of quantum probabilities themselves.
  • Summary: The physics community’s prior frequentist approach often centered the measurement problem on the role of a conscious observer causing wave function collapse. The subsequent emphasis on Bayesian probabilities reframed this, suggesting probabilities are simply tools for understanding, not necessarily fundamental physical entities. Probabilities are viewed as being ‘just in our minds’ but are necessary because humans are the agents performing physics and interacting with the world.
QBism and Many-Worlds
(01:07:34)
  • Key Takeaway: QBism is conjectured to be a coping strategy for physicists resistant to accepting the Many-Worlds interpretation, though shared acceptance of Bayesian probability bridges the gap.
  • Summary: The speaker conjectures that QBism serves as a temporary strategy for those unwilling to accept the Many-Worlds view. A debate with David Mermin, a proponent of QBism, revealed common ground: both sides accepted subjective, Bayesian probabilities regarding which world one inhabits. Both interpretations (QBism and Many-Worlds) currently lack testable predictions beyond standard quantum mechanics, making adjudication difficult.
Cosmology and Quantum Origins
(01:09:57)
  • Key Takeaway: The standard model for cosmic density perturbations relies on quantum mechanics in the early universe, linking fundamental physics to observable cosmology.
  • Summary: The analysis of fluctuations in the Cosmic Microwave Background (CMB) relies on quantum mechanics as the source of early universe density perturbations. This connection forces consideration of outré cosmological problems like the multiverse and the cosmological measure problem, which arise from theories like eternal inflation. The existence of a multiverse allows some physicists to adopt a frequentist approach by counting existing universes, whereas Bayesians can still operate without assuming an ensemble of universes.
Anthropic Principle and Fundamental Constants
(01:14:00)
  • Key Takeaway: Our existence provides data (anthropic information) about the universe’s constants, but determining if these constants could have been physically different requires a comprehensive underlying theory.
  • Summary: The anthropic principle dictates that our existence provides information about the universe’s specific physical constants, such as Planck’s constant and the speed of light. Accessing models more comprehensive than the Standard Model of particle physics might reveal whether these constants are fixed or if processes like inflation can generate universes with different parameters. String theory, for instance, suggests that multiple realizations of the theory, differing in fundamental constants, are possible.