IV — The Cost
Chapter 13: The Dead End
The most expensive intellectual enterprise in the history of theoretical physics began with an accident. In 1968, Gabriele Veneziano was not looking for a theory of everything. He was at CERN, trying to understand hadron scattering -- the messy, poorly understood collisions of protons, neutrons, pions, and the other strongly interacting particles at energies where the strong nuclear force dominates. He discovered that the Euler beta function described certain properties of the scattering data. It was an elegant mathematical observation. It was not intended to explain the universe.
From that observation grew a programme that would, over the following four decades, consume the largest share of resources in theoretical physics. Thousands of researchers. The majority of theoretical physics faculty positions at the world's leading universities. Hundreds of millions of dollars in federal funding annually, flowing through the Department of Energy's Office of High Energy Physics and the National Science Foundation's theoretical physics programmes. The promise that justified this investment was the grandest in the history of science: a theory of everything. A single mathematical framework that would unify quantum mechanics and general relativity, explain all four fundamental forces, account for all particles, and in principle derive every fact about the physical universe from a handful of equations.
The outcome, after four decades, was zero confirmed experimental predictions.
Not "few." Not "disappointing." Zero.
That programme was string theory. Its rise, its capture of the field's institutional machinery, its specific failures, and its transformation from a promising line of research into a degenerating programme that structurally could not self-correct -- these are the subjects of this chapter. The analysis is institutional, not personal. The physicists who pursued string theory include some of the most talented minds of the twentieth and twenty-first centuries. The question is not whether they were brilliant -- they were -- but how a programme with no empirical confirmation came to dominate a field that defines itself by empirical confirmation, and what that dominance cost.
The answer connects directly to the fork documented in Chapter 3 and the institutional dynamics documented in Chapters 4 and 5. String theory did not capture the field by accident. It filled a specific vacuum. When the 1905 fork removed the physical medium from theoretical physics, the framework that replaced it eventually needed a unifying theory -- something to bridge the gap between quantum mechanics and general relativity that the ether, had it been retained, might have bridged naturally. String theory was, in a precise sense, the programme that existed because the ether programme could not. Its dominance was a downstream consequence of the 1905 decision, sustained by the same institutional mechanisms -- hiring, funding, prestige, career consequences -- that maintain the anti-ether orthodoxy to this day.
The two phenomena are not separate. They are the same phenomenon, viewed from different angles.
I. The Rise
The Accidental Origin
Veneziano's 1968 paper, "Construction of a crossing-symmetric, Regge-behaved amplitude for linearly rising trajectories," published in Nuovo Cimento, described hadron scattering using the Euler beta function. Within two years, Yoichiro Nambu, Holger Bech Nielsen, and Leonard Susskind independently recognised that Veneziano's formula described the dynamics of relativistic strings -- one-dimensional objects whose vibrational modes produced the scattering patterns Veneziano had observed. The original purpose was modest: to understand the strong nuclear force. No one involved was proposing a fundamental theory of nature.
The strong-force interpretation was short-lived. Quantum chromodynamics -- the gauge theory of quarks and gluons -- emerged in the early 1970s and rapidly proved to be the correct description of the strong force. String theory lost its original purpose. It should have died.
It did not die because Joel Scherk and John Schwarz noticed something remarkable. In 1974, they proposed reinterpreting string theory not as a theory of hadrons but as a theory of quantum gravity. The key observation: the spectrum of closed strings necessarily includes a massless spin-2 particle. The only known massless spin-2 particle is the graviton -- the hypothetical quantum of the gravitational field. If you took string theory seriously as a fundamental theory, gravity came for free.
The physics community did not take it seriously. Through the late 1970s, string theory was a backwater. Fewer than a dozen researchers worked on it worldwide. Scherk died in 1980 at the age of thirty-three. The programme appeared destined for obscurity.
The First Revolution
Everything changed in 1984. Michael Green and John Schwarz demonstrated anomaly cancellation for Type I superstring theory with the gauge group SO(32). Their paper, "Anomaly cancellations in supersymmetric D=10 gauge theory and superstring theory," published in Physics Letters B, showed that superstring theory could be quantum-mechanically consistent while incorporating gauge interactions -- the mathematical machinery of the Standard Model's force-carrying particles. The technical achievement was genuine: anomaly cancellation is a stringent requirement, and demonstrating it for a specific string theory was a significant advance.
The sociological effect was disproportionate. Within months, hundreds of physicists pivoted to string theory. Edward Witten, already among the most influential theorists in the world, declared it "the most important development in theoretical physics in the last twenty years." That endorsement, from that source, was itself a historical force. Graduate students, postdocs, and junior faculty across the world recalculated their career trajectories. The backwater became a flood.
The following year, 1985, Philip Candelas, Gary Horowitz, Andrew Strominger, and Witten showed that compactifying the six extra dimensions required by string theory on Calabi-Yau manifolds could yield four-dimensional physics resembling the Standard Model. The extra dimensions -- which would, for critics, remain the programme's most extravagant feature -- were no longer merely a mathematical requirement. They seemed to lead somewhere recognisable. The number of possible Calabi-Yau manifolds, however, was already known to be enormous. This was noted at the time. It was not yet understood as a fatal problem.
The Consecration
In 1990, Edward Witten received the Fields Medal -- the most prestigious award in mathematics. He remains the only physicist to have won it. The honour was awarded for his applications of physics to mathematics: topological quantum field theories, contributions to knot invariants, applications of Morse theory. The mathematical achievements were genuine and are not disputed. But the effect on the physics community was something beyond the recognition of mathematical accomplishment. It cemented Witten's singular authority across both disciplines. He became, for a generation of theoretical physicists, the arbiter of what was worth pursuing. His interest in a topic redirected dozens of careers. His indifference to a topic could consign it to irrelevance. The "Witten effect," as Smolin would later describe it, was the natural consequence of having one intellect of extraordinary power at the apex of a hierarchical field. But its consequences for the diversity of theoretical physics were profound.
M-Theory
At the "Strings '95" conference at the University of Southern California, Witten proposed that the five known consistent superstring theories -- Type I, Type IIA, Type IIB, Heterotic SO(32), and Heterotic $E_8 \times E_8$ -- were not separate theories at all. They were different limits of a single eleven-dimensional theory, which he called M-theory. The claim was electrifying. Five theories, previously seen as competing candidates for the fundamental description of nature, were revealed to be perspectives on a single underlying structure.
There was one difficulty. Witten did not define M-theory. He proposed its existence. He argued for it on the basis of various dualities connecting the five string theories. But he did not write down its equations, specify its fundamental degrees of freedom, or provide a constructive definition that would allow its properties to be derived. No one has done so in the three decades since. The "M" in M-theory was never definitively specified; Witten suggested it could stand for "magic," "mystery," or "membrane," according to taste. The "mystery" reading proved, in retrospect, the most accurate. M-theory remains, to this day, a conjecture about a theory that no one has been able to formulate.
The theoretical physics community nevertheless treated Witten's proposal as the "second superstring revolution." The five string theories were unified -- in principle if not in practice -- and the promise of a unique theory of everything seemed closer than ever. The failure to actually define the theory was treated as a technical challenge to be overcome, not as a fundamental objection.
AdS/CFT
Two years later, in 1997, Juan Maldacena published "The Large N Limit of Superconformal Field Theories and Supergravity." The paper proposed a duality -- the Anti-de Sitter/Conformal Field Theory correspondence -- between a gravitational theory in anti-de Sitter space and a conformal field theory living on its boundary. It became the most cited paper in the history of high-energy physics, accumulating over 23,000 citations by 2024. The achievement was genuine. AdS/CFT provided a concrete realisation of the holographic principle -- the idea, originating with 't Hooft and Susskind, that the information content of a region of space can be encoded on its boundary. It gave string theory its most powerful computational tool and opened connections to condensed matter physics and heavy-ion collisions that would prove independently fruitful.
The limitation was equally genuine, though less frequently emphasised. The correspondence applies to anti-de Sitter space -- a spacetime with negative cosmological constant. Our universe has a positive cosmological constant. It is not anti-de Sitter. It is approximately de Sitter. The extension of the correspondence to de Sitter space -- "dS/CFT" -- remains poorly understood and highly speculative. Maldacena himself has been careful not to overstate the correspondence's implications for the real world. The most powerful tool in string theory's arsenal describes a universe that is not ours.
II. Institutional Capture
The Hiring Data
The rise of string theory from 1984 onward was not merely intellectual. It was institutional. The programme did not simply attract researchers; it captured the machinery through which theoretical physics reproduces itself -- faculty hiring, graduate training, postdoctoral appointments, grant allocation, and prestige.
Lee Smolin, in The Trouble with Physics (2006), presents the most systematic data on this capture. He reports that of the tenured or tenure-track positions in theoretical particle physics at the top twenty US departments -- Harvard, Princeton, MIT, Stanford, Caltech, Berkeley, and their peers -- filled between approximately 1981 and 2005, the overwhelming majority went to string theorists. His count indicated that string theorists received roughly three to four times as many faculty positions in elite departments as all other approaches to quantum gravity combined. Among approximately thirty young theoretical physicists hired at top US departments during the period he surveyed, only a handful -- perhaps two or three -- worked on non-string approaches: loop quantum gravity, causal set theory, or other alternatives.
The exact numbers are difficult to pin down with precision. Smolin's methodology involved manual counting and judgement calls about what constitutes a "top" department and what counts as "string theory." But the qualitative picture -- overwhelming dominance of one programme over all others -- was not seriously disputed, even by string theory's defenders. The dominance was visible to anyone who looked at the faculty lists of elite departments in the 1990s and 2000s. String theorists did not merely have a plurality. They had a supermajority.
The Pipeline
The mechanism of capture was structural, requiring no coordination and no conspiracy. It was a self-reinforcing cycle operating through the ordinary machinery of academic reproduction.
String theory professors trained string theory graduate students. Those students, seeking postdoctoral positions and eventually faculty jobs, worked on string theory -- because their advisors worked on string theory, because the techniques they knew were string-theoretic, because the letters of recommendation that would advance their careers came from string theorists. The most successful among them became string theory professors, who sat on hiring committees, who evaluated job candidates through the lens of their own expertise, who hired more string theorists. Each stage of the cycle was individually rational. No hiring committee needed to conspire against alternatives. They simply hired what they knew, valued what they understood, and promoted the programme in which they had invested their careers.
The result was a monoculture at the top of the field. At the Institute for Advanced Study in Princeton -- the bellwether institution of theoretical physics, home to Einstein, to Oppenheimer, to a succession of the field's most consequential figures -- the permanent faculty in theoretical physics was dominated by string theorists. The IAS did not merely host string theory. It incarnated the programme's institutional authority.
The Funding Structure
Precise funding breakdowns between string theory and its alternatives are not publicly available in a form that permits clean comparison. The Department of Energy's Office of High Energy Physics and the NSF's Theoretical Physics programme -- the primary federal funders of theoretical high-energy physics in the United States -- categorise grants by broad programme area ("Theoretical Physics," "Mathematical Physics") rather than by specific approach. But the circumstantial logic is straightforward: since the majority of faculty at major research universities were string theorists, and since grants follow faculty, the majority of federal funding for theoretical high-energy physics flowed to string theory and string-adjacent research.
The alternatives survived on institutional margins. Loop quantum gravity, causal sets, and other non-string approaches to quantum gravity relied heavily on a small number of dedicated institutions, principally the Perimeter Institute in Canada -- founded in 2000 and funded primarily by Mike Lazaridis, the co-founder of BlackBerry -- and the Foundational Questions Institute (FQXi), founded in 2006 and funded primarily by the John Templeton Foundation. These institutions provided critical support, but their existence makes a pointed observation about the state of the field: the alternatives to string theory were sustained not by the scientific establishment but by a tech billionaire and a religiously affiliated philanthropy. The mainstream funding apparatus of physics -- the DOE, the NSF, the university system -- had effectively defaulted to string theory.
The Witten Effect
The term is informal but widely recognised among theoretical physicists. When Witten wrote a paper on a topic, that topic became hot overnight. When he lost interest, the topic often withered. Smolin describes the phenomenon explicitly in The Trouble with Physics. The mechanism was straightforward and, in its way, innocent: graduate students and postdocs seeking to build careers gravitated toward topics where jobs existed. Jobs existed where senior faculty had active research programmes and grant funding. Senior faculty followed the intellectual leaders of the field. And Witten was the intellectual leader.
The concentration of influence in a single individual was historically unprecedented for a living scientist. Witten held a permanent professorship at the Institute for Advanced Study, had won the Fields Medal, and combined mathematical power with physical intuition in a manner that inspired something approaching awe among his colleagues. His endorsement of a research direction was, for practical purposes, a necessary condition for that direction to be taken seriously in the theoretical physics community. This is Smolin's assessment, documented across pages 264-280 of The Trouble with Physics, and it was acknowledged informally across the community even by those who admired Witten -- which was virtually everyone.
The sociologist Harry Collins identified a "founder effect" in string theory: early practitioners trained students who trained more students, creating a self-reinforcing community with shared assumptions, shared techniques, and mutual citation practices. The Witten effect amplified this founder effect. One extraordinary mind, combined with the structural dynamics of academic hiring and funding, produced a field in which a single programme crowded out everything else -- not through malice, not through conspiracy, but through the ordinary operation of prestige, incentive, and institutional inertia.
Career Consequences for Alternatives
The obverse of string theory's dominance was the marginalisation of every alternative. Smolin documented this directly. Researchers working on loop quantum gravity, causal set theory, or emergent approaches to gravity struggled to find tenure-track positions at top departments. The pipeline that reproduced string theory also filtered out everything else. A graduate student who chose to work on loop quantum gravity under Carlo Rovelli at Marseille or Abhay Ashtekar at Penn State was making a career gamble that their string-theory-trained peers did not have to make. The letter-writers who mattered -- the senior figures at Harvard, Princeton, Stanford, the IAS -- were overwhelmingly string theorists, and they evaluated candidates by the standards of their programme.
This is the same mechanism documented in Chapter 5, applied to a different population. Chapter 5 examined the career destruction of individual dissidents: Bohm, Everett, Arp, Bell. Here the phenomenon is structural rather than individual. It was not necessary to target specific researchers for punishment. It was only necessary for the hiring and funding system to operate according to its normal logic: hire what you know, fund what you trust, promote what your community values. In a community dominated by string theorists, this normal logic produced systematic exclusion of alternatives as a side effect.
The comparison to the suppression of ether physics is direct. The paradigm maintains itself through hiring, funding, and prestige -- not through evidence. String theory's institutional dominance was not earned by experimental confirmation, because there was no experimental confirmation to earn. It was maintained by the same structural forces that maintain the anti-ether orthodoxy: a self-reproducing community of trained practitioners who evaluate all alternatives by the standards of the programme they were trained in, and who occupy all the positions of authority from which evaluations are made.
III. The Failures
Zero Confirmed Predictions
This is the central charge against string theory, and in four decades it has never been rebutted. String theory has not made a single prediction that was subsequently confirmed by experiment.
The statement requires precision. String theory predicts the existence of gravity -- the massless spin-2 particle in the closed string spectrum is identifiable as the graviton. But gravity was already known. A theory that "predicts" something discovered three centuries before the theory was formulated is offering a postdiction, not a prediction. String theory predicts supersymmetry. Supersymmetry has not been found. String theory predicts extra dimensions. No evidence for extra dimensions has been found at any scale. String theory, in its most specific early formulations, predicted a cosmological constant of exactly zero. The universe accelerates.
Peter Woit's formulation has become canonical: string theory is "not even wrong." The phrase, borrowed from Wolfgang Pauli's devastating dismissal of a colleague's work, captures the deeper problem. A theory that is wrong makes predictions that can be tested and falsified. String theory, in its current form, makes no predictions sufficiently specific to be tested. It fails to qualify as a scientific theory by the Popperian criterion not because its predictions have been falsified, but because it does not make predictions. The theory occupies a category more troubling than falsehood: it is unfalsifiable.
The Landscape
The problem crystallised in 2000, when Raphael Bousso and Joseph Polchinski published "Quantization of four-form fluxes and dynamical neutralization of the cosmological constant" in the Journal of High Energy Physics. They demonstrated that string theory admits an enormous number of metastable vacuum states -- distinct solutions corresponding to different possible sets of physical constants, particle content, and cosmological properties. The figure that became standard, from the counting of flux compactifications over the following few years: approximately $10^{500}$ vacua.
Later estimates, using different counting methods -- exploring flux vacua on complex Calabi-Yau manifolds with large Hodge numbers, as developed by Michael Douglas, Washington Taylor, and others -- raised the figure to potentially $10^{272{,}000}$. The number itself scarcely matters. Whether the string theory landscape contains $10^{500}$ or $10^{272{,}000}$ possible universes, the implication is the same: the theory is compatible with essentially any set of low-energy physical constants. A theory compatible with anything predicts nothing.
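The origin of these astronomical figures is straightforward combinatorics. As a schematic illustration only -- the flux count $J$ and the number of allowed values $k$ per flux below are placeholder figures, not numbers taken from any particular compactification -- suppose a compactification carries $J$ independent quantised fluxes, each able to take roughly $k$ discrete values. The number of distinct vacua then grows exponentially with the number of fluxes:

$$N_{\text{vacua}} \sim k^{J}, \qquad k \sim 10,\; J \sim 500 \;\Longrightarrow\; N_{\text{vacua}} \sim 10^{500}.$$

Because each choice of flux integers shifts the vacuum energy by a different amount, the exponentially many vacua sample the possible values of the cosmological constant in extremely fine steps -- which is precisely what makes the landscape so accommodating.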
Each vacuum in the landscape corresponds to a different universe with different physics. There is no known principle within string theory for selecting among these vacua -- no mechanism that picks out the specific set of constants, particles, and forces that characterise our universe. The landscape does not contain a signpost marked "you are here." It is a vast space of possibilities with no map and no compass.
As Woit has repeatedly emphasised: a theory that is compatible with $10^{500}$ possible universes cannot claim credit for describing the one we observe.
The Anthropic Retreat
Leonard Susskind's response to the landscape was audacious. In "The Anthropic Landscape of String Theory" (2003), and more fully in his 2005 book The Cosmic Landscape: String Theory and the Illusion of Intelligent Design, Susskind proposed that the landscape's vastness was not a bug but a feature. Combined with the mechanism of eternal inflation -- the cosmological process by which different regions of the universe undergo different inflationary histories -- the landscape implies that all vacua are physically realised somewhere in an infinite multiverse. We observe the physical constants we do not because string theory uniquely predicts them, but because we inhabit one of the (relatively rare) vacua compatible with the existence of observers. The anthropic principle selects our vacuum from the landscape.
This was, by any honest assessment, a retreat. The original promise of string theory was to derive the constants of nature -- to explain why the electron has the mass it has, why the cosmological constant has the value it has, why there are three generations of fermions. The anthropic landscape abandoned that promise. The constants are not derived. They are selected, after the fact, by the observation that we exist to measure them. The theory does not predict the constants; it accommodates them.
The reaction within the string theory community was divided. David Gross, a Nobel laureate and a committed string theorist, initially called the anthropic turn "giving up." He described the anthropic principle as a "virus" infecting the field. His language was not casual. "Giving up" is the precise characterisation: the anthropic landscape represents the abandonment of the traditional goal of theoretical physics -- to explain why the universe has the properties it has -- in favour of the assertion that no explanation is possible, because all properties are realised somewhere.
Gross later moderated his stance somewhat, but the division between the "swampland" camp (attempting to constrain the landscape) and the "multiverse" camp (embracing it) persists to this day. It represents a fundamental schism within string theory about whether the programme can still deliver on its founding promise, or whether that promise was always illusory.
Woit and Smolin, from outside the string theory community, articulated the implication with clarity: if any observation is compatible with the theory, the theory has zero predictive content. A theory with zero predictive content is not a scientific theory. It is a mathematical framework -- potentially elegant, potentially useful for other purposes -- but not a description of nature that can be tested against nature.
Supersymmetry Not Found
Supersymmetry was not merely a prediction of string theory. It was a structural requirement. Superstring theory -- the version of string theory that includes fermions and avoids tachyonic instabilities -- requires supersymmetry as a mathematical ingredient. If supersymmetry does not exist in nature, string theory as conventionally formulated cannot be correct.
The Large Hadron Collider was, among other things, a supersymmetry search machine. "Natural" supersymmetry -- the version that solves the hierarchy problem without reintroducing fine-tuning, as discussed in Chapter 12 -- predicted superpartner masses at or near the electroweak scale, accessible to the LHC. The collider's first full physics run began in 2010. The Higgs boson was discovered in 2012 at 125 GeV -- a triumph of the Standard Model. But no superpartners accompanied it.
The ATLAS and CMS experiments have searched for supersymmetric particles systematically across Run 1 and Run 2, and into Run 3. Current exclusion limits reach approximately 2-2.5 TeV for gluinos, approximately 1.5-2 TeV for first- and second-generation squarks, and comparable limits for other sparticles in various simplified models. Natural supersymmetry -- the version that was supposed to be there -- is in severe tension with experiment. String theorists have noted, correctly, that the theory does not strictly require superpartners at LHC-accessible energies. Supersymmetry could exist at scales far beyond the LHC's reach. But this is exactly the pattern of a degenerating programme: a prediction is made; the prediction fails; the parameters are adjusted to accommodate the failure; the theory is declared consistent with the new data. Supersymmetry at 200 GeV becomes supersymmetry at 2 TeV becomes supersymmetry at 20 TeV becomes supersymmetry at whatever energy the next generation of experiments cannot reach.
Extra Dimensions Not Found
String theory requires extra spatial dimensions -- six in most formulations, seven in M-theory. These dimensions are supposed to be compactified: curled up at scales too small to observe directly. Various proposals have placed the compactification scale at different energies, some accessible to the LHC (the large extra dimensions scenario of Arkani-Hamed, Dimopoulos, and Dvali, for instance, predicted signatures at TeV energies). No evidence for extra dimensions has been found at any scale, in any experiment. The LHC searches, tabletop gravity experiments probing sub-millimetre scales, and astrophysical observations have all returned null results. The extra dimensions may exist. But they have produced no observable consequence in any detector, at any energy, in the four decades since they were predicted.
The Cosmological Constant
String theory's relationship with the cosmological constant is a chronicle of failure accommodated by redefinition. In its most specific early forms, string theory predicted a cosmological constant of exactly zero. The mechanisms of supersymmetry breaking in string compactifications were expected to produce a vacuum energy that vanished precisely. This was a definite prediction, and it was falsified in 1998 when the supernova observations of Riess, Schmidt, Perlmutter, and their collaborators demonstrated the accelerating expansion of the universe -- requiring a small, positive cosmological constant.
The response was not to treat the falsification as a falsification. It was to produce, within a few years, the Bousso-Polchinski landscape construction, which showed that string theory could accommodate a small positive cosmological constant among its $10^{500}$ vacua. Susskind's anthropic argument then reframed the entire situation: the cosmological constant is small because it must be small for observers to exist, and string theory's landscape provides the ensemble of universes in which the selection occurs.
The sequence is instructive. String theory predicted $\Lambda = 0$. Experiment found $\Lambda > 0$. String theory was modified to accommodate $\Lambda > 0$. The modification required embracing $10^{500}$ vacua and the anthropic principle. At no point did the programme anticipate the experimental result. At every point, theory scrambled to catch up with observation. This is the precise signature of what Lakatos called a degenerating research programme.
Mathematical Self-Consistency -- Unproven
A claim frequently made on behalf of string theory is that it is the only known mathematically self-consistent theory of quantum gravity. This claim is, at best, misleading. String theory has not been proven to be mathematically self-consistent. There is no rigorous non-perturbative definition of the theory. The perturbative expansion -- the genus expansion of string worldsheets -- is known to be divergent, like QED's perturbative expansion, and no one has demonstrated that it is Borel summable or otherwise resummable into a finite answer. M-theory, which is supposed to be the fundamental formulation underlying all five string theories, has never been defined. It is known only through its various limits -- the five string theories and eleven-dimensional supergravity. The fundamental theory itself remains a conjecture.
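The character of the divergence can be indicated schematically. A string amplitude is organised as a sum over worldsheet topologies, weighted by powers of the string coupling $g_s$ fixed by the genus $g$; the standard large-order estimate is that the genus-$g$ contributions grow factorially, so the series cannot converge for any nonzero coupling (the form below is a sketch of that standard estimate, not a calculation):

$$\mathcal{A}(g_s) \sim \sum_{g=0}^{\infty} g_s^{\,2g-2}\,\mathcal{A}_g, \qquad \mathcal{A}_g \sim (2g)! \ \text{ at large } g.$$

A divergent series can still be meaningful if it can be resummed -- which is why the absence of a demonstrated resummation, Borel or otherwise, is the substantive point.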
Smolin puts the matter directly: "There is not yet a single complete formulation of string theory or M-theory. We do not even know the fundamental principles of the theory."
A theory whose equations have not been written down, whose fundamental principles are unknown, whose mathematical self-consistency is unproven, and whose non-perturbative definition does not exist is not a theory in the ordinary scientific sense. It is a collection of perturbative approximations, dualities, and conjectures arranged around a gap where the theory ought to be.
The Swampland Programme
The most recent chapter in string theory's evolution is the swampland programme, led by Cumrun Vafa and collaborators. Beginning with earlier work around 2005 and gaining significant momentum from approximately 2018, the programme attempts to identify conditions that effective field theories must satisfy to be consistently embeddable in a theory of quantum gravity. The swampland conjectures -- dozens of them, proposed by various groups -- aim to constrain the landscape, separating the small number of vacua that can arise from string theory (the "landscape") from the vastly larger set that cannot (the "swampland").
The most consequential conjecture -- the de Sitter swampland conjecture -- would, if true, rule out stable de Sitter vacua entirely. Since our universe appears to have a positive cosmological constant (a de Sitter-like spacetime), this conjecture creates a direct tension with observation. If stable de Sitter vacua are impossible in string theory, then string theory is either wrong or the dark energy driving the accelerating expansion is not a cosmological constant but a dynamical field (quintessence or similar). The internal debates are fierce. Different swampland conjectures sometimes conflict with one another. The field has not converged on a consistent set.
The programme has generated hundreds of papers and enormous activity. It has not produced a single confirmed prediction. It remains, as of this writing, entirely conjectural -- a programme of conjectures about conjectures, built on a theory that has never been defined, aiming to constrain a landscape whose very existence is disputed by some members of the community that created it.
IV. The Defence -- Steel-Manned
A chapter that merely catalogues string theory's failures would be incomplete and, more importantly, unconvincing. The strongest case against a programme is one that first acknowledges its genuine achievements and then demonstrates that those achievements do not save it. String theory has genuine achievements. They deserve honest presentation.
Mathematical Fertility
This is arguably the strongest argument for string theory's value. The programme has generated an extraordinary amount of new mathematics. Mirror symmetry -- the discovery that certain pairs of Calabi-Yau manifolds, though geometrically distinct, give rise to the same physics -- opened new directions in algebraic geometry and has produced results of lasting mathematical significance. Topological string theory contributed new invariants. Witten's work on knot theory and topological quantum field theories, which earned the Fields Medal, had its roots in string-theoretic thinking. The connections between string theory and pure mathematics -- modular forms, algebraic geometry, representation theory -- are deep, surprising, and real.
These achievements stand independent of whether string theory describes the physical world. They are mathematical contributions of permanent value. The honest response is that this is true and does not matter for the question at hand. Mathematical fertility is not the same as physical truth. Ptolemaic astronomy -- the geocentric model with its elaborate system of epicycles, deferents, equants, and eccentrics -- generated sophisticated mathematics. The calculations required to predict planetary positions in the Ptolemaic system were formidable, and the techniques developed to carry them out advanced computational methods for centuries. The mathematics was real. The astronomy was wrong. The Earth orbits the Sun.
The history of science contains numerous examples of incorrect theories that produced valuable mathematics. The caloric theory of heat, though wrong about the nature of thermal phenomena, stimulated the development of thermodynamics. The phlogiston theory, though wrong about combustion, organised chemical observations in ways that advanced the field. A research programme's mathematical byproducts do not validate its physical claims.
AdS/CFT
The Anti-de Sitter/Conformal Field Theory correspondence is string theory's most powerful result. It provides a non-perturbative definition of certain quantum gravity theories (in anti-de Sitter space), offers a concrete realisation of the holographic principle, and has been productively applied beyond its original context. The gauge/gravity duality that AdS/CFT exemplifies has been applied to the quark-gluon plasma produced in heavy-ion collisions and to condensed matter systems exhibiting strong coupling, including models of strange metals and high-temperature superconductors. These applications have produced genuine physical insights.
The limitation is fundamental and has been acknowledged by Maldacena himself: AdS/CFT applies to anti-de Sitter space, which has a negative cosmological constant. Our universe has a positive cosmological constant. It is not anti-de Sitter. The extension of the correspondence to de Sitter space remains poorly understood. The most powerful tool in the programme's arsenal provides exact results about a spacetime that is not the spacetime we inhabit.
This is not a minor technical point. It is a conceptual chasm. Anti-de Sitter space and de Sitter space have fundamentally different causal structures, different asymptotic behaviours, different symmetry groups. Results proved rigorously in one do not automatically transfer to the other. The claim that AdS/CFT demonstrates string theory's physical relevance requires bridging this chasm. The bridge has not been built.
Toolkit Development
String theory has contributed techniques that are used throughout theoretical physics, independent of whether the underlying theory is correct. The gauge/gravity duality has been applied to quark-gluon plasma physics. String-inspired methods have led to powerful new approaches to perturbative calculations in quantum chromodynamics -- the "amplituhedron" and related developments by Nima Arkani-Hamed and collaborators, though these are not strictly string theory, have conceptual roots in string-theoretic thinking. The programme has deepened understanding of black hole thermodynamics, particularly the Strominger-Vafa microscopic derivation of the Bekenstein-Hawking entropy for certain extremal black holes.
These contributions are real. And they are also independent of string theory's status as a fundamental physical theory. Techniques can be useful even when the framework that inspired them is wrong. The Ptolemaic system's mathematical apparatus for tracking planetary positions was useful to navigators for centuries after Copernicus. Toolkit development is a legitimate contribution, but it is not the contribution that string theory promised. The programme promised a theory of everything -- a unique description of nature at its most fundamental level. Offering useful computational techniques is not the same as delivering on that promise.
Non-Empirical Theory Assessment
The most philosophically sophisticated defence of string theory comes from Richard Dawid, whose String Theory and the Scientific Method (Cambridge University Press, 2013) argues that the traditional Popperian requirement of falsifiability is too narrow and that science sometimes legitimately proceeds through what he calls "non-empirical theory confirmation." Dawid identifies three arguments.
The first is the "no alternatives" argument: string theory, Dawid claims, is the only known consistent theory of quantum gravity, and the absence of viable alternatives itself constitutes evidence for string theory's correctness. The second is the "unexpected explanatory interconnections" argument: string theory has turned out to connect areas of physics and mathematics that were previously unrelated, and these connections are unlikely to be accidental. The third is the "meta-inductive" argument: theories in physics that have the structural properties string theory has -- mathematical elegance, internal consistency, unexpected unifying power -- have historically turned out to be correct.
Dawid's arguments deserve engagement rather than dismissal, because they articulate what many string theorists believe but rarely state explicitly. The response, however, is devastating on each count.
The "no alternatives" argument is factually false. Loop quantum gravity exists. It is a mathematically rigorous programme with a well-defined quantisation of geometry, a spectrum of area and volume operators, and a community of one hundred to two hundred active researchers. Causal set theory exists. Asymptotic safety exists. The ether framework, with its derivation of the Einstein equation from medium dynamics and its dissolution of the quantum gravity problem entirely, exists. The "no alternatives" claim survives only within a community that has defined "alternative" so narrowly that only approaches with the same structural features as string theory qualify -- which is to say, the claim survives only by circular reasoning.
The "unexpected explanatory interconnections" argument proves too much. Ptolemaic astronomy also exhibited unexpected connections between celestial observations and sophisticated mathematical structures. The interconnections were real. The cosmology was wrong. Internal mathematical richness does not guarantee external physical truth.
The "meta-inductive" argument is an appeal to track record: theories with properties like string theory's have historically turned out to be correct. But this requires specifying which properties. If the relevant property is "mathematical elegance," the historical record is mixed at best -- many elegant theories have been wrong, and many correct theories were initially considered ugly. If the relevant property is "lack of experimental confirmation after forty years," the historical precedent is not encouraging.
George Ellis and Joe Silk responded to Dawid's programme directly in their 2014 Nature article "Scientific Method: Defend the Integrity of Physics." They warned that relaxing empirical standards would undermine science itself: "the issue boils down to clarifying one question: what potential observational or experimental evidence is there that would persuade you that the theory is wrong and lead you to abandon it? If there is none, it is not a scientific theory." The "no alternatives" argument, Ellis and Silk noted, is not merely wrong as a matter of fact (alternatives exist) but wrong as a matter of principle: the absence of alternatives does not confirm a theory. It may simply reflect a failure of imagination -- or a failure of the institutional system to support alternatives, which is precisely what has occurred.
V. The Breakthrough Prize
In 2019, the Special Breakthrough Prize in Fundamental Physics -- three million dollars, split equally -- was awarded to Sergio Ferrara, Daniel Freedman, and Peter van Nieuwenhuizen for the discovery of supergravity. Supergravity, the locally supersymmetric extension of general relativity, was formulated in 1976. As of the award, forty-three years later, supersymmetry had not been experimentally confirmed. The LHC had spent nearly a decade pushing superpartner mass limits higher and higher. No superpartner, no gravitino, no supersymmetric particle of any kind had been observed.
Three million dollars for a theory with no empirical support after more than four decades.
The reaction illuminated the fault lines within the physics community. Supporters argued that supergravity was a profound theoretical achievement that had deeply influenced all subsequent work in high-energy theory, including string theory, which requires local supersymmetry as a structural ingredient. The mathematical contributions were real, they argued, and the prize recognised intellectual achievement independent of experimental confirmation.
Critics saw something different. Woit blogged about it as further evidence that high-energy theoretical physics had decoupled from the traditional scientific requirement of experimental confirmation. Sabine Hossenfelder, whose 2018 book Lost in Math: How Beauty Leads Physics Astray had argued that the field's reliance on mathematical beauty as a guide had led it into a cul-de-sac, saw the supergravity prize as confirming her thesis: the field now rewarded mathematical aesthetics rather than empirical success.
The symbolism extended beyond the specific award. The Breakthrough Prizes in Fundamental Physics, funded by Yuri Milner and other technology billionaires, had become the field's most lucrative honours -- dwarfing the Nobel Prize's purse. Their selection criteria were never explicitly defined in terms of experimental confirmation. But the supergravity award made the implicit standard explicit: in twenty-first-century theoretical physics, a sufficiently beautiful theory could command millions of dollars in prize money without producing a single testable prediction. The incentive structure of the field was now aligned not with empirical discovery but with mathematical achievement.
This is not an incidental point. Incentive structures shape behaviour. If the highest rewards in the field accrue to mathematical elegance rather than experimental confirmation, rational career-maximising behaviour leads away from testability and toward beauty. The Breakthrough Prize did not create this dynamic -- it was a symptom of a dynamic already entrenched -- but it made the dynamic visible in a way that the quieter machinery of grants, hiring, and tenure does not.
VI. The Verdict
Lakatos Applied
Imre Lakatos's methodology of scientific research programmes provides the most rigorous framework for assessing string theory's status. His central distinction is between progressive and degenerating programmes.
A progressive programme is one in which theoretical growth anticipates empirical growth -- where the theory predicts novel facts and some of those predictions are subsequently confirmed. Lakatos wrote: "A research programme is said to be progressing as long as its theoretical growth anticipates its empirical growth, that is, as long as it keeps predicting novel facts with some success."
A degenerating programme is one in which theoretical growth lags behind empirical growth -- where modifications to the theory are made only to accommodate known anomalies, where the modifications are ad hoc, where theory scrambles to catch up with experiment rather than leading it. Lakatos wrote: "A research programme is stagnating if its theoretical growth lags behind its empirical growth, that is, if it gives only post-hoc explanations of either chance discoveries or of facts anticipated by, and discovered in, a rival programme."
The asymmetry is critical. In a progressive programme, theory leads experiment. In a degenerating programme, experiment leads theory.
Applied to string theory, the assessment is unambiguous.
Novel predictions confirmed: zero. In the fifty-eight years since Veneziano's amplitude and the forty-two years since the first superstring revolution, string theory has not anticipated a single empirical fact and seen that anticipation confirmed. This is not a matter of dispute. Even proponents acknowledge it. The programme has generated enormous theoretical content -- thousands of papers, hundreds of new mathematical results -- but not one confirmed prediction about the physical world.
Ad hoc modifications: systematic. Supersymmetry was predicted at the electroweak scale. The LHC did not find it. The scale was pushed higher. The cosmological constant was predicted to be zero. The universe accelerates. The landscape was invoked to accommodate a nonzero value. Each failure of prediction was met not with reassessment of the programme but with modification of the protective belt -- precisely Lakatos's characterisation of degeneration.
Post-hoc accommodation: pervasive. When dark energy was discovered in 1998, string theorists rapidly produced landscape-based "explanations." When the Higgs boson was found at 125 GeV in 2012, the value was declared "consistent with" string theory -- but it had not been predicted. The programme has become so flexible that it can accommodate any observation after the fact, which means it can anticipate no observation before the fact.
The landscape as unfalsifiability. A theory compatible with $10^{500}$ possible sets of physical constants cannot be distinguished empirically from any rival theory -- or from no theory at all. If any observation is consistent with the theory, then no observation can falsify it, and Lakatos's criterion of excess empirical content is impossible to satisfy. The landscape does not merely make string theory unfalsifiable in practice; it makes it unfalsifiable in principle.
By Lakatos's formal criteria, string theory meets every condition for a degenerating research programme. Theory does not lead experiment; zero novel facts have been predicted and confirmed. Modifications are exclusively ad hoc, accommodating anomalies post hoc. The programme has survived for decades not through empirical success but through sociological and institutional factors -- funding, prestige, hiring, and the self-reinforcing pipeline of academic reproduction.
Lakatos himself issued a warning applicable to precisely this situation: "The direction of science is determined largely by human creative imagination and not by the universe of facts which surrounds us. Creative imagination is likely to find corroborating novel evidence even for the most 'absurd' programme, if the search is allowed." The implication, drawn by Lakatos, is that if the search is not allowed -- if a programme is suppressed before its novel predictions can be tested -- this constitutes a violation of rational scientific methodology. The institutional capture of theoretical physics by string theory did not merely promote one programme. It suppressed the conditions under which alternatives could be tested.
The Comparison
The ether framework, described in the preceding chapters of this book, presents a sharp contrast when assessed by the same Lakatosian criteria.
The monograph derives twenty-eight theorems, twenty-four of them principal results, from a unified set of axioms concerning a physical medium. These derivations encompass the MOND phenomenology (Theorem 4.1), the Lorentz-invariant vacuum energy spectrum (Theorem 4.2), the Einstein equation of general relativity (Theorem 3.5), the Schrodinger equation (Theorem 7.1), and the connection between the Bohmian guidance velocity and the Nelson stochastic current velocity (Proposition 7.2). Each of these derives a known physical fact from the axioms of the ether framework, without postulating it independently. This constitutes theoretical unification -- explaining previously separate phenomena under a single framework -- which Lakatos values as a form of excess empirical content when it generates testable connections between domains.
More critically, the ether framework generates novel, falsifiable predictions. Theorem 8.8 predicts that the thermal degradation of Bell correlations follows an algebraic law with exponent 2, rather than the exponential decoherence predicted by standard decoherence theory. This is a specific, quantitative prediction that differs from the standard framework's prediction and that can be tested with current superconducting circuit technology. By Lakatos's strict criteria, a programme that generates specific novel predictions -- predictions that differ from those of the dominant programme and are testable with existing technology -- is theoretically progressive regardless of whether the predictions have yet been confirmed.
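Stated schematically -- and only schematically, since the precise variables, prefactors, and regime of validity belong to the monograph's Theorem 8.8 and to standard decoherence theory respectively, and are not reproduced here -- the two functional forms being distinguished are an inverse-square (algebraic) law and an exponential law in the relevant control parameter $x$:

$$\text{algebraic, exponent 2:}\quad C(x) \propto x^{-2}, \qquad\qquad \text{exponential:}\quad C(x) \propto e^{-x/x_{0}}.$$

The experimental discrimination is between these shapes: on a log-log plot the first is a straight line of slope $-2$, while the second curves downward ever more steeply.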
The contrast is stark. String theory: forty years, zero confirmed predictions, ad hoc accommodations of every failure, an unfalsifiable landscape. The ether framework: twenty-four derivations of known facts from unified axioms, plus novel falsifiable predictions. By the standards of the discipline's own philosophy of science, the rational allocation of resources between these programmes is not a close call.
The Most Expensive Degenerating Programme in History
The total investment in string theory -- measured in careers, funding, institutional infrastructure, and the opportunity cost of four decades of theoretical physics directed away from alternatives -- is incalculable in precise terms. But the order of magnitude is clear. Thousands of physicists devoted their careers to the programme over four decades. Each was funded by grants, postdoctoral salaries, and institutional overhead measured in hundreds of thousands of dollars per year. The number of person-years is in the tens of thousands. The financial cost is in the billions of dollars, when calculated across all countries, institutions, and funding streams.
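A purely illustrative order-of-magnitude check -- the researcher count and the fully loaded cost per person-year below are assumptions chosen only to exhibit the scale, not figures drawn from any budget -- is consistent with those statements:

$$1{,}000\ \text{researchers} \times 40\ \text{years} = 40{,}000\ \text{person-years}, \qquad 40{,}000 \times \$200{,}000 \approx \$8\ \text{billion}.$$

Varying either assumption by a factor of a few in any direction leaves the conclusion intact: tens of thousands of person-years, billions of dollars.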
The return on this investment, measured in its own stated currency -- predictions about nature -- is zero.
String theory is, by this measure, the most expensive degenerating research programme in the history of science. Not the most expensive failed experiment -- experimental failures, such as the null results of dark matter searches, at least produce data. Not the most expensive incorrect theory -- incorrect theories, if they make predictions, can be falsified and the knowledge gained from their failure has value. String theory is a programme that consumed the majority of theoretical physics resources for four decades and produced neither confirmed predictions nor definitive falsification. It exists in a category of its own: a programme too flexible to be wrong and too vague to be right, too dominant to be challenged and too entrenched to be reformed.
VII. The Structural Connection
String theory's dominance was not a random accident in the history of science. It was the logical consequence of a decision made a century earlier.
When the 1905 fork removed the physical medium from theoretical physics, the resulting framework -- quantum field theory on a Minkowski background, general relativity as a theory of dynamical spacetime geometry -- acquired a structural gap. The two pillars of modern physics were formally incompatible. Quantum fields require a fixed background; general relativity makes the background dynamical. Something had to bridge the gap.
If the ether had been retained -- if the physical medium had continued to be developed as the foundation of both quantum and gravitational phenomena -- the gap might never have opened. The ether framework, as the monograph demonstrates, derives both quantum mechanics and gravity from the same medium. The Schrodinger equation arises from stochastic diffusion through the medium (Theorem 7.1). The Einstein equation arises from the medium's acoustic dynamics (Theorem 3.5). There is no incompatibility because both phenomena are aspects of the same underlying physics. There is no "quantum gravity problem" because gravity is emergent from a medium whose quantum behaviour is already described.
Without the ether, the gap opened. It became, over the course of the twentieth century, the central problem of theoretical physics: how to reconcile quantum mechanics and gravity. String theory was, above all else, an attempt to close that gap. Its promise was the promise of a unified framework -- a single theory encompassing all forces and all particles. That promise was compelling precisely because the existing framework could not provide it. The vacuum left by the ether's removal demanded something to fill it.
String theory filled the vacuum. Not with a physical medium -- the taboo on the ether was too strong for that -- but with a mathematical structure of extraordinary complexity. Strings vibrating in ten or eleven dimensions, compactified on Calabi-Yau manifolds, generating the Standard Model's particle content as harmonics of a higher-dimensional instrument. The structure was beautiful. It was intellectually magnificent. It captured the imaginations of the finest theoretical minds of a generation. And it did not describe reality.
The institutional mechanisms that sustained string theory's dominance were the same mechanisms that sustained the anti-ether orthodoxy. The hiring pipeline that reproduced string theorists was the same pipeline that excluded ether-adjacent research. The funding streams that supported string theory were the same streams that dried up when a proposal invoked a physical medium. The prestige hierarchy that placed string theorists at the top of theoretical physics was the same hierarchy that placed ether physics outside the boundaries of acceptable discourse. The two suppressions -- of alternatives to string theory, and of the ether itself -- were not separate phenomena operating independently. They were the same phenomenon: a physics community locked into the 1905 fork, reproducing its commitments through institutional machinery, unable to explore the path not taken because the machinery had been built to prevent exploration of that path.
The irony is precise. String theory was the field's response to a problem created by the ether's removal. It was the most ambitious and expensive attempt to solve a problem that, in the ether framework, does not exist. Forty years of institutional capture, billions of dollars of investment, thousands of careers consumed -- all directed at a "problem" whose existence depends on the assumption that there is no physical medium. Restore the medium, and the quantum gravity problem dissolves. Restore the medium, and the need for string theory disappears. The programme that dominated theoretical physics for four decades was the downstream consequence of a philosophical decision that, the ether framework argues, was wrong.
VIII. Transition
The cost to theoretical physics is now documented across two chapters. Chapter 12 established the five unsolved problems: dark matter, the vacuum catastrophe, quantum gravity, the measurement problem, the hierarchy problem. Five questions spanning forty to one hundred years, consuming billions of dollars, producing zero resolutions within the standard framework. This chapter has established the programme that was supposed to resolve them -- or at least resolve the central one, quantum gravity -- and has shown that it consumed the field's resources for four decades while producing nothing testable.
Five unsolved problems. One dominant programme. Zero confirmed predictions. Billions of dollars. Tens of thousands of careers. A self-reinforcing institutional machinery that reproduced itself through hiring, funding, and prestige while systematically excluding alternatives. A philosophical decision made in 1905 that removed the physical element whose absence created the problems, sustained by the institutional mechanisms that prevented its restoration.
This is the cost to physics.
The cost in human lives is different. It is measured not in unsolved equations or wasted grants but in the technologies that were never developed -- the energy technologies, the propulsion technologies, the medical technologies that a physics built on the ether might have produced. The technologies that, had the 1905 fork gone differently, might have transformed human civilisation.
That cost is the subject of the next chapter.