The Question
The standard account of the nuclear age treats the existence and continuing salience of fission and fusion weapons as a settled physical and historical fact whose strategic implications are the only thing worth discussing. The settled-fact framing forecloses a different reading the documentary and physical record arguably permits — that the nuclear arsenal is, at minimum, a theater object of the crisis-object type whose deterrent function is a managed performance, and at the strong end, that several specific elements of the standard nuclear-history record contain anomalies the standard account is structurally unequipped to address.
The strong skeptic position — that no fission or fusion device has ever been detonated, that all imagery is fabrication, that the entire nuclear establishment is a multi-decade global hoax — is hard to defend against the convergent evidence streams: independent measurement chains in multiple non-aligned national programs, the radiochemical signatures of atmospheric testing in deep ice cores from the era, the seismic records of underground tests, the actual physical sites in Nevada and Kazakhstan and French Polynesia and Lop Nur and the Pacific atolls, and the testimony of personnel from programs that had no operational reason to participate in an American fraud. The strong position requires too many independent actors to coordinate a deception across Cold War lines for too long without leakage. It is unlikely to be correct.
What is much harder to dismiss is the weaker and more interesting set of claims that constitute the actually defensible structural reading: that the specific yields claimed for the Hiroshima and Nagasaki bombings are inflated relative to what the surface evidence supports; that the photographic record of the bombings contains anomalies the standard account does not address; that the strategic decision to memorialize Hiroshima and Nagasaki rather than the firebombings of Tokyo and Dresden — which killed comparable or larger numbers of civilians by conventional means — is a narrative-engineering decision whose effect is the production of nuclear exceptionalism as a category; and, most importantly, that the strategic deterrent doctrine the bombs are said to underwrite has functioned for eight decades primarily as a theater-state script rather than as an actual operational constraint.
What Was Done at Hiroshima and Nagasaki
The official account, in compressed form: at 8:15 a.m. on 6 August 1945, a uranium gun-type device with a yield of approximately 15 kilotons was detonated at an altitude of approximately 580 meters over central Hiroshima, killing an estimated 70,000 to 140,000 people in the initial blast and thermal effects, with comparable numbers dying in the following months from radiation injury and other consequences. On 9 August, a plutonium implosion device with a yield of approximately 21 kilotons was detonated over Nagasaki, killing an estimated 40,000 to 80,000 in the initial event. The Japanese surrender followed on 15 August.
The standard story is that the bombs ended the war and prevented an invasion of the Japanese home islands that would have cost hundreds of thousands of American and millions of Japanese lives. Gar Alperovitz’s Atomic Diplomacy, published in 1965 and extended in The Decision to Use the Atomic Bomb (1995) with access to declassified documents, argued that the standard story is wrong on its central point: Japanese surrender was already in motion by early August 1945, conditioned only on the preservation of the imperial institution; Soviet entry into the Pacific war on 9 August was the strategically decisive event from the Japanese command perspective; and the bombs were dropped not to end the war, which was ending, but to demonstrate American capability to the Soviet Union for the postwar configuration that was already being negotiated. Alperovitz’s argument is documented to a level that the orthodox historians have not refuted, and the accommodationist historians (Barton Bernstein, J. Samuel Walker) have arrived at substantially the same conclusion via different paths. The bombs were a political signal at the threshold of the Cold War, addressed to Moscow rather than Tokyo. The Japanese civilians were the medium of the message.
If the bombs were political signals, the question of what physical event corresponded to the signal becomes relevant in a way the standard military-history framing does not invite. The signal worked regardless of the precise mechanism. A claimed yield, a photographed mushroom cloud, and a flattened city are sufficient to deliver the signal. Whether the city was flattened by a 15-kiloton fission device, by a smaller device, by an extended firebombing reframed as a single-device event, or by some combination of the above, the signal was delivered and read by the intended recipient. The standard history conflates the question of what the signal contained with the question of what physical event produced the signal, and treats agreement on the latter as inferable from agreement on the former.
The Anomaly Cluster
A handful of specific empirical anomalies in the Hiroshima and Nagasaki record have been raised by various skeptical analysts over the decades. They are individually small and individually dismissible. As a cluster they are at least worth naming, on the principle that convergent anomalies are the historical signal that the standard account is doing more rhetorical work than the underlying evidence supports.
The photographic record of the bombings is thinner than the cultural memory of the bombings would suggest. The canonical mushroom-cloud imagery is a small set of frames from a small set of cameras, several of which were taken from significant distance and at angles whose interpretation depends on assumptions about the actual altitude and yield. The ground-level imagery in the days following the bombings is dominated by what looks like generic firebombing damage: charred wood frames, melted glass, scorched concrete, the shadows of incinerated bodies on walls. The kind of damage one would intuitively expect from a nuclear-scale event — large vitrified zones, complete vaporization of materials at the immediate hypocenter, the kind of patterns that nuclear test sites in Nevada produced under the documented conditions — is largely absent from the Hiroshima and Nagasaki photographic record. The standard explanation is that the bombs were detonated at altitude precisely to maximize blast damage rather than to produce the ground-level vitrification effects that would have characterized a surface burst, and this explanation is internally coherent. It is also unfalsifiable in the way the comparison invites.
The structural integrity of buildings near the hypocenter is a separate puzzle. Several reinforced-concrete buildings in central Hiroshima — most famously the Genbaku Dome, the former Hiroshima Prefectural Industrial Promotion Hall — survived the blast in conditions that are difficult to reconcile with the kiloton yield specified by the standard account. The dome is approximately 160 meters from the alleged hypocenter, well inside the lethal blast radius for a 15-kiloton air burst, yet stands as a recognizable structure with much of its frame intact. Standard explanations involve the near-vertical incidence of the blast wave at that range, the building’s specific structural features, and the fact that the orthodox account itself cites the dome as the only major structure to survive at that range. The standard explanations are again internally coherent and again unfalsifiable in the way that matters.
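The blast-radius arithmetic in this dispute is worth making explicit. Ideal blast waves follow the Hopkinson-Cranz cube-root scaling law: equal scaled distance Z = R / W^(1/3) gives approximately equal peak overpressure, so any fixed-damage radius scales with the cube root of yield. A minimal sketch, with the reference radius an illustrative placeholder rather than a measured damage contour:

```python
def scaled_distance(r_m: float, yield_kt: float) -> float:
    """Hopkinson-Cranz scaled distance Z = R / W**(1/3).

    Equal Z implies approximately equal peak overpressure for ideal
    blasts, so damage radii scale as the cube root of yield.
    """
    return r_m / yield_kt ** (1.0 / 3.0)

def equivalent_radius(r_ref_m: float, w_ref_kt: float, w_kt: float) -> float:
    """Radius at which yield w_kt produces the same scaled distance
    (hence roughly the same overpressure) as r_ref_m does for w_ref_kt."""
    return r_ref_m * (w_kt / w_ref_kt) ** (1.0 / 3.0)

# The Genbaku Dome stands ~160 m from the claimed hypocenter
# (slant range ~600 m, given the ~580 m burst altitude). Halving
# the claimed 15 kt yield shrinks any fixed-overpressure radius
# by only (0.5)**(1/3), about 21 percent:
print(equivalent_radius(1000.0, 15.0, 7.5))  # ≈ 794 m for a 1000 m reference radius
```

The weak sensitivity is the point: a factor-of-two change in claimed yield moves a damage radius by roughly a fifth, which is one reason surface-damage evidence constrains yield so poorly in either direction.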
Akira Lippit’s Atomic Light (Shadow Optics), published by the University of Minnesota Press in 2005, develops a reading of the visual culture of the atomic bombings that is not skeptical in the strong sense but is acutely attentive to the structural opacity of the photographic record — the way the bombings function in cultural memory as a photographic absence organized around a small canonical image set whose iconic status is doing the work of evidentiary completeness without supplying it. Lippit is a film theorist working in the cinema-studies register, and his point is not the conspiracy point. His point is that the bombings are remembered through a visual economy of obscured presence that is anomalous in the history of war photography, and that the anomaly is structural rather than coincidental.
The casualty figures from Hiroshima and Nagasaki range, in the published literature, from approximately 110,000 dead to over 300,000 dead, with the upper figures arrived at by including delayed radiation deaths over multi-year horizons. The lower figures, which correspond to the immediate blast and thermal effects, are within the range that the Tokyo firebombing of 9-10 March 1945 produced — approximately 100,000 dead, more than 1 million displaced, 41 square kilometers of the city destroyed in a single conventional incendiary raid using napalm and white phosphorus delivered by approximately 300 B-29s. The Dresden firebombing of 13-15 February 1945 killed an estimated 25,000 to 35,000 in the immediate event and is widely treated as a war crime. The Hamburg firebombing of July 1943 killed approximately 37,000. The standard treatment positions the nuclear attacks as ontologically distinct from the firebombings — a different category of event requiring a different moral and historical vocabulary — and the difference is grounded in the implicit physical claim that the nuclear yield was something the conventional bombings could not match. The casualty figures for Tokyo do not support that claim. The casualty figures support the reading that the nuclear attacks were of a kind with the conventional firebombings in their immediate human consequences, and that the categorical distinction is a narrative production of the postwar period rather than a fact about the events themselves.
The radiation effects are the strongest physical signal that something nuclear-class actually occurred at Hiroshima and Nagasaki. The Atomic Bomb Casualty Commission — the joint American-Japanese long-term study established in 1947 and continued since 1975 as the Radiation Effects Research Foundation — produced multiple decades of medical follow-up on the surviving exposed population, and the leukemia and cancer incidence rates in the hibakusha cohort show statistically significant elevation that is most parsimoniously explained by ionizing radiation exposure on the scale a kiloton-range fission event would produce. The strong skeptic has to explain the hibakusha data, and the strong skeptic typically does not. The weak skeptic does not have to explain it, because the weak skeptic accepts that some kind of nuclear-class event occurred and is asking the more interesting question, which is what the event is for in the postwar narrative apparatus.
The Galen Winsor Problem
Galen Winsor (1927-2008) was a chemical engineer who worked at the Hanford Site, Oak Ridge National Laboratory, and several commercial nuclear reactor facilities over a thirty-year career inside the American nuclear industry. He held credentials and operational experience that are not easily dismissed. In the latter portion of his career and in his retirement, he developed a position highly skeptical of the dominant framing of radiation as ubiquitously and linearly hazardous, and toured college campuses in the 1980s giving lectures during which he would, as a demonstration, drink water from spent-fuel storage pools to argue that the conventional radiation models grossly overestimated the hazard of low-dose exposure. He was treated as a crackpot by the regulatory establishment and the mainstream physics community, and was largely written out of the nuclear-policy conversation.
His specific empirical claim — that the linear no-threshold (LNT) model used by the regulatory apparatus to project cancer risks from radiation exposure is statistically unjustified at the low-dose end and produces wildly inflated risk estimates — has been substantially vindicated by the subsequent radiation hormesis literature. Edward Calabrese at the University of Massachusetts has documented in detail how the LNT model was adopted by the U.S. National Academy of Sciences in 1956 through what looks like deliberate suppression of contrary evidence by the genetics panel of the NAS Biological Effects of Atomic Radiation (BEAR) committee, and the hormesis researchers — Bobby Scott, Leslie Redpath, Ludwig Feinendegen, and others — have produced cellular and animal-model evidence that low-dose radiation produces adaptive responses that may be net protective rather than net harmful. The LNT model is the basis of essentially all civil radiation regulation, and if it is wrong by the order of magnitude the hormesis literature suggests, then large portions of the nuclear-fear infrastructure — the Chernobyl casualty estimates, the Fukushima exclusion zones, the cleanup costs at Hanford, the Rocky Flats litigation, the entire framework for thinking about reactor safety and radioactive waste — are also wrong by similar margins.
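The substance of the LNT dispute reduces to the shape of a dose-response curve at low dose, and the structural difference between the competing models is easy to state in code. A minimal sketch, in which every numeric parameter (slope, threshold) is an illustrative placeholder rather than a value from any dosimetry standard:

```python
def lnt_excess_risk(dose_msv: float, slope_per_msv: float) -> float:
    """Linear no-threshold: projected excess risk strictly proportional
    to dose, so no dose is low enough to carry zero projected risk."""
    return slope_per_msv * dose_msv

def threshold_excess_risk(dose_msv: float, slope_per_msv: float,
                          threshold_msv: float) -> float:
    """Threshold model: zero projected excess risk below the threshold,
    linear above it. (A hormesis model would dip below zero at low dose;
    this sketch shows only the structural break from LNT.)"""
    return max(0.0, slope_per_msv * (dose_msv - threshold_msv))

# Illustrative parameters, chosen for arithmetic clarity only.
slope = 1e-5        # hypothetical excess risk per mSv
threshold = 100.0   # hypothetical threshold, mSv

for dose in (10.0, 50.0, 500.0):
    print(dose, lnt_excess_risk(dose, slope),
          threshold_excess_risk(dose, slope, threshold))
```

Collective-dose casualty projections multiply a per-person risk of this kind across large populations, so the low-dose shape of the curve, not the high-dose data both models can fit, dominates the downstream estimates the paragraph lists.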
Winsor’s broader argument — that the nuclear-fear regime is not the protective response to a real hazard it presents itself as, but a manufactured ideological structure whose function is the suppression of nuclear power as a viable energy source and the maintenance of the petroleum economy that the suppression protects — is the kind of argument that is hard to evaluate because the manufactured-ideology component is structurally invisible to insiders who have absorbed the framing as common sense. Winsor was tracking the same kind of structural inversion that the theater-state frame describes at the political level: the apparatus that nominally exists to protect the population from a hazard is actually maintaining the salience of the hazard for purposes the population is not invited to inspect. Whether or not he had every detail right, the structural intuition is sound, and the subsequent literature on LNT has moved in his direction.
The Forty-Two-Year Script
The Cold War nuclear standoff ran from approximately 1949, when the Soviets achieved their first fission test, through the dissolution of the Soviet Union in 1991. Forty-two years. During this period, the official strategic doctrine was mutual assured destruction — the framework under which both superpowers maintained nuclear arsenals sized and configured to guarantee the catastrophic destruction of the other in any nuclear exchange, with deterrent stability arising from both sides rationally recognizing the mutual catastrophe and refraining from initiation. The doctrine was explored by Herman Kahn at RAND in On Thermonuclear War (1960) and Thinking About the Unthinkable (1962), codified by Robert McNamara as defense secretary, and remains the implicit operating framework of nuclear policy in the contemporary configuration.
The documented near-misses during the Cold War — the events at which a nuclear exchange came close to occurring but did not — are numerous enough to constitute their own historical sub-literature. The Cuban missile crisis of October 1962 is the canonical case. The 1983 Able Archer NATO exercise that the Soviets read as cover for an actual first strike. The 1979 NORAD computer error that briefly indicated a Soviet launch. The 1995 Norwegian rocket incident, in which Yeltsin had the Russian nuclear briefcase open while the warning was investigated. Stanislav Petrov’s 1983 individual decision not to escalate a Soviet false-positive launch warning. The 1961 Goldsboro B-52 crash, in which a hydrogen bomb fell on North Carolina with, by the later-declassified assessment, three of its four safety mechanisms defeated in the fall. The near-disasters at Thule in 1968 and Palomares in 1966. The list runs to dozens of events at which, on the standard account, civilization came within minutes of catastrophic collapse and was saved by the calm judgment of individual operators or by mechanical luck.
The standard history treats this litany as evidence of the deterrent’s fragility and of the heroism of the people who maintained it. The structural reading the theater-state frame permits is different: a system that produces dozens of close-call events over four decades and never tips over is either the most absurdly stable system ever designed by humans, or it is a system whose failure modes were never actually loaded with live ammunition in the way the close-call narrative implies. The professional Cold War historians who have looked closely at the close-call literature — the most thorough account is Eric Schlosser’s Command and Control (2013) — have reconstructed a command-and-control reality in which the live-fire risk during many of the close-call events was significantly lower than the dramatic retelling has it. The systems were equipped with multiple safety interlocks that were not, as the popular narrative has it, the last desperate barrier between civilization and oblivion, but were designed to fail safe in exactly the conditions the close-call retellings describe. The close-call narrative is part of the script. It produces the affective output the script requires — the audience experience of having been almost destroyed and saved at the last moment — without the script ever needing to be load-bearing.
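The "absurdly stable or never live" fork can be given a back-of-envelope probabilistic form. If each of N close calls carried an independent per-event escalation probability p, the chance of a clean record is (1-p)^N, and an unbroken run of survivals bounds p from above. A sketch under the independence assumption, which is itself a strong and questionable one:

```python
def prob_clean_record(p: float, n: int) -> float:
    """Probability of zero escalations across n independent close calls,
    each with per-event escalation probability p."""
    return (1.0 - p) ** n

def max_p_consistent(n: int, confidence: float = 0.95) -> float:
    """Largest per-event probability p for which a clean record of n
    survivals is still plausible at the given confidence level:
    solves (1 - p)**n = 1 - confidence for p."""
    return 1.0 - (1.0 - confidence) ** (1.0 / n)

# With ~40 documented close calls and zero escalations, the
# "minutes from oblivion" per-event risks the retellings imply
# sit awkwardly against the clean record:
print(prob_clean_record(0.10, 40))  # ≈ 0.015: a 10% per-event risk survives
                                    # 40 trials only about 1.5% of the time
print(max_p_consistent(40))         # ≈ 0.072: 95% upper bound on p
```

The arithmetic does not distinguish the two horns — per-event risks far below what the retellings imply, or events that were never independent draws from a live-fire system — which is the point of the fork.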
The eighty years since Hiroshima have produced zero subsequent nuclear use in war, despite the proliferation of the weapons to nine national programs, the major regional crises, the proxy wars, the terrorist groups whose stated objectives included acquiring and using nuclear devices, and the multiple non-state actors who have demonstrated capability with other weapons of mass destruction. The eighty-year non-use record is presented in the standard account as the deterrent working. It can also be read as a tell. A weapons category that no actor under any condition has ever found it operationally useful to deploy, even in the most extreme circumstances, is a weapons category whose actual operational role is something other than what the doctrine names. The doctrine names the role of last-resort kinetic effect. The historical record is consistent with the role of attention-engineering instrument, threat-grid maintenance, and theater-state crisis-object. The bombs do not need to be deployed to perform the latter functions. The fact of their existence, the official photographs, the doctrine literature, the close-call retellings, the air-raid drills of the 1950s, the Doomsday Clock ritual maintained by the Bulletin of the Atomic Scientists, and the periodic refresh of the fear cycle by mainstream-publishing-house nuclear-scenario books — all of these are the operational expressions of the actual function the bombs serve. The actual function is the maintenance of a particular structure of attention and affect in the populations whose behavior the apparatus needs to manage.
The Annie Jacobsen Function
Annie Jacobsen is the contemporary access journalist whose books — Area 51 (2011), Operation Paperclip (2014), The Pentagon’s Brain (2015), Phenomena (2017), Surprise, Kill, Vanish (2019), First Platoon (2021), Nuclear War: A Scenario (2024) — constitute the largest single body of mass-market defense-establishment journalism in English of the past fifteen years. Her access to the relevant institutions — DARPA, the Joint Chiefs, the national labs, the intelligence apparatus, retired officials at every level — is the load-bearing feature of her work. The access is also the tell. Journalists who write that close to the apparatus over that long a horizon without losing their sources are doing a job, and the job is not adversarial. The job is the laundering of the apparatus’s preferred threat models into mass-market paperback form, where the threat models are received by the audience as independent journalism and absorbed without the suspicion that institutional press releases would generate.
Nuclear War: A Scenario is the canonical recent example. The book takes the reader through a minute-by-minute fictionalized account of how a nuclear exchange would unfold, from a first launch through the cascade of automatic responses, the impossibility of off-ramps once the chain begins, and the effective end of organized civilization within hours. It is structured as a horror story and reads like one. Its sources are the strategic-command-and-control insiders Jacobsen has spent two decades cultivating, and its message is that the nuclear-deterrent system the Cold War produced is far more fragile, far more on a hair trigger, and far more dependent on the absence of a single mistake than the public has any sense of. The book’s reception treated this as an alarming revelation. The structural reading is that the book is a scheduled refresh of the fear-grid sector that the Cold War close-call narrative previously occupied and that had been allowed to drift in the public imagination during the post-1991 attention shift to other crisis objects. The bio-warfare project that has reportedly followed it is the same operation in the next sector. Nukes were the fear-grid anchor for the Cold War. Bio-warfare is the fear-grid anchor for the post-2020 configuration. The journalist with the access cycles between the sectors as the calendar requires, and the audience receives the cycling as the urgent revelation of independent reportage.
This is not a claim that Jacobsen is a witting agent of disinformation. The journalists who perform this function rarely are. The relationship between an access journalist and the apparatus they cover is symbiotic in a way that does not require any particular instance of bad faith on the journalist’s part: the apparatus selects which journalists get the access, the access produces the books, the books reach the audience, the audience absorbs the threat model, the apparatus’s interest in the audience absorbing the threat model is served, and the journalist sincerely believes they have done independent reporting. The selection mechanism is the operation. The journalist is a participant whose participation is conditioned on their not being able to see the participation as participation, and the conditioning is a structural feature of the access relationship.
The Bottom Layer
Read in the theater-state frame, the nuclear arsenal is the canonical example of a crisis-object that has been maintained for eighty years through periodic refresh of the script, the narrative, the close-call literature, the disarmament-and-rearmament cycles, the missile-treaty negotiations and abrogations, and the journalistic apparatus that keeps the audience primed. The physical weapons exist. The yields they would produce if deployed are the empirical question the standard history has settled in the way it has settled. The deterrent doctrine is the script. The script has been performed for eight decades by actors on both sides who knew what kind of performance it was, and the audience has been kept in the seat through the periodic affective refresh that the close-call retellings and the mass-market doomsday literature provide. The bomb’s actual operating role in the international system is the role of theater-state crisis-object. Whether and to what extent the underlying physical capability matches the dramatic claims is a separate question that the structural reading does not depend on. Both the strong and the weak versions of the structural reading are compatible with the documented record, and the weak version is the one that holds without controversy.
The deeper esoteric reading the rendering-theory vocabulary supplies is that the idea of the nuclear catastrophe is one of the most ritually invested thoughtforms of the late industrial period — an egregore of unprecedented scale, fed by the daily affective contributions of populations who have been carrying around the imagined nuclear apocalypse as a background dread since they were old enough to absorb the cultural training that installed it. The egregore feeds on the dread. The dread is what the dread-engineering apparatus produces. The decommissioning of the egregore would require the withdrawal of the affective investment, and the withdrawal would in turn require the population to recognize that the affective investment has been a managed input rather than a rational response to a real and present threat. The recognition is the move the apparatus has been engineered to prevent, and the engineering has been overwhelmingly successful for eighty years.
References
- Alperovitz, Gar. The Decision to Use the Atomic Bomb and the Architecture of an American Myth. Knopf, 1995.
- Lifton, Robert Jay, and Greg Mitchell. Hiroshima in America: A Half Century of Denial. Avon Books, 1995.
- Lippit, Akira Mizuta. Atomic Light (Shadow Optics). University of Minnesota Press, 2005.
- Schlosser, Eric. Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. Penguin, 2013.
- Kahn, Herman. On Thermonuclear War. Princeton University Press, 1960.
- Kahn, Herman. Thinking About the Unthinkable. Horizon Press, 1962.
- Calabrese, Edward J. “How the U.S. National Academy of Sciences Misled the World Community on Cancer Risk Assessment: New Findings Challenge LNT Model Foundations.” Chemico-Biological Interactions 308 (2019): 110-112.
- Calabrese, Edward J. “LNTgate: How Scientific Misconduct by the U.S. NAS Led to Governments Adopting LNT for Cancer Risk Assessment.” Environmental Research 148 (2016): 535-546.
- Winsor, Galen. The Nuclear Scare Scam. Recorded lecture, c. 1986. Available in transcript and video form through several online archives.
- Jacobsen, Annie. Nuclear War: A Scenario. Dutton, 2024.
- Dower, John W. War Without Mercy: Race and Power in the Pacific War. Pantheon, 1986.
- Selden, Mark. “A Forgotten Holocaust: U.S. Bombing Strategy, the Destruction of Japanese Cities, and the American Way of War from the Pacific War to Iraq.” The Asia-Pacific Journal 5, no. 5 (2007).
- Hall, Manly P. The Secret Destiny of America. Philosophical Research Society, 1944. (On the symbolic apparatus underlying American imperial mythology.)