More recently, though, this foundational idea of the modern environmental movement has itself been cast as a form of climate denial. In their landmark polemic Merchants of Doubt, the historians Naomi Oreskes and Erik Conway executed an ambitious reframing of the science of climate change, alleging that the central obstacle to climate stabilization was a set of campaigns, financed by the fossil fuel industries, that used PR tactics to artificially inflate the uncertainty in climate science. Scientists and advocates created new scientific bases for climate action, such as the 350-parts-per-million standard and the two-degree temperature target. Climate impacts were no longer seen as the consequence of centuries of industrialization but rather as the criminal fault of a few polluting industries. And climate change, under the new science, was no longer a future risk but a present catastrophe. Nordhaus, the father of the carbon tax, now regularly comes under fire for his perceived lack of climate ambition (his work has purportedly “enabled climate change denial and delay”).
In place of uncertain outcomes, this new generation of climate advocates offered the certainty of present events. Doing so has entailed significant exaggeration beyond the hard facts of climate science, including the overhyping of climate change’s contribution to present-day extreme weather events and the use of implausible warming scenarios that forecast dire future impacts. This transformation has had the intended effect of narrowing the solution set understood as “climate action.” Climate change under the new regime is not an emergent long-term risk to be managed by smart planning, expanding technological capabilities, and deepening societal resilience. Instead, it is an immediate physical threat with identifiable perpetrators — so-called “fossil capital” and the ever-growing army of “climate deniers” — who are to be answered with mounting climate-related litigation and regulation.
***
Climate science as we know it began on June 23, 1988, when James Hansen testified before the United States Senate. Hansen was trained as a physicist — his early work involved radiative transfer in the atmosphere of Venus. At the time, he served as the Director of NASA’s Goddard Institute for Space Studies, where he and his colleagues compiled some of the first long-horizon global temperature data and developed key models linking rising temperatures to atmospheric CO2. Here, for the first time, was a government-employed climate scientist warning the American public about the threat of climate change from the upper house of the U.S. Congress.
To be sure, the science of the global greenhouse effect and its impacts on human and ecological systems is over a century older than contemporary discourse would suggest.
A proper history would surely include the work of Joseph Fourier, John Tyndall, and Svante Arrhenius, the 19th century scientists who first posited and observed the atmospheric greenhouse effect, as well as Guy Stewart Callendar, the English engineer who compiled meteorological data to evaluate Earth’s changing land temperatures in 1938, and Charles David Keeling, who oversaw the first systematic records of carbon concentrations in the atmosphere at the Mauna Loa Observatory in Hawaii in the 1950s. Climate politics also predates Hansen’s famous testimony. The oceanographer Roger Revelle chaired an early subcommittee assembled to evaluate the effects of carbon dioxide on the atmosphere during the Johnson Administration. The economist William Nordhaus first proposed the now-infamous two-degree temperature target in a thought experiment in 1975. And the environmental scientist Jesse Ausubel helped organize the first United Nations climate conference in 1979.
But it was in the late 1980s and early 1990s that the scientific community and the late-modern league of nation states dedicated themselves to limiting anthropogenic carbon emissions under the same international regulatory and diplomatic frameworks that had worked for chlorofluorocarbons, whaling, and other cosmopolitan victories over environmental vice.
In the same year as Hansen’s testimony, the World Meteorological Organization and the United Nations Environment Programme came together to found the Intergovernmental Panel on Climate Change. A body of the United Nations, the panel was meant to advise governments on scientific issues relating to climate policy. Two years later, preparations began for the first UN Earth Summit, to be held in Rio in 1992. Three years after the Rio summit, the UN Framework Convention on Climate Change convened the first Conference of the Parties in Berlin, which would go on to craft global emissions-reduction commitments — first the Kyoto Protocol in 1997 and then, after Kyoto’s eventual collapse, the Paris Agreement in 2015.
The world did not pay much attention to Arrhenius’s model of the greenhouse effect in 1896, nor to Revelle’s special advisory report for President Johnson. On its own, the purely empirical science of climate change drew little notice. But by the time of the Kyoto Protocol, climate change was front-page news.
The general public, in other words, came to understand climate science as the justification for an international regulatory system. As with the Montreal Protocol’s constraints on global chlorofluorocarbon emissions and the International Whaling Commission’s moratorium on commercial whaling, this science would undergird expert regulators’ framework for reductions of the carbon emissions that were, and still are, warming the planet.
In an important essay from 2007, the social scientists Steve Rayner and Gwyn Prins describe modern institutional climate science and policy as a product of “summitry” — a predilection for centralized global negotiations born of both the Cold War nuclear disarmament treaties and international environmental confabs on CFCs, whaling, and the like. As the pair write, “With the collapse of the USSR and the ending of the Cold War, the moment was ripe for another crusade.” But climate change, despite what some advocates claim, bears little resemblance to these earlier challenges. As Rayner and Prins put it, the Kyoto Protocol “was constructed by quick borrowing from past practice with other treaty regimes dealing with ozone, sulphur emissions and nuclear bombs which, while superficially plausible, are not applicable in the ways that the drafters assumed.”
Earlier global environmental risks, like lead in gasoline or chlorofluorocarbons, concerned relatively small volumes of trace compounds, often easily substituted for cleaner alternatives. But while carbon is present only in trace amounts in the atmosphere, it is the chemical foundation of life, thermodynamic science, and industry. Coal, oil, and natural gas, the dominant sources of climate-altering greenhouse emissions, supplied about 90% of global primary energy in 1990, and 87% today. The sheer volumes at play are staggering. In 2024, the world produced about 9 billion tons of coal, 35 billion barrels of oil, and 145 trillion cubic feet of natural gas. And the environmental hazard is, if anything, larger in scale, since burning a kilogram of coal, oil, or natural gas emits more than a kilogram of carbon dioxide, as combustion binds the fossilized carbon to atmospheric oxygen.
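To make that last bit of arithmetic concrete, the sketch below (a toy calculation in Python, using round-number carbon fractions that are illustrative assumptions rather than measured values for any particular fuel) works out why the mass of the emissions exceeds the mass of the fuel: each carbon atom, with an atomic mass of about 12, picks up two oxygen atoms of about 16 each from the air, so every kilogram of carbon burned becomes roughly 44/12, or about 3.7, kilograms of carbon dioxide.

```python
# Rough, illustrative stoichiometry: kg of CO2 produced per kg of fuel burned.
# The carbon mass fractions below are round-number assumptions for illustration,
# not measured values for any particular fuel grade.

MOLAR_MASS_C = 12.0  # g/mol, carbon
MOLAR_MASS_O = 16.0  # g/mol, oxygen
CO2_PER_C = (MOLAR_MASS_C + 2 * MOLAR_MASS_O) / MOLAR_MASS_C  # ~3.67 kg CO2 per kg carbon

approx_carbon_fraction = {
    "coal (typical bituminous)": 0.65,   # assumed carbon share by mass
    "crude oil": 0.85,                   # assumed carbon share by mass
    "natural gas (mostly CH4)": 0.75,    # 12/16 for pure methane
}

for fuel, c_frac in approx_carbon_fraction.items():
    kg_co2_per_kg_fuel = c_frac * CO2_PER_C
    print(f"{fuel}: ~{kg_co2_per_kg_fuel:.1f} kg CO2 per kg of fuel")
```

On these rough assumptions, a kilogram of coal yields about 2.4 kilograms of CO2, a kilogram of oil about 3.1, and a kilogram of methane about 2.75.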
By tying the problem of climate change to the same sorts of institutions that had managed the ozone layer, climate scientists and diplomats set themselves up for failure. International targets and treaties could not force the consumer sacrifices or overcome the vexing engineering obstacles necessary to ratchet down fossil fuel consumption.
This was how global publics became conscious of climate change. Then, as now, most respondents in surveys endorse mainstream climate science and support generic policies to confront warming. But they do not reveal a high willingness to pay for these policies. What would later become known as “climate denial” is real, but the original bug in the climate action code is the mismatch between the problem — the dozens of gigatons of greenhouse gases emitted annually by the fossil fuels enabling modern wealth, comfort, and civilization — and the proposed solution — science-informed diplomats assertively steering society to climate stabilization.
***
The failure to leverage climate science into binding global greenhouse gas regulations quickly became impossible to ignore. Global emissions kept ticking up, dragging atmospheric concentrations and average global surface temperatures with them, despite scientific, diplomatic, and popular climate consensus. So a set of climate advocates set about creating new scientific frameworks.
Binding global emissions reduction targets had been policymakers’ goal since the early days of modern climate science. At the 1992 “Earth Summit” in Rio, diplomats signed a treaty that committed to stabilizing greenhouse gas concentrations in the atmosphere “at a level that would prevent dangerous anthropogenic (human-caused) interference with the climate system.” Though the treaty did not specify what that level was, it did formalize in international policy that such a level did exist. Future science and summitry would fill in the rest. The 1997 Kyoto Protocol, for instance, required the world’s industrialized nations to cut their collective emissions to roughly 5% below 1990 levels by 2012.
But Kyoto’s failure demanded a stronger, more certain line in the sand.
In 2008, Hansen convinced environmental activist Bill McKibben, who had initially proposed an aspirational limit of 450 ppm of atmospheric carbon dioxide, to promulgate the 350-ppm target instead. McKibben founded 350.org, a new kind of climate activist organization. That target would earn an air of scientific credibility via the massively influential “Planetary Boundaries” paper published in the journal Ecology & Society in 2009.
These targets and boundaries have since been roundly criticized by physical and social scientists. Global warming is not like boiling a pot of water — there is no phase shift at 350 ppm, 450 ppm, or at any other point. Warming temperatures bring escalating risks, and while these risks are complex and not necessarily linear, neither do they advance in a step-wise fashion. Nonetheless, “Planetary Boundaries” has been cited thousands of times, and advocates and policymakers around the world routinely use threshold-based targets to define the seriousness of climate action.
With the Paris Climate Agreement in 2015, the two-degree target — originally proposed as a thought experiment by William Nordhaus in the 1970s — was codified in international treaty. But the Paris negotiators went further than that, stipulating that the long-term goal would include “pursuing efforts to limit the temperature increase to 1.5 degrees Celsius above pre-industrial levels.” Among other things, this meant commissioning the IPCC to produce a special report on the likely climate impacts of exceeding this new 1.5-degree target.
The best case for these fundamentally made-up targets is that they acted as a kind of synthetic Schelling point for global action. Even scientists who admit there is no physical basis for the 350-ppm or two-degree target will often argue that policymakers need some kind of goal. In this sense, the targets did the job — at least for a time.
“We have 12 years to limit climate change catastrophe, warns UN,” wrote The Guardian in 2018. This was and remains an obvious misrepresentation of the IPCC report, which found that 2030 was the approximate deadline for limiting warming to 1.5 degrees, an arbitrary target pinpointed by climate diplomats rather than the threshold of “climate catastrophe.”
Nevertheless, the target hit climate activists like a cortisol shot. The newly formed Sunrise Movement, which had hosted congressional sit-ins and courted the support of Senator Ed Markey and rising star Representative Alexandria Ocasio-Cortez, claimed that keeping global temperatures below 1.5 degrees Celsius was “critical.” So did the Sierra Club, the Natural Resources Defense Council, and Greenpeace.
Their activism, arguably, worked. The 1.5-degree target was chosen to emphasize immediate emissions reduction opportunities, since time was short to avoid crossing the threshold. And that’s exactly what policymakers pursued. Shortly after being inaugurated in 2021, President Joe Biden said that “We must try to keep the Earth’s temperature to an increase of 1.5 degrees Celsius.” His administration would go on to enact large climate investments, particularly through the Inflation Reduction Act. The bulk of the IRA’s spending was directed towards commercially mature technologies like solar photovoltaics, onshore wind, and electric vehicles, in the belief that subsidizing off-the-shelf technologies could achieve near-term climate targets.
The other intended function of these targets has been to delegitimize the usage of fossil fuels. Climate activists have for decades called for limitations and bans on fossil energy production. These calls were, on occasion, effective — the Biden Administration paused approvals of new liquefied natural gas export facilities under pressure from climate advocates, and then-candidate Kamala Harris committed in 2019 to ban hydraulic fracturing of oil and gas in the United States, a pledge she later famously had to walk back.
But while these discrete efforts were significant, they pale in comparison to the climate movement’s ambitions to write atmospheric targets into regulatory guidance and jurisprudence.
***
For a long time, climate change was widely understood as an intertemporal collective action problem: the risks will arrive in the future, but the costs of mitigation are concentrated in the present. Carbon emissions have been rising for centuries, primarily as a side effect of fossil-fueled industrialization and land-use change, providing large benefits to humanity but also creating the risk of a destabilized climate. The coordination and resource-pooling required to overcome this kind of collective action problem are a classic justification in neoclassical economics for government intervention in the free market. This is what made the truth “inconvenient” to Al Gore, who argued that the threat of climate change “means we are going to have to change the way we live our lives.” Collective action was likewise a justification for a globally negotiated treaty, without which irresponsible countries could free-ride on their peers’ emissions reductions.
More recent climate science and advocacy has begun to reject this idea.
In 2007, the IPCC constructed four elaborate scenarios of future emissions and warming, called “representative concentration pathways.” The RCPs aren’t predictions — they’re standardized scenarios that exist to help climate scientists run comparable simulations. As such, they prioritize the needs of technical climate modelers over realism: as Roger Pielke and Justin Ritchie have painstakingly documented, the IPCC “did not consider the plausibility of the socioeconomic assumptions used to generate them.” The most aggressive of these scenarios is RCP8.5. Climate scientists have designated RCP8.5 as the “business-as-usual” scenario, despite the fact that it assumes implausible future fossil fuel usage, and thus predicts large and unlikely negative outcomes. (As just one example, the scenario assumes a six-fold increase in per capita coal consumption by 2100. In fact, world coal consumption is expected to plateau and even decline in the coming decades.) But treating RCP8.5 as a default meant that scientists could move from understanding climate change as the complex dynamic outcome of a radically uncertain techno-economic future to predicting with conspicuous certainty what climate impacts would look like 30, 50, or 100 years into the future. Thousands of papers are published every year based on the RCP8.5 scenario.
Then, in 2010, the historians Naomi Oreskes and Erik Conway published Merchants of Doubt, the best-selling polemic whose arguments provoked a sea change in climate advocates’ rhetoric and policy proposals. The book argued that climate science had been systematically suppressed by fossil fuel industry-funded scientists, corporations, and politicians. These industries — not the collective action detente of international negotiations or consumer hesitance to pay for non-emitting energy technologies — were the main obstacle to climate stabilization.
In a landmark gathering hosted by Oreskes and the Union of Concerned Scientists in La Jolla in 2012, advocates and scientists brainstormed a different kind of climate action. The workshop “sought to compare the evolution of public attitudes and legal strategies related to tobacco control with those related to anthropogenic climate change.” The strategic continuity between tobacco and oil interests is a major theme in Merchants of Doubt, but the workshop took this analysis one step further: their goal was to reframe climate change from a collective action problem into a question of tortious harm, like cigarette smoke. The legal strategy would be to sue fossil fuel companies, just as the U.S. government had successfully sued the tobacco producers.
Another outcome of the workshop was extreme event attribution. In order to shift public opinion on the right tactics to fight climate change, the group believed it was necessary to establish that climate change was causing concrete harm in the present. Workshop participants emphasized extreme heat and rising sea levels, but the focus soon expanded to natural disasters. Myles Allen, a pioneering climatologist in the field and a participant at the La Jolla workshop, had lamented that “the scientific community has frequently been guilty of talking about the climate of the twenty-second century rather than what’s happening now.” After La Jolla, that was going to change.
This was also a crucial step in the legal strategy of the climate movement. As Friederike Otto, the founder of World Weather Attribution — the leading academic collaboration in the field — somewhat baldly described it, “Unlike every other branch of climate science or science in general, event attribution was actually originally suggested with the courts in mind.”
This new branch of science is an explicit break with the methodology of the IPCC. For decades, the IPCC has tracked trends in the frequency and intensity of extreme weather phenomena. There is a signal in all that noise. Extreme heat and precipitation events — that is, heat waves and heavy rains — have become marginally, but measurably, more intense as a result of climate change. But in their most recent assessment report, the IPCC found no major shifts in global hurricanes, droughts, floods, or wildfires. Some of these weather patterns have changed in some regions, but these shifts by and large have not exceeded the range of natural climatological variability, nor have they passed statistical tests for significance at the global level. In other words, the international science community has not “detected” an increase in the frequency or intensity of these events at the global scale. Anthropogenic climate change is very real, it has detectable effects, and those effects will grow — but we don’t have enough evidence to conclude that it is exerting significant influence on natural disasters today.
This meant that activists needed a different statistical approach.
Extreme weather attribution instead considers not the trendline, but the datapoint. Analysts look at a single extreme weather event — say, a hurricane or a flood — and compare its real impact to the impact it would have had in a constructed counterfactual world where the exposed infrastructure and population were the same but carbon emissions had not accumulated in the atmosphere.
Doing so often produces eye-popping relative probabilistic estimates. One study found that a particular Indian heat wave was “made 30 times more likely by the climate crisis.” But if you look past the press releases, those same analyses reliably find the absolute shifts in the event to be more marginal. That Indian heat wave, for instance, was estimated to be about one degree hotter than it would have been in a preindustrial climate. This is basic probabilistic statistics — relatively minor absolute shifts in the system can result in numerically large shifts in the probability of outlier outcomes.
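That claim is easy to verify with a toy calculation. The sketch below, written in Python with made-up round-number parameters (it is not the method or the data of any actual attribution study), models daily peak temperatures as a normal distribution and shifts its mean by a single degree. The absolute change is small, but the probability of exceeding a fixed extreme threshold jumps by a factor of roughly twenty.

```python
import math

def tail_prob(threshold: float, mean: float, sd: float) -> float:
    """P(X > threshold) for a normally distributed X ~ N(mean, sd)."""
    z = (threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

# Toy parameters, chosen purely for illustration:
threshold = 47.0            # degrees C defining an "extreme" heat day
mean_counterfactual = 40.0  # hypothetical climate without accumulated emissions
mean_shifted = 41.0         # the same distribution, warmed by just one degree
sd = 1.5                    # assumed day-to-day variability

p_before = tail_prob(threshold, mean_counterfactual, sd)
p_after = tail_prob(threshold, mean_shifted, sd)

print(f"P(extreme day), counterfactual climate: {p_before:.2e}")
print(f"P(extreme day), one-degree-warmer climate: {p_after:.2e}")
print(f"Relative change: ~{p_after / p_before:.0f} times more likely")
print(f"Absolute change in the event itself: {mean_shifted - mean_counterfactual:.1f} degrees")
```

The same arithmetic runs in reverse: a headline multiplier of this size is fully compatible with an underlying shift of only a degree or so.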
It’s an ambitious, and fairly absurd, epistemic pursuit to imagine a modern population enduring a counterfactual extreme weather event without the centuries of accumulated emissions from fossil fuels that produced populous technological modernity in the first place. Single-event attribution analysis performs a kind of tortured statistical theater, comparing real-world empirics to unfalsifiable and one-dimensional counterfactuals, in order to produce large headline-making numbers.
As Oreskes, Otto, and other architects of these activist scientific frameworks hoped, the new science has significantly informed new legal and policy strategies.
In 2015, an advocacy group named Our Children’s Trust sued the federal government. Joined in amicus briefs and petitions by the Sierra Club, the Sunrise Movement, and 250 other environmental organizations, the plaintiffs argued that it should be considered a violation of the U.S. Constitution for global CO2 concentrations to exceed 350 ppm. That suit, which Bill McKibben once called “the most important lawsuit on the planet,” was dismissed by a federal court, with an appeal denied by the Supreme Court this spring. But there are well over 1,000 related cases filed by civil society groups and attorneys general, many of which use single-event attribution analysis as evidence of tortious harm. New legal frameworks proposed in states like California and Vermont have sought to create “Climate Superfund” laws to extract financial penalties from the fossil fuel industries. And single-event attribution has become a common technique within scientific and mainstream media publications on climate science.
***
Climate targets, single-event attribution analysis, and other forms of science activism have completely changed the way researchers, advocates, and journalists talk about the problem of climate change. The United Nations and hundreds of governments have endorsed atmospheric temperature targets. Friederike Otto was recently named a contributing lead author on the IPCC’s upcoming Seventh Assessment Report. And the camp of “climate deniers,” a term once reserved for those who, through ignorance or malice, rejected the mainstream science of the greenhouse effect, has come to describe opponents of the Green New Deal and advocates of nuclear energy, which climate activists claim is too speculative for the certain policy prescriptions made by climate science.
But while activists may have taken over climate science, their big-picture results are not flattering. The vast bulk of the global energy supply still runs on fossil fuels. Carbon emissions continue to rise. The Trump Administration is systematically dismantling the climate policy apparatus that has been built up over the last generation, and it is not alone — governments around the world are rolling back climate commitments. There’s little uproar, because climate change remains a low priority for the public. And trust in climate scientists, and in science itself, shows signs of decline.
Today, the plausibly existential threat of artificial intelligence and the definitively existential threat of nuclear weapons seem likely to displace climate change in elite and popular attention. Climate science, for whatever it has accomplished during the past generation, may recede substantially, returning to a status comparable to the time before Hansen’s 1988 testimony.
That’s not necessarily a bad thing. Trust in scientists, after all, was higher before climate change became a globally recognized problem. The global rate of decarbonization was also faster before Hansen’s testimony than after it. Hydroelectric and nuclear power grew quickly in the postwar decades before the globalized environmental movement turned on these two technologies just as it was about to set its sights on a global climate treaty.
But past is not prologue, and climate scientists today lack other favorable conditions that prior generations enjoyed. Modern society has no shared infosphere. Declining confidence in science tracks declining faith in all kinds of institutions, democracy among them. The converging geopolitical and cultural interests that marked the end of the Cold War have split open again.
Reconstituting climate science and policy will require contending with this fractious new informational and political gestalt. For a newly successful climate science to take root in the coming years and decades, we need a renaissance in uncertainty. Uncertainty is not a dirty word — in climate science or anywhere else. Indeed, it is better understood as a kind of epistemic bravery: an assertion that while scientists and policymakers can’t predict the future, a scientifically informed, democratic public is capable of navigating it.
But given the state of climate science and activism today, embracing uncertainty again is no small matter. Doing so will require climate scientists and advocates to get over the urge to draw lines in the sand, to blame certain natural disasters on specific oil companies, and to manipulate statistics for a putatively gullible public. It will require working with democracy, not undermining it by circumventing legislatures to achieve desired climate outcomes by judicial and regulatory fiat. And above all, it will require humility. Climate scientists and their champions in activism and politics have for too long demanded the public accept their baroque warming scenarios and their arbitrary atmospheric targets as the guideposts of modern society.
A successful climate science will ultimately be one that informs society, not one that bounds it.