On the current high-emissions scenario (RCP 8.5), most of the tropical zone experiences many months each year of deadly heat, beyond the capacity of humans to survive outdoors. Source: Global risk of deadly heat.
Part 2 of 2 | Read Part 1.
by David Spratt
Intergovernmental Panel on Climate Change
The Intergovernmental Panel on Climate Change (IPCC) produces science synthesis reports for the primary purpose of informing policymaking, specifically that of the UNFCCC. This may be termed “regulatory science” (as opposed to “research science”), which Sheila Jasanoff describes as one that “straddles the dividing line between science and policy” (9) as scientists and regulators try to provide answers to policy-relevant questions. In this engagement between science and politics, say Kate Dooley and co-authors, “science is seen neither as an objective truth, nor as only driven by social interests, but as being co-produced through the interaction of natural and social orders”.
This co-production has resulted in a number of characteristic features in the IPCC’s work, and was evident in the way the organisation was formed in 1988. There was tension between the desire of UN member states for political control of the panel’s outcomes, and the need to have credible scientists in charge of an expert process of synthesising and reporting on climate science. Some countries, including the United States, were concerned that “the ozone negotiations had allowed experts to get too far ahead of political realities; they wanted to retain closer control over the production of scientific knowledge by appointing the Panel’s members”.
The compromise was that scientists would write the long synthesis reports, but that the shorter Summary for Policymakers would be subject to line-by-line veto by diplomats at plenary sessions. Historically, this method has worked to substantially water down the scientific findings. As well, government representatives have final authority over all actions, including the publication of all reports, and the appointment of the lead scientific authors for all reports. The latter has contributed to the reticent nature of much of the IPCC’s key work.
As early as the IPCC’s first report, in 1990, the US, Saudi and Russian delegations played a part in “watering down the sense of the alarm in the wording, beefing up the aura of uncertainty”, according to Jeremy Leggett. Martin Parry of the UK Met Office, co-chair of an IPCC working group at the time, exposed the arguments between scientists and political officials over the 2007 Summary for Policymakers: “Governments don’t like numbers, so some numbers were brushed out of it”.
Like the UNFCCC, the IPCC process suffers from all the dangers of consensus building in a complex arena. IPCC reports, of necessity, do not always contain the latest available information, and consensus building can lead to “least drama”, lowest-common-denominator outcomes, which overlook critical issues. This is particularly the case with the “fat tails” of probability distributions—that is, the high-impact but lower-probability events for which scientific knowledge is more limited.
Climate-model limitations
From the beginning the IPCC derived its understanding of climate from general circulation models (GCMs) to the exclusion of other sources of knowledge. This had far-reaching consequences, including the IPCC’s reticence on key issues and its incapacity to handle risk. Dooley has noted that for more than two decades researchers:
questioned the policy-usefulness of GCMs because of their limitations in dealing with uncertainty. They argued that the dominance of models—widely perceived as the “best science” available for climate policy input—leads to a technocratic policy orientation, which tends to obscure political choices that deserve wider debate. There is now an established body of literature critiquing the implications of this expert-led modelling approach to climate policy.

Dooley and co-authors say that research has now unmasked how, through this expert-led modelling approach to climate policy, politics gets built into science, enabling a technocratic and global framing of climate change, devoid of people and impacts.
There is a consistent pattern in the IPCC of presenting detailed, quantified (numerical) complex-modelling results, but then briefly noting more severe possibilities—such as feedbacks that the models do not account for—in a descriptive, non-quantified form. Sea levels, polar ice sheets and some carbon-cycle feedbacks are three examples. Because policymakers and the media are often drawn to headline numbers, this approach results in less attention being given to the most devastating, high-end, non-linear and difficult-to-quantify outcomes.
Twelve years ago, Oppenheimer and co-authors pointed out that consensus around numerical results can result in an understatement of the risks:
The emphasis on consensus in IPCC reports has put the spotlight on expected outcomes, which then become anchored via numerical estimates in the minds of policymakers…it is now equally important that policymakers understand the more extreme possibilities that consensus may exclude or downplay…given the anchoring that inevitably occurs around numerical values, the basis for quantitative uncertainty estimates provided must be broadened.

Oppenheimer and co-authors said that comparable weight should be given to evidence from Earth’s climate history, to more observationally based, less complex semi-empirical models, and to theoretical evidence of poorly understood phenomena. And they urged the IPCC to fully include “judgments from expert elicitations”—that is, what leading scientific figures think is going on when there is not yet sufficient, or sufficiently consistent, evidence to pass the peer-review process. This has not been done.
Getting risk wrong
We have already seen the consequences of scientific reticence and consensus policy-making in the underestimation of risk. IPCC reports have underplayed high-end possibilities and failed to assess risks in a balanced manner. Stern said of the IPCC’s Fifth Assessment Report: “Essentially it reported on a body of literature that had systematically and grossly underestimated the risks [and costs] of unmanaged climate change”.
This is a particular concern with climate-system tipping points—passing critical thresholds that result in step changes in the climate system—such as the polar ice sheets (and hence sea levels), and permafrost and other carbon stores, where the impacts of global warming are non-linear and difficult to model with current scientific knowledge.
Integral to this approach is the issue of lower-probability, high-impact fat-tail risks: outcomes whose likelihood is greater than standard statistical assumptions (such as a normal distribution) would suggest. The fat-tail risks to humanity, which the tipping points represent, justify strong precautionary management. If climate policymaking is to be soundly based, a reframing of scientific research within an existential risk-management framework is now urgently required.
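To make the statistical point concrete, here is a minimal sketch (not drawn from the article; the normal and Student-t distributions and the threshold are illustrative assumptions only) comparing how much probability a thin-tailed and a fat-tailed distribution assign to the same extreme outcome:

```python
# Illustrative sketch only: how much probability sits in the extreme tail of a
# thin-tailed (normal) distribution versus a fat-tailed (Student-t) one.
# The distributions and the threshold are arbitrary assumptions, chosen purely
# to make the contrast visible.
from scipy import stats

threshold = 4.0  # an "extreme" outcome, far above the centre of either distribution

# Thin-tailed assumption: a standard normal distribution
p_thin = stats.norm.sf(threshold)

# Fat-tailed assumption: a Student-t distribution with 3 degrees of freedom
p_fat = stats.t.sf(threshold, df=3)

print(f"P(outcome > {threshold}) under thin tails: {p_thin:.2e}")  # ~3e-05
print(f"P(outcome > {threshold}) under fat tails:  {p_fat:.2e}")   # ~1.4e-02
print(f"The fat-tailed estimate is roughly {p_fat / p_thin:.0f} times larger")
```

Under these illustrative assumptions the fat-tailed distribution makes the extreme outcome hundreds of times more likely, which is why averaging towards central estimates can badly understate the risk.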
But this is not even on the radar of the UN’s climate bodies, which have given no attention to fat-tail risk analysis, and whose method of synthesising a wide range of research with divergent results emphasises the more frequent findings, which tend to lie towards the middle of the range.
A prudent risk-management approach means a tough and objective look at the real risks to which we are exposed, especially those high-end events whose consequences may be damaging beyond quantification, and which human civilisation as we know it would be lucky to survive. It is important to understand the potential of, and to plan for, the worst that can happen (and, hopefully, be pleasantly surprised if it doesn’t). Focusing on the most likely outcomes, and ignoring the more extreme possibilities, may result in an unexpected catastrophic event that we could, and should, have seen coming.
An upside to non-linearity?
The problem with non-linear changes in the climate system is that, almost by definition and given the current state of development of climate models, they are difficult to forecast. While Earth’s climate history can give valuable insights into our near future, the IPCC has downplayed, often to the point of ignoring, these important research findings on the range of possible climate futures.
This field of paleoclimatology has revealed that, in the longer run, each 1°C of warming will result in 10 to 20 metres of sea-level rise, and that the current level of greenhouse gases is sufficient to produce warming that would likely end human civilisation as we know it: the destruction of coastal cities and settlements, the inundation of the world’s food-growing river deltas, heat sufficient to make most of the world’s tropical zone uninhabitable (including most of South, Southeast and East Asia), and the breakdown of order within and between nations.
We have physical, non-linear climate-system disruptions coming very soon. But there are also social, economic and psychological tipping points that could trigger a much more rapid response to climate change. The rapid rise in popular support in the United States for the Green New Deal championed by the newly elected congresswoman from the Bronx, Alexandria Ocasio-Cortez, may be one such moment.
Big changes, says Schellnhuber, will require us to “identify a portfolio of options…disruptive innovations, self-amplifying innovations”. He says that these cannot be predicted precisely, so we need to look into whether there are high non-linear potentials in a whole range of emerging technologies.
Such innovation is an important element for any whole-of-society, government-driven approach, such as the Green New Deal and its proposal for a wartime-level effort to build a zero-emissions economy, with an emphasis on jobs and justice.
Schellnhuber says the problem is the conventional economist who “will want to be efficient, but efficiency is the enemy of innovation”. Rapid innovation in a crisis, when time is short, means parallel innovation: looking for many answers at the same time. Some will fail and capital will be wasted, but the successes are the “change we need”.
He says we must now muster climate-change venture capital at a global scale because “we cannot efficiently get ourselves out of this predicament”. This means:
We have to save the world but we have to save it in a muddled way, in a chaotic way, and also in a costly way. That is the bottom line, if you want to do it in an [economically] optimal way, you will fail.

Such thinking would require a revolution in the neoliberal norms of the UNFCCC. Perhaps that is not possible and the body can no longer serve a useful role. Developing mechanisms that are more fit for purpose is now an urgent task.
This article was first published by Arena Magazine.