Foreword by Michel Quirke: In these Forum Reports, I will be doing my best to inform readers on who said what in Climate Change National Forum and why it matters. The CCNF scientists have published a lot of material in these last 7 months, so it might take around four Forum Reports to get caught up, but once we are up to date, I plan on publishing a new Forum Report every two-to-four weeks. OK, let’s get started with a recap on all that we’ve learned thus far in Climate Change National Forum!
Climate Change National Forum Begins!
Climate Change National Forum officially started on January 1, 2014. The initial CCNF Scientist Community at the time (it has since tripled in size) included Drs. John Nielsen-Gammon, Barry Lefer, Kerry Emanuel, Mauri Pelto, Jim Bouldin, and Bart Verheggen, and Ms. Lulu Liu.
POST 01: WHAT IS RESPONSIBLE SCIENCE COMMUNICATION? MS. LULU LIU SHARES HER INSIGHTS AND SETS A REFRESHING TONE FOR FUTURE DIALOGUE
Ms. Lulu Liu kicked off the national dialogue with a starkly honest and at times very personal contemplation on “the science communication problem” in her debut piece, Mind the Gap — Thoughts on the Scientific Communication Problem. Ms. Liu is a second-year PhD student at the Harvard School of Engineering and Applied Sciences. In the piece, she shares a number of deep insights drawn from her own personal journey learning science (including her struggles early on), as well as from covering a controversial scientific issue as an AAAS Mass Media Fellow working at a California newspaper in the summer of 2010. This was at a time when “black oil was pouring out of the Macondo well in the Gulf of Mexico following the Deepwater Horizon explosion.” As the 24/7 news coverage of oil gushing from the blown-out Macondo well abated somewhat, a public debate had sparked over whether or not oiled birds could be saved. Ms. Liu was tasked with finding the answer. “Opinions abounded,” she wrote, and a cursory investigation showed no consensus at all.
Scientist A said no, while Scientist B said yes. Somehow both sides had studies that seemed to support them, yet none of it got me any closer to understanding the truth. How, I wondered, can rational people disagree about facts?
The story took me a week to write. The first time I called the professor [professing that there was evidence the birds could be saved], he pointed me to studies, regurgitated the same old points, and even had some colorful language for the other scientists. But a few days later, when I got him on the phone again to say what I’d learned about why the studies disagreed, he listened: I pointed out the different species of birds the different researchers sampled, the decades that separated them, their disparate methods of measurement, and more. After a pause, he said that yes, the rescued birds used to die–no matter what you did to help them–within days or weeks. Sometimes they would all die. I learned that survival depended on a bird’s species, size, and habits; how much oil it had ingested; whether its home was destroyed. I learned that in recent years the scientist’s university had poured millions into improving rescue methods, and there was evidence that this had paid off in decreased mortality rates. Now he had to defend his work, he said, as he feared a public outcry would spell the end of yet another thing that did good in the world. When I called the other scientists with what I learned, they agreed. It was not their intention to halt these efforts, they said; they had simply felt that the effectiveness […].
Looking back, I understood why the scientists on both sides had exaggerated, bent the truth a little. They had come to regard the media with suspicion, as a threat. I also understood why journalists were happy to instigate. There was immense public interest. But as I watched readers hurl words like “heartless” and “wasteful” at each other in the comments, I realized that no understanding had been passed on.
That summer I learned that language matters. I learned that context matters, that a truthful narrative supported by facts is compelling on its own. And I learned that our inability to tell the story of science—of its goodness, its vision, its relentless truth-seeking—is eroding the public’s trust. Science is not quick or glamorous, and we don’t need to make it look that way. It’s the piecemeal assembly of reality, fact by painstaking fact, and that is beautiful enough. Every time the incremental is reported as revolutionary, a disservice is done. To sensationalize scientific progress is to misrepresent it. I think good judgment, and the will to exercise it, is the best quality a science writer can have. Because in a disagreement, I can trust such a writer to stand not squarely in the middle but as near as possible to the truth. And in science, there are no two versions of it.
I’d like to thank Ms. Liu for courageously contributing the Forum’s first piece and for establishing a refreshingly honest, clear, and inquisitive tone that has (for the most part) continued to this day. I also found her dogged search for the truth inspiring! [Side note: Are all AAAS Mass Media Fellows this tenacious? I actually looked into getting an AAAS Mass Media Fellow assigned to CCNF this summer, but then discovered that CCNF would be expected to pay the AAAS $10,000. So much for that!]
* * * *
POST 02: A NOTE ON “EXTREME WEATHER” IN THE CONTEXT OF GLOBAL WARMING FROM DR. JOHN NIELSEN-GAMMON
CCNF co-founder and board member Dr. John Nielsen-Gammon contributed the Forum’s second post, titled The Basics of Extreme Weather and Global Warming. Dr. Nielsen-Gammon is the Regents Professor of Atmospheric Sciences at Texas A&M University and the Texas State Climatologist (appointed by then-Gov. George W. Bush in 2000). In the piece, Dr. Nielsen-Gammon homes in on the meaning of “extreme weather.”
In order to lay the groundwork for future discussions of extreme weather events, I’d like to discuss the basic scientific expectations for changes in extreme events. What is extreme weather? What ought to happen with global warming?
WHAT IS EXTREME WEATHER?
Much confusion arises from the fact that “extreme weather” has two different meanings.
One meaning is that of “statistically extreme weather”, weather that happens extremely rarely. An example would be the record high temperature for a particular day and place. By definition, it has only happened once (except for ties) in the history of weather records, and so is extremely rare. For this definition to make sense, some reference period such as the “period of record” must be chosen.
The other meaning is that of “extremely dangerous weather”. An example would be a tornado. Though there are hundreds of tornadoes each year, each one is an extreme weather event, whether or not it happens to cause damage. Other examples are tropical cyclones, floods, and droughts. Sometimes the term “weather” is defined loosely enough to include wildfires.
There is overlap between these two types of extreme weather. For example, a record hot day in the summer is statistically extreme, and it can also be extremely dangerous. A record hot day in the winter is usually not so dangerous, unless it causes dangerous snow or ice melt.
WHAT OUGHT TO HAPPEN?
For statistically extreme weather, the answer, at least on the surface, is straightforward. If the average value of some weather variable (temperature, wind speed, etc.) has changed compared to a reference period, then one extreme ought to become more common while the other extreme becomes less common. Thus we expect (and see) more maximum temperature records and fewer minimum temperature records.
[…] Consider a location with a 100-year record of a stable climate. On average, a new daily high temperature record will be set 3.65 times a year and a new daily low temperature record will be set 3.65 times a year, for an average of 7.3 new records per year. Now suppose the temperature suddenly goes up five degrees. The next year, there might be 40 maximum temperature records, but probably zero minimum temperature records. One kind of record has become more frequent, while the other has become less frequent. But the total number of annual records has increased by 32.7!
It is also conceivable that the variability of weather might change. Maybe temperatures will become more erratic, or maybe less erratic, for example. Such a change would not be an obvious consequence of global warming, while an increase in the frequency of maximum temperature records by itself is an obvious consequence of global warming.
Changes in the frequency of extremely dangerous weather must be considered on a case by case basis. Very few changes are obvious. This is because most dangerous weather arises from various types of localized dynamic instability in the atmosphere, and it’s rarely clear whether the generation of atmospheric instability ought to speed up or slow down in a warming atmosphere, let alone whether other environmental changes will make that instability easier or harder to respond to.
A common oversimplification goes that “a warmer atmosphere has more energy, and therefore more energetic weather.” Scientists sixty years ago showed that this was wrong. Sure, a warmer atmosphere has more energy, but most of that energy can’t do anything but radiate away to space. It is the variation of energy (temperature) from place to place that drives the planetary winds. And in the Northern Hemisphere, as the difference in temperature between the equator and pole decreases, the circulation ought to become less energetic.
Perhaps the most obvious tendency for extremely dangerous weather is that the strength of the heaviest downpours ought to increase. This is because rainfall intensity is a product of the rate of upward motion of the air and the amount of water vapor it contains. Since warmer air can contain more water vapor, about 7% more for each degree Celsius of warming, a given updraft will tend to produce heavier rain. This effect might even be felt at the storm scale, more than making up for the decline of temperature variations.
Note that this is different from saying that there will be more rain on average. The general consensus is that there will indeed be more rain on average, but this conclusion is not obvious from simple principles. Sure there will be more water vapor in the air, but what if the air rises more slowly? Might that not compensate for the increased water vapor?
Also, “more rain on average” doesn’t mean “more rain on average everywhere”, it just means that the global average amount of rainfall (and snowfall) is expected to increase. It doesn’t even necessarily mean that more than 50% of the Earth’s surface will see increased rainfall. Along similar lines, increased drought frequency (another expectation) doesn’t mean that drought frequency will increase everywhere, nor even that drought frequency will increase over a majority of the globe.
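[A quick sanity check of my own on that “about 7% more for each degree Celsius” figure: it follows from the Clausius–Clapeyron relation, which I’ve approximated below with the standard Bolton (1980) fit for saturation vapor pressure. The fit is a textbook formula, not something taken from Dr. Nielsen-Gammon’s post:]

```python
import math

def e_sat(t_celsius):
    """Saturation vapor pressure in hPa, Bolton (1980) fit to Clausius-Clapeyron."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

# How much more water vapor can saturated air hold if it warms by 1 degree C?
for t in (0, 15, 25):
    pct = 100 * (e_sat(t + 1) / e_sat(t) - 1)
    print(t, round(pct, 1))  # roughly 6 to 7% per degree, depending on temperature
```

So the “7%” is really a 6–7.5% range that depends on the starting temperature, which is why scientists usually say “about 7%.”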
See Dr. Nielsen-Gammon’s full essay here. There’s also a good follow-on discussion by the other scientists in the Scientists’ Comment Thread at the bottom of the post.
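His record-counting arithmetic is also easy to check with a little Monte Carlo sketch of my own (the 4 °C day-to-day spread in daily highs is an assumed round number, not something from his essay):

```python
import numpy as np

# Monte Carlo sketch of Dr. Nielsen-Gammon's record arithmetic. The 4-degree
# day-to-day standard deviation of daily highs is my own assumed round number.
rng = np.random.default_rng(42)
n_years, n_days, sigma = 100, 365, 4.0

# A 100-year stationary record of daily high temperature anomalies, in degrees C
history = rng.normal(0.0, sigma, size=(n_years, n_days))
record_high = history.max(axis=0)
record_low = history.min(axis=0)

def new_records(mean_shift):
    """Count new daily max and min records set in one additional year of weather."""
    year = rng.normal(mean_shift, sigma, size=n_days)
    return int((year > record_high).sum()), int((year < record_low).sum())

hi0, lo0 = new_records(0.0)  # stable climate: expect ~3.65 of each, on average
hi5, lo5 = new_records(5.0)  # sudden 5-degree warming
print(hi0, lo0, hi5, lo5)    # a few, a few, a few dozen, and essentially zero
```

The stable-climate case lands near the essay’s 3.65-per-year figure, and after the 5-degree jump the maximum records multiply while the minimum records all but vanish, just as described.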
* * * *
FACT CHECKER HIGHLIGHT:
This early post in the “Climate Change in the Media: Commentary and fact checking by the CCNF Scientist Community” section (hereafter called the “fact checker/commentary section” or “fact checker” for short) presented this graphic and text published by the National Research Council in Climate Change: Lines of Evidence (National Research Council of the National Academies, Jun 27, 2012) for the scientists to fact check.
The material claims that:
The greenhouse effect is a natural phenomenon that is essential to keeping the Earth’s surface warm. Like a greenhouse window, greenhouse gases allow sunlight to enter and then prevent heat from leaving the atmosphere. These gases include carbon dioxide (CO2), methane (CH4), nitrous oxide (N2O), and water vapor. Human activities — especially burning fossil fuels — are increasing the concentrations of many of these gases, amplifying the natural greenhouse effect.
Comments by the CCNF Scientist Community:
Dr. John Nielsen-Gammon: “Close enough for government work, but not sufficiently correct to illustrate important related physical mechanisms such as cloud feedbacks. Scientists may prefer a more accurate description of the process, such as my Climate Abyss entry The Best Ever Description of the Atmospheric Greenhouse Gas Effect.”
[Here’s an excerpt from Dr. Nielsen-Gammon’s entry:]
The so-called atmospheric greenhouse effect (or Tyndall gas effect) can be most directly quantified as the difference between the globally- and time-averaged rate of electromagnetic radiation emitted by the surface of the Earth (~396 W/m2) and the rate by which radiation emitted from the combined Earth/atmosphere system escapes to space (~239 W/m2). These two numbers are different because certain constituents in the atmosphere (mostly water vapor, CO2, clouds, and other Tyndall gases) intercept most of the radiation emitted by the Earth’s surface.
The mere interception of radiation emitted by the Earth’s surface is not the whole story though, because anything that can absorb radiation at a particular wavelength can also emit radiation just as easily at that same wavelength (Kirchhoff). Also, the intensity of emitted radiation is proportional to the fourth power of temperature (Stefan-Boltzmann) (1). So if the atmosphere everywhere were somehow required to have the same temperature as the underlying surface of the Earth(2), it would be emitting to space exactly as much radiation as it intercepted from the Earth’s surface and the greenhouse effect would be zero. Instead, we observe(3) that the atmosphere is mostly colder than the Earth’s surface, by several tens of Kelvins, and this accounts for the relatively lower intensity of radiation escaping the top of the atmosphere compared to radiation entering the bottom of the atmosphere from below.
The total effect, including feedbacks, is not known very accurately and thus is presently a subject of intense research. […] Based on multiple lines of evidence, including past climate changes including glacial-interglacial cycles, recent large climate system perturbations due to volcanic eruptions, and simulations of the climate system with computer models, the total change in average global temperature due to an energy imbalance of 3.7 W/m2 is somewhere between 1.5 K and 4.5 K, with the most common estimates being around 2.5-3.0 K.
Dr. Bart Verheggen: “Eli Rabett has a relatively easy to understand explanation of the greenhouse effect. He refers to a more elaborate explanation by Chris Colose. In the documentary film Thin Ice, Ray Pierrehumbert explains the greenhouse effect in relatively simple terms.”
[Here’s an excerpt from Eli Rabett’s explanation, as recommended by Dr. Verheggen:]
The Earth absorbs energy from the sun and reaches a steady temperature when it radiates the same average amount of energy to space. […] Each part of the Earth’s surface emits heat in the form of infrared (IR) radiation. The peak of this emission is right at the wavelength where CO2 absorbs strongly. While the proportion of CO2 in the atmosphere is small, 380 parts per million or 0.038% [this post was published in 2010, it’s now around 400 ppm or 0.040%], this is still a large number of molecules, large enough that near the surface, at wavelengths where CO2 absorbs, the average distance light will travel before being captured is a few meters (a couple of yards).
Greenhouse gases, as well as absorbing IR radiation, emit it. In just the same way the distance that the emitted radiation can travel is short near the surface, but increases as one climbs through the atmosphere because density, pressure and temperature decrease as we climb. […]
If we increase the proportion of CO2 in the atmosphere, the level at which energy can be radiated to space rises also, but since this higher level is colder and the pressure and density are lower, the doorway becomes narrower, and the surface has to warm more in order to shove the same amount of energy out and restore the balance with the incoming energy carried by the sunlight.
* * * *
POST 03: DR. MAURI PELTO RECOUNTS A SURVEY TRIP TO MILK LAKE GLACIER, NOW DISAPPEARED, AND SEES A TREND ACROSS THE NATION
The Forum’s third piece, titled A Glacier That Did Not Survive, was contributed by Dr. Mauri Pelto–a glaciologist, professor of environmental science at Nichols College, and Director of the North Cascade Glacier Climate Project. In the debut piece, Dr. Pelto takes us back in time to the summer of 1992, when we join him on a hike to Milk Lake Glacier in the Glacier Peak Wilderness in Washington state. The glacier had long covered a substantial part of a large basin of mountains (as seen in the 1962 USGS aerial image he provided). Dr. Pelto’s first visit to Milk Lake Glacier was in 1988. “[At that time] the glacier had begun to break up in the lake basin, but the lake was still largely filled with icebergs and fringing glacier.”
Dr. Pelto and his survey team were eager to check the glacier’s condition and identify any changes over the past four years, but a storm kept them confined to their tents near the access point to the ridge. When the storm finally relented, and with daylight waning, his team made the dash over the ridge to take measurements. What they saw shocked them. “To our surprise, the glacier pictured on the map was nearly gone… [T]he lake was ice free but the slopes on the west side of the lake still had glacier ice on them.” By 2009, he notes, “not even a small remnant of Milk Lake Glacier [would] remain.”
In the last paragraph, he identifies a culprit as well as an “unfortunate” trend:
What dooms a glacier is the lack of an accumulation zone (Pelto, 2010). This is the area of a glacier where snow persists year round, it is like the income for your bank account. Without a net income anywhere on the glacier, there are areas with net ablation (melting), hence these expenditures pay down the glacier ice bank account, until the glacier is lost. This is a disequilibrium response to new climate conditions. Unfortunately I have seen the same story play out on many glaciers in the United States.
* * * *
POST 04: DR. BART VERHEGGEN SPEAKS ON THE NEED FOR SCIENTISTS TO BE UP FRONT ABOUT THEIR VALUES WHEN COMMUNICATING CLIMATE SCIENCE
In the Forum’s fourth piece, Dr. Bart Verheggen—a lecturer at Amsterdam University College, founder of the blog Our Changing Climate, and CCNF’s first international scientist-member—explores the clash between speaking “just science” and communicating your “values” (which go beyond science) in science communication/science outreach settings. In the post, titled Schmidt & Curry on Science Advocacy, he emphasizes that scientists should be up front about their values while engaging in science outreach projects, and shares his thoughts on a presentation (see video) by Dr. Gavin Schmidt (Director of the NASA Goddard Institute for Space Studies) at a fairly recent AGU meeting:
[Dr. Schmidt] argued that it’s best to be explicit about one’s values and clearly distinguish when one is talking values (“ought”) and when one is talking science (“is”). I entirely agree. I would add that it’s important to distinguish recommending a generic (e.g. mitigation) vs a specific (e.g. CCS) course of action, especially when the latter is not one’s area of expertise.
Dr. Verheggen then homes in on one key statement made by Dr. Thomas Stocker in an IPCC AR5 video: “Continued greenhouse gas emissions cause further climate change and constitute a multicentury commitment in the future. Therefore we conclude that limiting climate change requires substantial and sustained reductions in greenhouse gas emissions.” Dr. Verheggen asks the reader whether this is a normative statement (“ought”) or a factual statement (“is”). He notes that Dr. Schmidt thought it was a factual statement and that Dr. Judith Curry, who was also at the AGU event and contributed some comments at the end of the presentation, thought it was a normative statement. He quotes Dr. Curry as mentioning that “there is a missing element in this argument [(implied in Dr. Stocker’s statement)] that warming is ‘bad’, which is a value judgment and has nothing to do with science.”
Dr. Verheggen disagreed, concluding that it was in fact a factual statement, since the goal of limiting climate change was made explicit. Thus, it was the equivalent of saying: “If this is the goal, then that is what needs to be done to achieve it,” he writes. He then goes on to criticize a list of what Dr. Curry called “examples of potential hidden values”.
Dr. Curry claimed these hidden influences (or at least their potential to influence a scientist’s position) are “why the public distrusts scientists as advocates.” [Note: Dr. Judith Curry is a professor and chair of the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology and founder of the blog Climate Etc. She is also now a member of CCNF and has become a valued contributor to this national dialogue.] Dr. Verheggen writes, “It’s not at all clear that these would all go in the same direction of bias in favor of the mainstream (as Curry seems to implicitly assume).” Here’s the list of Dr. Curry’s “potential hidden values” (in bold) with Dr. Verheggen’s comments (in italics), as originally published by him:
- personal career advancement: Unclear in which direction this would most likely go.
- research funding: idem, though this could cause a tendency to increase the apparent magnitude of uncertainties.
- the value in terms of professional recognition (e.g. awards from professional societies) that supporting the scientific consensus can provide (recognizing the ostracism that can result from straying): No bigger reward for a scientist than to prove the scientific consensus wrong.
- media attention: This goes in the direction of providing relatively more media attention to contrarian voices, Judith Curry herself being a good example (assuming that she wasn’t as prevalent in the media before her U-turn away from mainstream science). This got confirmed in the large survey amongst climate scientists that I conducted last year (not yet published).
- influence within the scientific community: This hinges on using solid arguments, so usually provides the correct incentive.
- influence at the power tables in terms of public policy: Like with media attention, extreme voices seem to have disproportionate influence. Look at the regular line-up for US senate hearings for example. If you crave media attention and political influence, being loudly contrarian is a sure way to achieve that. In the Netherlands the same tendency is apparent.
- broader political objectives that support any/all of the above: This goes more likely in the direction of downplaying rather than overplaying AGW I would argue.
* * * *
POST 05: DR. VERHEGGEN WEIGHS IN ON THE PUBLIC DEBATE ON CLIMATE CHANGE
In what is probably the most ambitious piece yet (in terms of a scope that goes beyond science), Dr. Verheggen explores the difference between the public debate on climate change and the scientific one in The Public Debate on Climate Change and, in an early demonstration of the range of discussion we hope to achieve in the future, attempts to articulate the social and political complexities of this “super wicked” issue. We’re starting from the top with this one:
The physical complexity of global climate is evident. However, resolving the issue of global warming is also socially and politically complex. It is not an isolated problem with an easy solution, since almost every human action requires energy and land use (and thus results in climate altering emissions). In contrast, it is an all pervasive issue: Both the effects of unlimited global warming, and the proposed solutions to mitigate these effects, are potentially far-reaching. Because of the strong societal impacts and the inherent uncertainty surrounding them, people’s perceptions of these effects differ wildly. These perceptions are influenced by their ethical, ideological, cultural and political beliefs, by the way they balance different risks and by their belief of what constitutes well-being. That is ‘why we disagree about climate change’, as Mike Hulme explains in his book with the same title.
This may explain why the public debate about global warming is so vastly different than the scientific debate: Whereas the former has its roots in a clash of worldviews, the latter is about natural science, things we can measure and/or theorize based on physical principles. In the public debate, opponents may debate whether emission reduction schemes would wreck the economy, or whether unlimited climate change would destroy the support systems of society. In the scientific debate, opponents may debate the extent to which mechanical instability could enhance the melting of polar ice sheets.
Besides the differences in problem perception and strong interdependencies that characterize the global warming debate, there are other confounding issues, which together have earned global warming the label ‘super wicked problem’: The climate system responds very sluggish to changes in emissions, due to inertia in the carbon cycle as well as the thermal inertia provided by the oceans. Therefore, preventive emission reduction measures (if desired) would need to be taken before the full extent of the consequences becomes apparent. This means that the longer we wait, the harder it will be to address the consequences of global warming, since by then we have committed ourselves to more warming. As such, those who caused the problem are in the best position to solve it, but since the full consequences will not materialize until much later, they have the least incentive to do so.
To illustrate what others have described as “locked in warming” (future warming that is inevitable), Dr. Verheggen provides an original metaphor that I found helpful to my understanding: “The inertia of the climate system could be compared to that of a supertanker: If we want to change its course, it’s important to start steering the wheel in the desired direction in time.”
[Question for Dr. Verheggen: Now, we still have a lot to cover regarding the science of climate change, but if CO2-induced global warming is very much a bad thing for our progeny and the biodiversity of their world as you seem to imply, then am I right that you are telling us that if we (humans) wait for things to get bad before we do anything, by then it will be too late?
That would make this a “super-wicked problem” indeed. Though presumably that first generation that gets screwed might have the ability to make it not so bad on their progeny, right?
I am also having a little difficulty understanding why there’s such a delay in the warming response. I get that the extra blanket of CO2 that we’ve thrown around the earth in the geologic blink of an eye traps heat, and yes, as Dr. Scott Denning deadpanned in his last post, “when we add heat to things, they change their temperatures.” But shouldn’t this happen fairly instantly? Why would it take decades or even centuries for the temperature to reach “equilibrium warming” (the new stable temperature for that concentration of CO2, as I understand it)?]
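[While we await an answer, here’s my own back-of-the-envelope attempt at that last question: a toy “one-box” energy balance model, C dT/dt = F − λT, with round-number parameters of my own choosing (3 °C per doubling, and a ~100 m ocean mixed layer). The ocean’s heat capacity C is what sets the lag:]

```python
import math

# Toy one-box energy balance model (my own illustration, with assumed numbers):
#   C dT/dt = F - lam * T
# The ocean's heat capacity C delays the approach to equilibrium warming.
LAM = 3.7 / 3.0           # feedback parameter, W m^-2 K^-1 (assumes 3 K per doubling)
C = 4.2e8                 # heat capacity of ~100 m of ocean, J m^-2 K^-1
F = 3.7                   # constant forcing from a CO2 doubling, W m^-2
SECONDS_PER_YEAR = 3.156e7

tau_years = C / LAM / SECONDS_PER_YEAR  # e-folding lag: roughly a decade
t_eq = F / LAM                          # equilibrium warming: 3.0 K

def warming_after(years):
    """Analytic solution T(t) = T_eq * (1 - exp(-t / tau))."""
    return t_eq * (1.0 - math.exp(-years / tau_years))

print(round(tau_years, 1))          # about a decade for the mixed layer alone
print(round(warming_after(10), 2))  # well short of the 3 K equilibrium after 10 years
print(round(warming_after(100), 2)) # essentially the full 3 K after a century
```

The real climate system couples many such layers, and heat slowly mixing down into the deep ocean is what stretches that tail from decades out to centuries.]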
Dr. Verheggen then touches on the dynamics of the public debate:
Advocates for stringent emission reductions may claim that transforming the global energy system is easy (whereas the mainstream position is that it is a tremendous challenge). Advocates for continuing emissions may claim that global warming will be advantageous for humanity (whereas the mainstream position is that the effects will be, on balance, negative). Motivated reasoning may play a role in this dynamic: When the solutions presented are perceived to conflict with one’s worldview, one is less likely to accept validity of the problem.
Echoing a point first made by Ms. Lulu Liu, Dr. Verheggen adds:
However, this does not mean the truth is in the middle between these (or any) two extreme positions. Reality is not influenced by what people think of it. It is quite possible, likely even, that people with certain predispositions are more prone to resisting a specific reality than others. Different ideologies and worldviews each have different blind spots as to which parts of reality are hard to accept.
[Recall that in Mind the Gap, Ms. Lulu Liu emphasized the duty of a science writer to “stand not squarely in the middle but as near as possible to the truth, [because] in science, there are no two versions of it.”]
Dr. Verheggen then touches on “Climategate,” mistakes in the IPCC WGII report, “hate speech,” and the general polarization and politicization of climate change.
Some people started to vocally oppose the scientific consensus based on a variety of reasons (e.g. perceived errors or weak spots in the science, societal consequences which conflict with certain worldviews, psychological predisposition, cultural identity, vested interests, etc.). It also fits with a wider trend in society in which expert authority is not accepted as easily as it used to be.
Taken to the extreme, some people argue that the whole field of climate science is fraudulent or heavily biased. Some scientists, taken aback by the vocal display of distrust and in some cases even disgust, responded in ways that fueled rather than soothed the existing suspicions. This spiral of mutual animosity led to an increased polarization of views between proponents and opponents of the scientific consensus.
A climax in this polarization was reached after a large amount of emails from the Climate Research Unit (U.K.) were spread via the internet (dubbed “climategate” by those who are suspicious of the science). The outcry that followed was further exacerbated by some mistakes that were revealed in the IPCC report of working group II (on climate impacts). Even though these events are now a few years behind us, the polarization is still alive and kicking, and the internet is rife with what could be regarded as hate-speech towards “the other camp”.
So what is the way forward? The takeaway for the CCNF Scientist Community and our readers as we wade into this “super wicked” issue? Dr. Verheggen answers with the following observations and advice:
Because climate science seems to be taking centre stage in the public debate on global warming, it is important to be clear about what is known with high confidence and what is more uncertain. The first thing to note here is that science never proves anything (in the sense that mathematics does), because it is based on observations and inferences. However, a scientific conclusion can be strongly or weakly supported by the evidence, and such distinctions are important to make. The existing uncertainty in several aspects of climate science provides different actors in the public debate with an excuse to focus on unlikely values on their preferred side of the probability spectrum. Such an argument implicitly ignores that uncertainty usually cuts both ways (though not necessarily in a symmetrical manner).
[The “probability spectrum” that Dr. Verheggen is speaking of (also called the “climate change probability distribution”) has since become a regular topic of discussion in the Forum. This is the range of possible ‘climate sensitivity’ scenarios and their likelihood of occurring, often expressed through a graph like the one below, which Dr. Kerry Emanuel used in his recent post, Tail Risk vs. Alarmism:
Climate sensitivity is the ratio between the “response” (a change in temperature or other variables) and the “forcing” (a change in radiation) that caused it. At the risk of “making science simpler instead of simple” (ole Einstein said it should be the other way around), climate sensitivity is basically the answer to the million-dollar question of ‘How hot is it going to get?!?’ or, to be a little more accurate, ‘How much warming will X amount of CO2 ultimately cause?’
In a recent post titled Cause & Effect, Dr. Scott Denning showed us how he calculated climate sensitivity to be 0.8 °C per W m-2. He does this through a “cause and effect” framework that draws on measurements and analyses of major events like the 1991 Mount Pinatubo eruption, the Medieval Warm Period to Little Ice Age transition, and the end of the last BIG ice age, as well as stuff like the radiative properties of rocks on the moon. He arrives at this number without the use of climate models. His measurement translates to a 3 °C warming per doubling of CO2 compared to preindustrial times. This “per doubling of CO2” is the usual way scientists communicate climate sensitivity. (Also, just FYI, with current global atmospheric CO2 concentrations at 400 ppm, up from a preindustrial level of about 280 ppm, we are already roughly halfway to this “doubling” point in forcing terms and will far exceed it under a business-as-usual scenario in the coming decades.)
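For the curious, here is a quick back-of-the-envelope check of my own (not Dr. Denning’s actual calculation) of how a sensitivity expressed per unit of forcing converts into the conventional “per doubling of CO2” figure. It uses the standard simplified CO2 forcing formula, 5.35 × ln(C/C0) W m-2, which gives about 3.7 W m-2 for a doubling:

```python
import math

# Illustrative only: converting a climate sensitivity expressed per unit
# of forcing into the conventional "per doubling of CO2" figure.
sensitivity_per_forcing = 0.8              # deg C per W/m^2 (Dr. Denning's estimate)
forcing_per_doubling = 5.35 * math.log(2)  # ~3.7 W/m^2, standard simplified CO2 forcing formula

warming_per_doubling = sensitivity_per_forcing * forcing_per_doubling
print(f"~{warming_per_doubling:.1f} deg C per doubling of CO2")
```

The product comes out to about 3 °C per doubling, matching the figure Dr. Denning reports.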
The latest IPCC report concluded that the climate sensitivity range was between 1.5 and 4.5 °C warming per doubling of CO2. This is not as helpful as one would hope. I mean, 1.5 °C doesn’t seem to be so bad, but we’re utterly screwing future generations at 4.5 °C and above. Why such a wide range? According to Dr. Nielsen-Gammon, one reason is that in addition to emitting a lot of CO2, we humans have also been emitting a lot of particulate matter (mostly from coal power plants but from other anthropogenic sources as well), and these particles have a partially compensating cooling effect, though they have a much shorter life in the atmosphere than CO2. And climate models cannot really help us out on the complex interactions between the particulate matter and clouds on the microscopic level, notes Dr. Nielsen-Gammon. (See video in Climate Change Discussion in Houston by Dr. Barry Lefer, starting at 1:00:50.) I would presume this is one reason why the precise sensitivity is still such an open question in science.
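To see how aerosol uncertainty alone stretches the range, here is a crude energy-balance sketch of my own (the numbers are round illustrative values in the ballpark of recent IPCC estimates, and it ignores ocean heat uptake, which would push the inferred values higher). If sensitivity is inferred as observed warming divided by net forcing, then an uncertain aerosol cooling term widens the answer considerably:

```python
# Crude energy-balance sketch (my own illustration, not from the Forum):
# inferred sensitivity S = dT / (F_ghg + F_aerosol) * F_2x.
# The aerosol forcing is poorly known, which by itself stretches the range.
dT = 0.85      # observed warming since preindustrial, deg C (illustrative)
F_ghg = 2.8    # greenhouse-gas forcing, W/m^2 (illustrative)
F_2x = 3.7     # forcing from a doubling of CO2, W/m^2

for F_aerosol in (-0.1, -1.0, -1.9):  # a plausible spread of aerosol cooling
    S = dT / (F_ghg + F_aerosol) * F_2x
    print(f"aerosol forcing {F_aerosol:+.1f} W/m^2 -> sensitivity ~{S:.1f} deg C per doubling")
```

Sliding the aerosol term across its uncertainty range roughly triples the inferred sensitivity, from about 1.2 to about 3.5 °C per doubling, which gives a feel for why pinning down the particulate effect matters so much.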
I should also note that none of the participating scientists have openly challenged the IPCC’s range, other than noting that there still exists a “tail risk” of an even larger warming response (above 4.5 °C per doubling of CO2). This is the very unlikely but still plausible “worst-case scenario” that should obviously be taken into account in any risk assessment. In the above graph, the “tail risk” can be seen in the last 5% portion, which starts at 4.6 °C and goes on to about 6.5 °C at the extreme end.
The existence of this unlikely but still possible “tail risk” has not been a point of contention in the Forum, but the confidence with which scientists can quantify this “tail risk” in a probability distribution graph (or “pdf,” short for probability density function) has been debated. In the Forum’s second-to-last post, titled Worst-Case Scenario vs. Fat Tail, Dr. Judith Curry criticized the method, pointing out that unlike the front-end range of the graph used by Dr. Emanuel (see above graph), quantifying the “tail risk” at the tail end comes with a lot more uncertainty. She observed the “fat [very wide] tail” at the far right (starting around 6 °C in the graph) and quipped that it “extends out to infinity of a mythical probability distribution.” Given such a level of uncertainty on the tail end, but seemingly recognizing the existence of a tail risk, Dr. Curry argues that scientists are on more solid ground if they simply “identif[y] the possible/plausible worst case scenarios” rather than taking the “fat tail approach.”
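To get an intuition for why the tail is so contested, here is a rough numerical sketch of my own (not from any Forum post). Suppose we fit two different distribution shapes so that each puts the same ~66% probability on the IPCC’s 1.5–4.5 °C “likely” range, then ask each one how much probability lies out past 6 °C:

```python
import math

def norm_cdf(z):
    """Standard normal CDF computed via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

# z-score bracketing the central 66% of a standard normal distribution.
z66 = 0.954

# Normal fit: symmetric about 3 deg C, with 1.5-4.5 as the central 66%.
mu_n, sd_n = 3.0, 1.5 / z66
# Lognormal fit: symmetric in log space between ln(1.5) and ln(4.5).
mu_l = 0.5 * (math.log(1.5) + math.log(4.5))
sd_l = (math.log(4.5) - mu_l) / z66

threshold = 6.0  # deg C, well into the "tail risk" region of the graph
p_normal = 1 - norm_cdf((threshold - mu_n) / sd_n)
p_lognormal = 1 - norm_cdf((math.log(threshold) - mu_l) / sd_l)
print(f"P(S > {threshold}): normal ~{p_normal:.3f}, lognormal ~{p_lognormal:.3f}")
```

The two fits agree closely over the “likely” range yet disagree by roughly a factor of 2.5 on the probability beyond 6 °C (about 0.03 versus 0.07). The data constrain the middle of the distribution far better than its shape way out in the tail, which is the nub of Dr. Curry’s complaint.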
This is about where the discussion has stopped in the Forum. I’ll be digging into this further in the future, but I wanted to give readers a taste of the latest (at least those readers who have gotten this far!).]
Dr. Verheggen ends with this note on scientific consensus and the “social inertia” of the public’s acceptance of the science, which, compounded with the “climate inertia” explained above (see supertanker), will determine “how fast we can bend the global warming trend around.” He concludes:
If over time the evidence for a particular hypothesis accumulates and counterevidence is found wanting, a scientific consensus will naturally emerge. This is the normal course of scientific progress. Hypotheses that are not well supported by the body of evidence slowly fade into oblivion, while the strongly supported theory slowly becomes common (scientific) knowledge. There are several examples in the history of science where public acceptance of science lags behind the scientific acceptance of said theory, so this is not unique to climate science.