Science Magazine: “Outsider Finds Missing Global Warming” – HadCRUT and NASA data sets “have underestimated the pace of warming between 1997 and 2012 by an estimated 158% and 49% respectively.” [Fact Checking]

May 5, 2014, 5:25 am | 3 comments

Summary of initial comments (paraphrases and quotes):

Dr. Andrew Dessler: “I really liked this article and I learned a few things — such as the fact that Cowtan and Way are not climate scientists, but are contributors to Skeptical Science.” Will add one point: “While I accept their adjustment, this does not really explain the “hiatus”. I think it is quite clear that the hiatus represents a deficiency of our knowledge of internal variability.  That said, I doubt the hiatus poses an existential threat to the mainstream theory of climate and it will not have much of an impact on long-term (e.g., centennial) warming.  Nor do I think it indicates that climate sensitivity is substantially lower than present estimates.”

Dr. Scott Denning: “[T]he most important point in responding to this article is that the very small differences among the various global temperature data sets are “in the noise” with respect to understanding climate change. The fundamental reason for confident predictions of substantial warming in the 20th Century is the First Law of Thermodynamics. The underlying physical mechanisms for greenhouse warming are extremely well understood […] If you really want to learn about decadal changes (a.k.a. the “hiatus”), a better place to start is with England et al […] The key point in Cowtan and Way’s analysis is that some of the areas in which there are no weather stations (the Arctic) are places where warming trends are systematically stronger than average. This is hardly surprising and reflects well-understood phenomena of Arctic amplification. I don’t think we should consider the other data sets to be “wrong,” rather they are just less representative of truly global trends.” 

Dr. Jim Bouldin: “The key quote is Stefan Rahmstorf’s, who I know has a solid statistical understanding […] The fact that those who work most directly with [these] data sets did not develop a correction method doesn’t inspire confidence. […] Knowing that Cowtan and Way are Skeptical Science contributors, a site whose objectivity w.r.t. the results I doubt, a good hard look at their methods is required.” See my comment on their methods in the Scientists’ Comment Thread below.

 – – – – – – – – – – – –

Science Magazine: “Climate Outsider Finds Missing Global Warming”

[Image: Cowtan and Way screenshot]

In last month’s issue of Science magazine, reporter Eli Kintisch tells the story of how two outsiders to climate science, Kevin Cowtan and Robert Way, reanalyzed two prominent climate data sets and found that the global warming trend over the past 16 years has been significantly underestimated. In other words, Cowtan and Way reportedly found that the heavily publicized “global warming hiatus” is not really a hiatus (which would imply a flatlining or cooling of global surface temperature), but rather a recent slowdown in the warming.

[Dr. Dessler: While I accept their adjustment, this does not really explain the “hiatus”. I think it is quite clear that the hiatus represents a deficiency of our knowledge of internal variability.  That said, I doubt the hiatus poses an existential threat to the mainstream theory of climate and it will not have much of an impact on long-term (e.g., centennial) warming.  Nor do I think it indicates that climate sensitivity is substantially lower than present estimates.]

[Dr. Scott Denning: Probably the most important point in responding to this article is that the very small differences among the various global temperature data sets are “in the noise” with respect to understanding climate change. The fundamental reason for confident predictions of substantial warming in the 20th Century is the First Law of Thermodynamics. The underlying physical mechanisms for greenhouse warming are extremely well understood and firmly grounded in both laboratory spectroscopy and atmospheric observations. The physics is fully consistent with a huge range of radiation measurements both from the surface and from orbit, and also with paleoclimate data on a variety of time scales.

Decadal changes in climate are fascinating phenomena well worthy of study, but they are NOT an important way to estimate climate sensitivity over a century or more in response to forcing of 3 to 8 Watts per square meter.

Between 1997 and 2012, CO2 increased from 364 ppm to 394 ppm, providing log(394/364)/log(2) * 3.7 = 0.42 Watts/m^2 of radiative forcing. 15 years is too short a time for the climate to adjust to the small increase in the forcing, but even at equilibrium if we allow for 3 degrees of warming per doubling of CO2, we would only expect 0.3 Celsius of warming response to 0.4 Watts/m2 of forcing.  Whether the amount of warming from 1997 to 2012 is 0.1 or 0.2 degrees Celsius has almost no bearing whatsoever on whether we expect 2 or 3 or 5 degrees of warming for a forcing 10 to 20 times as large. 

If you really want to learn about decadal changes (a.k.a. the “hiatus”), a better place to start is with England et al (2014, Nature, doi:10.1038/nclimate2106) who show that observed changes in winds over the tropical Pacific and the resulting change in ocean warming can explain the observed changes in decadal temperature trend.]
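
As an editorial aside, the arithmetic in Dr. Denning’s comment above can be checked with a few lines of code. This is only a sketch of the back-of-the-envelope calculation he describes; the 3.7 W/m^2 of forcing per doubling of CO2 and the 3 °C equilibrium sensitivity are the values he assumes, not results from the Cowtan and Way paper.

```python
import math

# Values quoted in Dr. Denning's comment (assumptions, not new data)
co2_1997 = 364.0             # ppm
co2_2012 = 394.0             # ppm
forcing_per_doubling = 3.7   # W/m^2 of radiative forcing per doubling of CO2
warming_per_doubling = 3.0   # assumed equilibrium warming (deg C) per doubling

# Logarithmic forcing: fraction of a doubling times forcing per doubling
doublings = math.log(co2_2012 / co2_1997, 2)            # ~0.11 doublings
forcing = doublings * forcing_per_doubling               # ~0.42 W/m^2
equilibrium_warming = doublings * warming_per_doubling   # ~0.34 deg C (the "0.3" quoted above)

print(f"Forcing 1997-2012: {forcing:.2f} W/m^2")
print(f"Equilibrium warming for that forcing: {equilibrium_warming:.2f} deg C")
```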

Titled ‘Climate Outsider Finds Missing Global Warming’, the news article states that Kevin Cowtan, a protein crystallographer at the University of York, and Robert Way, a graduate student at the University of Ottawa, have found the HadCRUT and NASA data sets — heavily relied on by climate scientists around the world — have “underestimated the pace of warming between 1997 and 2012 by an estimated 158% and 49% respectively.” [The fact that those who work most directly with those data sets did not develop a correction method for this issue does not inspire confidence. — Dr. Bouldin]

Kintisch puts the difference in context and reports on the reactions of other climate scientists with the following excerpt:

That’s not a big difference in absolute temperature rise, says PIK climatologist Stefan Rahmstorf, given the relatively short period covered (see table). But it shows that the slowdown, or “hiatus,” has been less pronounced than previously described, he says. Correcting the data sets “should have been done ages ago,” Rahmstorf adds. In an e-mailed statement to Science, climatologists at the National Oceanic and Atmospheric Administration (NOAA) call the recent work “important contributions” to the climate field. 

[To me the key quote here is Stefan Rahmstorf’s, who I know has a solid statistical understanding. —Dr. Bouldin]

[Table: Temperature trends, Cowtan and Way]

Cowtan and Way discovered the underestimation (also described as a “cooling bias” by scientists) is due to the dearth of weather stations in the Arctic, which “satellites, weather models, and shrinking sea ice point to as the fastest warming [region] on the planet”, Kintisch writes. [The key point in the Cowtan and Way analysis is that some of the areas in which there are no weather stations (the Arctic) are places where warming trends are systematically stronger than average. This is hardly surprising and reflects well-understood phenomena of Arctic amplification. I don’t think we should consider the other data sets to be “wrong,” rather they are just less representative of truly global trends. – Dr. Denning] Kintisch adds that the two researchers, previously outsiders to the field of climate science, worked nights and weekends on their analysis, and raised funds for the work through the climate science advocacy website SkepticalScience.com, which they contribute to. [Knowing that Cowtan and Way are Skeptical Science contributors, a site whose objectivity w.r.t. the results I doubt, a good hard look at their methods is required. See my comment in the Scientists’ Comment Thread below. – Dr. Bouldin]

Kintisch goes on to report that British and NASA scientists maintaining the data sets had “previously warned [that] the sparse coverage may lead to underestimates of global warming.” The NASA data set partially accounts for this data gap, he notes, by extrapolating temperatures for the missing areas using data from their edges, but the HadCRUT data set omits the missing regions entirely. Though it was not stated explicitly in the article, this presumably accounts for the larger discrepancy between Cowtan and Way’s findings and the HadCRUT data set maintained by British scientists.
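
Though the article does not spell it out, omitting unsampled regions from a global average amounts to assuming those regions warm at the same rate as the sampled ones, which biases the global trend low when the missing regions (here, the Arctic) warm fastest. A toy calculation with made-up numbers, purely to illustrate the bias mechanism and not the actual data products, might look like this:

```python
import numpy as np

# Three equal-area zones; the unsampled "arctic" zone warms fastest (made-up numbers).
trends = {"tropics": 0.08, "mid_latitudes": 0.15, "arctic": 0.35}  # deg C per decade
sampled = {"tropics", "mid_latitudes"}  # pretend there are no Arctic stations

true_global = np.mean(list(trends.values()))  # ~0.19 deg C per decade

# Omission (HadCRUT-style): average only the sampled zones, which implicitly
# assumes the Arctic warms at the sampled-zone average.
omit_estimate = np.mean([trends[z] for z in sampled])  # ~0.12

# Crude infilling (NASA-style): copy a neighbouring sampled zone's trend into
# the gap before averaging; still low, but less so than outright omission
# whenever the missing zone outpaces its neighbours.
fill_estimate = np.mean([trends[z] if z in sampled else trends["mid_latitudes"]
                         for z in trends])  # ~0.13

print(true_global, omit_estimate, fill_estimate)
```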

In this short video published by the University of York, Cowtan explains his findings:

Kintisch describes the method that Cowtan and Way used to quantify the underestimates with the following:

They applied a new method that fills in missing temperatures over sea ice by combining satellite data for missing areas with a method known as kriging, which calculates missing data by checking nearby temperature station readings. Then they compared their method’s skill in interpolation with that of the NASA and Hadley approaches in the Arctic by systematically removing known data and using each of the three techniques to reconstruct the data set. Cowtan says his and Way’s approach proved the most reliable, and data from Arctic buoys and three weather models support it.

The takeaway? A slowdown in warming that’s half as big as previously thought.

[Dr. Denning: The fundamental innovation of the Cowtan and Way analysis is the application of a well-established method for spatial interpolation of noisy data called “kriging” to estimate temperatures where there are no measurements. Kriging was originally developed in the mining industry in the 1950s for locating ore deposits using sparse samples, and has been very widely used in many fields where point data are used to estimate spatial patterns.

Reconstructing temperature trends from direct station measurements has always been difficult because thermometer stations are sparse and their distribution is not optimal. Especially in the early part of the record the measurements were strongly concentrated near centers of population and wealth, which are hardly representative of the planet as a whole. More recent measurements are better distributed, but truly global satellite data provide such a short record that long-term trends will be unavailable for many decades.]
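
Cowtan and Way’s actual method is more involved than plain kriging (it blends satellite lower-troposphere data into the interpolation), but the general shape of kriging-style infilling and the remove-and-reconstruct test Kintisch describes can be sketched in a few lines. The sketch below is illustrative only: it uses synthetic station data and scikit-learn’s Gaussian-process regressor, which is mathematically the same machinery as kriging, rather than the authors’ own code.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic "stations": random locations and a temperature-anomaly field that
# warms faster toward the poles (illustrative only, not real data).
lats = rng.uniform(-90, 90, 200)
lons = rng.uniform(-180, 180, 200)
anomaly = 0.2 + 0.004 * np.abs(lats) + rng.normal(0, 0.05, lats.size)
X = np.column_stack([lats, lons])  # treating lat/lon as flat coordinates for simplicity

# Hold-out test as described above: remove known stations, reconstruct them
# from the rest, and score the reconstruction.
test = rng.choice(X.shape[0], size=40, replace=False)
train = np.setdiff1d(np.arange(X.shape[0]), test)

# A Gaussian process with an RBF covariance behaves like kriging with a
# Gaussian covariance model; the WhiteKernel term absorbs station-level noise.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=30.0) + WhiteKernel(0.01),
                              normalize_y=True)
gp.fit(X[train], anomaly[train])

pred = gp.predict(X[test])
rmse = np.sqrt(np.mean((pred - anomaly[test]) ** 2))
print(f"Hold-out RMSE: {rmse:.3f} deg C")
```

A real analysis would use distances on the sphere and a covariance model fitted to the data; the point of the sketch is only the interpolate-then-validate loop, not the specific numbers.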

Kintisch reports that Cowtan and Way’s paper focusing on HadCRUT was first published in the Quarterly Journal of the Royal Meteorological Society (see link for the abstract) in November 2013, and that in an April presentation to a group of scientists and students at the Potsdam Institute for Climate Impact Research in Germany, Cowtan shared an unpublished result that “suggests a data-smoothing algorithm may be tuning down temperature measurements that NOAA’s Arctic climate stations supply to NASA”. Kintisch also mentions that NOAA and scientists at the University of East Anglia, which runs HadCRUT jointly with the U.K. Met Office, are planning to add new stations in the Arctic in the near future to address the data gap.

Science magazine is published by the American Association for the Advancement of Science (AAAS). The article appears in the ‘News & Analysis’ section of the magazine’s April 2014 issue and is available here for subscribers.


PUBLIC COMMENT THREAD

  • http://biocycle.atmos.colostate.edu/ Scott Denning

    Let’s keep this in perspective. The article we’re commenting on is only a “news” article, but the actual study to which the news article refers has already been the subject of a “good hard look” through the peer-review process at the Quarterly Journal of the Royal Meteorological Society (full PDF available at DOI:10.1002/qj.2297). Their review is likely to be considerably more rigorous than what we can accomplish here.

    As I said in my comment I think there’s no need to sensationalize this paper. The results are not particularly groundbreaking: they use well-established statistical methods to reconstruct temperature trends in unsampled regions where warming is faster than average, and lo-and-behold! the global mean trend increases.

    The so-called “hiatus” has been wildly oversold by the usual suspects, and this paper won’t change that, nor does it explain any physical mechanism (to better understand the hiatus you need to learn about the tropical Pacific: see England et al 2014 DOI: 10.1038/NCLIMATE2106).

    Nailing down the global temperature trend to a tenth of a degree per decade has virtually no bearing on projections of future climate change. The Cowtan and Way paper doesn’t overturn anything and shouldn’t be controversial.

  • http://ecologicallyoriented.wordpress.com/ Jim Bouldin

    To me, their primary innovation was the incorporation of satellite data into the estimation process. However, they use microwave soundings (MSU/AMSU) of the atmosphere (lower troposphere) for this purpose. They mention why they used the UAH sounding data over the RSS data, but I would like to have seen a broader discussion of the reasons for the choice of MSU soundings in the first place, over surface (“skin”) temperature estimates, as derived from say AVHRR, ASTER or MODIS, given that they are trying to adjust a surface temperature dataset, which is typically measured by screened instruments just a few feet above the ground. They mention AMSU “surface contamination” more than once and I’m not sure why they would use that term or what they mean by it exactly. Wouldn’t surface T estimates likely be better representations of the air T at typical thermometer height, than is the lower troposphere (and also make land vs sea comparisons more direct)? Are there major problems in estimating surface emissivity, making atmospheric corrections, or something else?

    For that matter, I’d like to see a discussion of this general topic in the IPCC AR5, but I don’t see one there.

  • http://ecologicallyoriented.wordpress.com/ Jim Bouldin

    For some reason my full comment got truncated. The truncated part read:

    “I’ve been more or less forced into commenting here because an original comment of mine was not included above with the others, for whatever reason. That comment read:…” (the above).

  • Pingback: What Climate Scientists Are Saying About the Global Warming Hiatus

  • Pingback: Temperature adjustments: Dodds, Mosher and Venema cannot be happy | Shub Niggurath Climate