Tag Archives: Climate Sensitivity

New Paleoclimate Paper: Longest detailed reconstruction plus possible bad news

You’ve probably already heard about this paper because everyone is all in a tizzy about it. There is a fundamental complaint being made about one of the paper’s conclusions. I’ve been paying careful attention to what my colleagues are saying about that one aspect of the paper, and I get their point, but I’m not sure they are right. I’ll explain that later.

What everyone so far has almost entirely missed, though, is the actual point of the paper, which is important. While I’m sure it could be improved with further work, this is good stuff.

One of the major contributions that people like Michael Mann and Malcolm Hughes made to climate science some years ago was the extension of the paleoclimate record back far enough to truly contextualize current global temperatures, and the melding of that record with more recent instrumental records. There have been other attempts to put together paleo records as well. But this latest effort is the first time anyone has reconstructed the global average surface temperature (GAST) for about two million years.

This is roughly the same time period as what is known as “The Pleistocene” or, if we want to put this in caveman journalistic terms, “The Ice Age” or, slightly more correctly, “The Ice Ages.” The latter terms are misleading and muddy, so please let’s not use them. (By those terms we are in the “ice ages” right now, but not in “an ice age.” See how annoying that can get, and for no good reason?)

The Cenozoic, roughly 65 million years long, is a time of general cooling across the Earth, and the Pleistocene, the most recent sizable period of the Cenozoic, is when extra cooling seems to have occurred. During the Pleistocene there were a couple dozen or so swings between relatively warm (like we have now) and relatively cool, with some of those cool periods, the most recent half dozen or more, being really, really cool, with big giant glaciers covering Canada, etc. You knew this already.

For this period, we have a pretty good climate reconstruction, or at least one that has been slowly building, that uses two sources of data from sea cores. One is the so-called “delta-18” curve, and the other is the foram reconstruction.

The delta-18 curve measures oxygen isotopes in sea water, indirectly, and uses this to estimate how much of the planet’s water is tied up in ice, mainly in glaciers. (Some help on that here.) Examination of this curve starting in the late ’60s, but mainly in the ’80s, confirmed an old and zany theory that the Earth’s climate is controlled by our planet’s orbital geometry as it goes around the sun. Eventually it came to be understood that cycles in exactly how the sun hits the Earth, called Milankovitch cycles, are a factor (but not the only factor) in global climate, and that this effect was very weak in the distant past, got stronger about 2 million years ago, and then perhaps got stronger again more recently.

The foram reconstruction is a bit more difficult. Here, fossilized remains of communities of a significant part of what we call “plankton,” which had settled to the bottom of the sea, are interrogated to estimate the sea surface temperature at that location. Certain forams prefer certain temperatures.

The new research, by Carolyn Snyder, published in Nature as “Evolution of global temperature over the past two million years,” uses mostly this foram data to estimate sea surface temperature, and then uses that to extrapolate GAST for the last two million years.

The result looks like the graphic above.

That, right there, that graph, is cool. It puts the modern world in context, and provides a good look at the Pleistocene.

The graph is not perfectly accurate and it is hard to say how inaccurate it is, given the paucity of data.

I asked the author about the possible limitations of this reconstruction. She told me, “this reconstruction is only as good as the proxy reconstructions and other assumptions it is based on (it is useful to note that three different SST proxy methods were included in the dataset). The more reconstructions available, the better the GAST reconstruction can be. That is why a significant amount of this research was focused on quantifying how large that uncertainty may be so that it was reflected in the final GAST reconstruction.”

Only sea surface temperatures are used, and there are strong seasonal effects in how foram communities form and are deposited. But, any ancient reconstruction is going to have problems, and this appears to be an excellent result. Dr. Snyder told me, “one of the major challenges of creating a global average surface temperature is that the primary reconstructions available over the past 2 million years are of sea-surface temperature. Available terrestrial temperature reconstructions are too infrequent and limited in spatial distribution to be used for a global reconstruction at this time. Therefore, I scale the average sea-surface temperature for 60ºN-60ºS to global average surface temperature using results from climate model experiments from 9 climate models. The scaling factor is necessary to address the fact that land surfaces and the poles tend to have larger temperature changes than the oceans. That assumption also drives a large fraction of the overall estimated uncertainty in the final GAST reconstruction.”

One of the most useful applications of this kind of information is placing modern global average surface temperatures in context. You can do that by looking at the graph, but I went ahead and asked the author how she would characterize modern temperatures vis-a-vis this reconstruction. She told me, “this is what we can say from this reconstruction: The only time periods in the temperature reconstruction when the estimated most likely global temperature change from the past 5,000 years was greater than 1 degree Celsius was during the last interglacial period (around 120,000 years ago) and then not previously until 1.77 million years ago. However, that is a summary of the “most likely” GAST estimate, and the uncertainty ranges are large, especially farther back in time. Also, the GAST reconstruction is relative to average temperature over the past 5,000 years, not directly to preindustrial temperature (due to the resolution of the reconstructions used), and so that is why I am not able to word this as a comparison relative to present temperatures.”

Now, on to what turns out to be a highly controversial result.

Snyder claims, using part of the abstract of the paper to represent her finding: “A comparison of the new temperature reconstruction with radiative forcing from greenhouse gases estimates an Earth system sensitivity of 9 degrees Celsius (range 7 to 13 degrees Celsius, 95 per cent credible interval) change in global average surface temperature per doubling of atmospheric carbon dioxide over millennium timescales. This result suggests that stabilization at today’s greenhouse gas levels may already commit Earth to an eventual total warming of 5 degrees Celsius (range 3 to 7 degrees Celsius, 95 per cent credible interval) over the next few millennia as ice sheets, vegetation and atmospheric dust continue to respond to global warming.”

If you have been following the climate science literature, you may see that range as incredibly high. It isn’t actually as high as it looks, because most discussions of “climate sensitivity” refer to the metric “Equilibrium Climate Sensitivity” (ECS), which is both shorter term and lower (and different in other ways) compared to Earth System Sensitivity (ESS). The difference may be as much as 100%. The ECS estimates are all over the map, but nobody believes the number can be below 3 (except a few oddballs), most think 3.5 is a good estimate, but most say it could be as high as 6. So, doubling the ECS to get the ESS (which is probably not appropriate, but hell, this blog post is not being submitted for peer review, so complain in the comments if you like) we get 7 to 12. In other words, saying that ECS is about 3-6, likely 3.5, and that ESS is 7-13, most likely 9, are not quite as dramatically different as they seem.
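To see how the paper’s two headline numbers hang together, here is the back-of-the-envelope arithmetic, sketched in Python under the standard assumption that warming scales with the logarithm of CO2 concentration. The function name and the round ppm values are mine, for illustration:

```python
import math

def committed_warming(sensitivity_per_doubling, co2_now_ppm, co2_ref_ppm):
    """Warming implied by a sensitivity (degrees C per CO2 doubling) and a
    CO2 change, using the standard logarithmic forcing relation."""
    doublings = math.log2(co2_now_ppm / co2_ref_ppm)
    return sensitivity_per_doubling * doublings

# Snyder's central ESS estimate of 9 degrees C per doubling, applied to the
# rise from a ~280 ppm preindustrial level to ~400 ppm today:
print(round(committed_warming(9, 400, 280), 1))  # → 4.6
```

The 4.6 is the central estimate, in line with the paper’s “eventual total warming of 5 degrees Celsius”; the stated 3-7 degree range comes from propagating the 7-13 degree uncertainty on the ESS itself through the same arithmetic.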

But still, the implication is that sensitivity is higher than people have been thinking. And this is where lots of people don’t like the research. Not because of the finding itself, but because of the way the number was calculated. NASA scientist Gavin Schmidt has been circumspectly tweeting about this for several days, and just wrote a blog post at RealClimate on it. He notes,

Nature published a great new reconstruction of global temperatures over the past 2 million years today. Snyder (2016) uses 61 temperature reconstructions from 59 globally diverse sediment cores and a correlation structure from model simulations of the last glacial maximum to estimate (with uncertainties) the history of global temperature back through the last few dozen ice ages cycles. There are multiple real things to discuss about this – the methodology, the relatively small number of cores being used (compared to what could have been analyzed), the age modeling etc. – and many interesting applications – constraints on polar amplification, the mid-Pleistocene transition, the duration and nature of previous interglacials – but unfortunately, the bulk of the attention will be paid to a specific (erroneous) claim about Earth System Sensitivity (ESS) that made it into the abstract and was the lead conclusion in the press release.

The paper claims that ESS is ~9ºC and that this implies that the long term committed warming from today’s CO2 levels is a further 3-7ºC. This is simply wrong.

That post is here, go read it.

Meanwhile, I’ll be happy to have a go at explaining this complaint as clearly as I can without using math or physics. I’ll use dogs.

When I leave my house, my dog is always looking out the living room window. When I come home, the dog is always by the front door. The distance between the living room window and the front door is ten meters. Therefore, I can estimate that the final net change in dog location, as a result of whatever movements my dog is making all day, is 10 meters.

Now, I’m going to try to model your dog and see how that goes. That’s where I run into problems. This forces me to generalize the problem by getting much more specific about what the actual dog is doing at any given point in time. I don’t know anything about you, your dog, or your house, so I need a model that takes all the different factors into account, and then I predict what your dog will do. Should be easy, right?

When dogs are left alone, they do several things. They sleep in a few locations, they look out various windows, they visit their food bowl to sniff at it a few times. But what exactly they do differs depending on whether one leaves the home in the evening or the morning, because the light outside changes the nature of the window visits. Was the toilet lid left up or not? Is there a treat hiding under the couch? The complexity is enormous, and you really can’t say much about the movements of the dog all day. Then, when you get home, will your dog, or anyone’s dog, automatically go to the door to wait for you? What if your dog hates you? What if your dog has lousy hearing and you walk home quietly, as opposed to a dog with great hearing whose owner drives a motorcycle?

While the correlation in my dog’s position over time works well at my house, and allows me to accurately characterize dog position over time as recorded in the past at my location, it does not take into account the fact that some variables may act differently than the correlation implies.

That’s the argument that Snyder can’t say what she says. GAST responds to dust and albedo, which relate to ice distribution and amount, which are in turn determined by various climatic factors, etc. As long as everything is more or less the same from glacial cycle to glacial cycle, we can use a set of glacial cycles to empirically estimate what happens in any other glacial cycle. But if they aren’t, then forget about it.

The dog actually did move a net 10 meters every day, no matter what happened in between. But the correlation is either spurious or, at least, underdetermined. The ESS estimate is about how things settle down in the end, not about what happens during a unique period of dramatic climate change.

And this is why the criticisms are both correct and incorrect. I think the criticisms by Gavin Schmidt and others are not especially relevant or interesting when asking this question: “What is the ESS value controlling Pleistocene climate change (with changing CO2 and GAST) over the last 2 million years?” The answer to that question is 9 (but see Schmidt’s commentary; he has other issues). If you don’t like 9, do your own study, change the way you handle the data, get more data, get an arguably better GAST curve, get better CO2 estimates, and recompute.

However, the answer to the question, “What is the ESS or ECS value governing anthropogenic climate change at present, and over the next few decades, based on Snyder’s temperature curve?”

Answer: The dog died. Or it ran out the door in the middle of the day. Or some other doggy metaphor indicating a dramatic gap between expectation and reality, because of humans.

Human impacts on the climate over the last two or more centuries, and especially over recent and upcoming decades, via land use changes and greenhouse gas release, are not the same as what happened during previous interglacials. So, really, while this new work places the entire question in historical context, perhaps it doesn’t actually answer the question, because the house we left the dog in is totally different than it ever has been before.

And the author seems to be saying something roughly along these lines. Snyder told me that she followed prior work that had already “… defined the correlation relationship between global temperature and greenhouse gas radiative forcing changes as ESS as a way to summarize patterns in the Earth’s past climate. This is a useful metric that summarizes a combination of interactive feedbacks in the climate system. This does include changes in longer timescale feedbacks, such as ice sheets, vegetation, and dust, within the ESS metric. It is not ECS, as that is defined as explicitly not including those changes as internal feedbacks, but rather as external forcing that need to be explicitly accounted for separately.” She went on to tell me that the ESS is “a useful reference as a way to summarize past relationships from the paleoclimate record. But again, it is a correlation observed in the past, not a test of causation.”

So, in essence, Snyder is only talking about the dog’s past repeated behavior. ESS, she says, “… is likely state dependent, and thus I focused on comparing my new estimate to estimates from previous research on the late Quaternary. I also was able to investigate the state dependence of the metric within the last 800,000 years and found that it was lower in deep glacial states. This is an interesting finding, as some people have assumed that the ESS metric would be higher at the glacial maxima (e.g., the LGM) than during interglacials. That is not what the data shows.”

Which brings me to a couple of other observations, which may be a bit esoteric but, if you think about it, are really quite interesting. Remember above when I said that Milankovitch cycles have varied quite a bit in the past as to whether or not they had a big influence on climate? Snyder confirms, or proposes, that there was an increase in influence about halfway through this time period. She also shows that the Pleistocene cooling continued only up to a certain point, then stopped. And this research appears to elucidate the relationship between the Arctic and Antarctic and the rest of the world, a phenomenon known as polar amplification, which is the increase in temperature change at the poles compared to the rest of the planet. Cribbed and reworded a bit from the abstract:

- Global temperature gradually cooled until roughly 1.2 million years ago, and cooling then stalled until the present.

- The cooling occurred, then stalled, before the increase in the maximum size of ice sheets around 0.9 million years ago, so global cooling is shown to be a precondition for the shift in Milankovitch effects to the more recent pattern of ~100,000-year cycles.

- Over the past 800,000 years, polar amplification has been stable over time.

- Global temperature and atmospheric greenhouse gas concentrations have been closely coupled across glacial cycles.

This may be an evolving situation. I asked Carolyn Snyder a few questions about her research after reading Schmidt’s first post, but before his second was posted, and I do not know if she had read either at that time. So don’t take her quotes from above as addressing his questions. They are merely clarifying remarks. Also, frankly, I’ve not heard the opinions of any of my colleagues who are on the higher end of sensitivity estimates. They may show up tomorrow and get in Snyder’s trench.

Yes, it is true that climate change is controversial. But not over whether climate change is human caused, real, or important. It is all those things. But within the field itself, the scientists are busy fighting it out, as well they should. I’ll keep you posted.

Evidence of high climate sensitivity

I’m not going to say anything about this research because I’ve not read the paper, but it looks important. If someone out there writes something up I’ll put a link here.

Here’s the deal. Climate sensitivity is, very oversimplified, how much the surface of the planet heats up as we add CO2 and other greenhouse gasses to the atmosphere. More specifically, equilibrium climate sensitivity is the number of degrees C the atmosphere at face height and the sea surface heat up with a doubling of CO2 from pre-industrial levels.

If our atmosphere had just nitrogen and CO2 and that’s it, the number would be fairly low, about 1.2 degrees C. But life would not exist here because there would be no water, so we would not be having this conversation. The fact that we are having this conversation suggests the existence of water vapor, which cranks up sensitivity quite a bit, because more CO2 means more heat, which means more water vapor. That is just one of a number of “positive” (read: not good) feedbacks on climate sensitivity.
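The way a feedback like water vapor “cranks up” sensitivity can be sketched with the standard feedback-amplification relation. The 0.6 feedback fraction below is purely illustrative, chosen only because it lands on the canonical ~3 degrees; it is not a measured value:

```python
def sensitivity_with_feedback(no_feedback_dT, feedback_fraction):
    """Standard feedback amplification: each degree of direct warming
    induces a further fraction f of a degree (e.g. via extra water vapor),
    which induces f of that, and so on, summing to 1/(1 - f)."""
    return no_feedback_dT / (1.0 - feedback_fraction)

# ~1.2 degrees C per doubling for CO2 alone; an illustrative net feedback
# fraction of 0.6 amplifies that to roughly 3 degrees:
print(round(sensitivity_with_feedback(1.2, 0.6), 1))  # → 3.0
```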

I’ve noted before that if you ask a group of informed climate scientists to guess a single number for climate sensitivity, using the Free Beer method, the answer is something like 3.0. Certainly not less than 2.0. But it could just possibly be much higher, like 6. The chances of climate sensitivity being 6 are small, and if it turned out to be, then we are truly Doomed. But here’s the thing. The upper range of possible values for this important number is what is sometimes called a “fat tail.” The chances are low, but not so low they can be ignored.

Here’s a picture of a fat tail.

[Figure: probability distribution of climate sensitivity estimates, showing the fat upper tail.]

Even a value of 4 or 5 would be bad, and the chances are not vanishingly small that this would be the value.

So, about the latest research.

Title: Long-term cloud change imprinted in seasonal cloud variation: More evidence of high climate sensitivity

Authors: Chengxing Zhai, Jonathan H. Jiang, Hui Su

Abstract: The large spread of model equilibrium climate sensitivity (ECS) is mainly caused by the differences in the simulated marine boundary layer cloud (MBLC) radiative feedback. We examine the variations of MBLC fraction in response to the changes of sea surface temperature (SST) at seasonal and centennial time scales for 27 climate models that participated in the Coupled Model Intercomparison Project phase 3 and phase 5. We find that the intermodel spread in the seasonal variation of MBLC fraction with SST is strongly correlated with the intermodel spread in the centennial MBLC fraction change per degree of SST warming and that both are well correlated with ECS. Seven models that are consistent with the observed seasonal variation of MBLC fraction with SST at a rate of −1.28 ± 0.56%/K all have ECS higher than the multimodel mean of 3.3 K, yielding an ensemble-mean ECS of 3.9 K and a standard deviation of 0.45 K.

Potential meaning: Ruh roh.

These results are not particularly unexpected. But one would hope that more research would show a lower number, because we really don’t want this to be a higher number.

See also: Future warming likely to be on high side of climate projections, analysis finds, which covers A Less Cloudy Future: The Role of Subtropical Subsidence in Climate Sensitivity, by John Fasullo and Kevin Trenberth. Science, 9 November 2012:

An observable constraint on climate sensitivity, based on variations in mid-tropospheric relative humidity (RH) and their impact on clouds, is proposed. We show that the tropics and subtropics are linked by teleconnections that induce seasonal RH variations that relate strongly to albedo (via clouds), and that this covariability is mimicked in a warming climate. A present-day analog for future trends is thus identified whereby the intensity of subtropical dry zones in models associated with the boreal monsoon is strongly linked to projected cloud trends, reflected solar radiation, and model sensitivity. Many models, particularly those with low climate sensitivity, fail to adequately resolve these teleconnections and hence are identifiably biased. Improving model fidelity in matching observed variations provides a viable path forward for better predicting future climate.

See also: A bit more sensitive, which discusses “Spread in model climate sensitivity traced to atmospheric convective mixing” by Steven Sherwood, Sandrine Bony, and Jean-Louis Dufresne, in Nature, January 2, 2014.

Equilibrium climate sensitivity refers to the ultimate change in global mean temperature in response to a change in external forcing. Despite decades of research attempting to narrow uncertainties, equilibrium climate sensitivity estimates from climate models still span roughly 1.5 to 5 degrees Celsius for a doubling of atmospheric carbon dioxide concentration, precluding accurate projections of future climate. The spread arises largely from differences in the feedback from low clouds, for reasons not yet understood. Here we show that differences in the simulated strength of convective mixing between the lower and middle tropical troposphere explain about half of the variance in climate sensitivity estimated by 43 climate models. The apparent mechanism is that such mixing dehydrates the low-cloud layer at a rate that increases as the climate warms, and this rate of increase depends on the initial mixing strength, linking the mixing to cloud feedback. The mixing inferred from observations appears to be sufficiently strong to imply a climate sensitivity of more than 3 degrees for a doubling of carbon dioxide. This is significantly higher than the currently accepted lower bound of 1.5 degrees, thereby constraining model projections towards relatively severe future warming.

See also: Overlooked evidence – global warming may proceed faster than expected

Global Warming: Getting worse

I recently noted that there are reasons to think that the effects of human caused climate change are coming on faster than previously expected. Since I wrote that (in late January) even more evidence has come along, so I thought it was time for an update.

First a bit of perspective. Scientists have known for a very long time that the proportion of greenhouse gasses in the Earth’s atmosphere controls (along with other factors) overall surface and upper ocean heat balance. In particular, it has been understood that the release of fossil Carbon (in coal and petroleum) as CO2 would likely warm the Earth and change climate. The basic physics to understand and predict this have been in place for much longer than the vast majority of global warming that has actually happened. Unfortunately, a number of factors have slowed down the policy response, and the acceptance of this basic science by non-scientists.

A very small factor, often cited by climate contrarians, is the consideration, mainly during the 1960s and 1970s, that the Earth goes through major climate swings, including the onset of ice ages, so we would have to worry about both cooling and warming. This possibility was obviated around the time it was being discussed, though people may not have fully realized it at the time: as atmospheric CO2 concentrations increased beyond about 300ppm, from the pre-industrial average of around 250–280ppm (it is now at 400ppm), the possibility of a new Ice Age diminished to about zero. Another factor militating against urgency is the fact that the Earth’s surface temperatures have undergone a handful of “pauses” as the surface temperature has marched generally upwards. I’m not talking about the “Faux Pause” said to have happened during the last two decades, but earlier pauses, including one around the 1940s that was probably just a natural downswing that happened when there was not enough warming to swamp it. A second, shorter pause happened after the eruption of Mount Pinatubo in 1991.

Prior to recent anthropogenic global warming, the Earth’s surface temperature squiggled up and down due to natural variability. Some of these squiggles were, at least regionally, large enough to get names, such as the “Medieval Warm Period” (properly called the “Medieval Climate Anomaly”) and the “Little Ice Age.” When the planet’s temperature started going distinctly up at the beginning of the 20th century, these natural ups and downs, some larger and some smaller, caused by a number of different factors, became imposed on a stronger upward signal. So, when we have a “downward” swing caused by natural variation, it is manifest not so much as a true downturn in surface temperatures, but rather as less of an upward swing. Since about a year and a half ago, we have seen very steady warming, suggesting that a recent attenuation in how much temperatures go up is reversing. Most informed climate scientists expect 2015 and even 2016 to be years with many very warm months globally. So, the second factor (the first being the concern over a possible ice age) is natural variation in the Earth’s surface temperature. To reiterate, early natural swings in the surface temperature may have legitimately caused some scientists to wonder about how much greenhouse gas pollution changes things, but later natural variations have not; scientists know that this natural variation is superimposed on an impressive long term upward increase in temperature of the Earth’s surface and the upper ocean. Which brings us to the third major factor delaying non-scientists’ acceptance of the realities of global warming and encouraging dangerous policy inaction: denialism.

The recent relative attenuation of increase in surface temperatures, likely soon to be over, was not thought of by scientists as disproving climate models or suggesting a stoppage of warming. But it was claimed by those denying the science as evidence that global warming is not real and that the climate scientists have it all wrong. That is only one form of denialism, which also includes the idea that yes, warming is happening, but does not matter, or yes, it matters, but we can’t do anything about it, or yes, we could do something about it, but the Chinese will not act (there is little evidence of that by the way, they are acting) so we’re screwed anyway. Etc.

The slowdown in global warming is not real, but a decades-long slowdown in addressing global warming at the individual, corporate, and governmental levels is very real, and very meaningful. There is no doubt that had we started to act aggressively, say, back in the 1980s, when any major hurdles for overall understanding of the reality of global warming were overcome, we would be way ahead of where we are now in the effort to keep the Carbon in the ground by using clean energy. The precipitous drop we’ve seen in photovoltaic costs, increases in battery efficiency and drops in cost, the deployment of wind turbines, and so on, would have had a different history than they have in fact had, and almost certainly all of this would have occurred faster. Over the last 30 or 40 years we have spent considerable effort building new sources of energy, most of which have used fossil Carbon. If even half of that effort had been spent on increasing efficiency and developing non-fossil-Carbon sources, we would not have reached an atmospheric concentration of CO2 of 400ppm in 2015. The effects of greenhouse gas pollution would be less today and we would not be heading so quickly towards certain disaster. Shame on the denialists for causing this to happen.

I should mention a fourth cause of inappropriate rejection of the science of climate change. This is actually an indirect effect of climate change itself. You all know about the Inhofe Snowball. A US Senator actually carried a snowball into the senate chamber, a snowball he said he made outside where there has been an atypical snowfall in Washington DC, and held it aloft as evidence that the scientists had it all wrong, and that global warming is a hoax. Over the last few years, we have seen a climatological pattern in the US which has kept winter snows away from the mountains of California, contributing significantly to a major drought there. The same climatological phenomenon has brought unusual winter storms to states along the Eastern Seaboard that usually get less snow (such as major snow storms in Atlanta two winters ago) and persistent unseasonal cold to the northeastern part of the US. This change in pattern is due to a shift in the behavior of the Polar jet stream, which in turn is almost certainly caused by anomalous very warm water in parts of the Pacific and the extreme amplification of anomalous warm conditions in the Arctic, relative to the rest of the planet. (The jury is still out as to the exact process, but no serious climate scientists working on this scientific problem, as far as I know, doubts it is an effect of greenhouse gas pollution). This blob of cold air resting over the seat of power of one of the more influential governments in the world fuels the absurd but apparently effective anti-science pro-fossil fuel activism among so many of our current elected officials.

Climate Sensitivity Is Not Low

The concept of “Climate Sensitivity” is embodied in two formulations that each address the same basic question: given an increase in CO2 in the atmosphere, how much will the Earth’s surface and upper ocean temperatures increase? The issue is more complex than I’ll address here, but here is the simple version. Often, “Climate sensitivity” is the amount of warming that will result from a doubling of atmospheric CO2 from pre-industrial levels. That increase in temperature would take a while to happen because of the way climate works. On a different planet, equilibrium would be reached faster or slower. Historically, the range of climate sensitivity values has run from as low as about 1.5 degrees C up to 6 degrees C.

The difficulty in estimating climate sensitivity is in the feedbacks, such as ice melt, changes in water vapor, etc. For the most part, feedbacks will increase temperature. Without feedbacks, climate sensitivity would be about 1.2 degrees C, but the feedbacks are strong, the climate system is complex, and the math is a bit higher level.

As time goes by, our understanding of climate sensitivity has become more refined, and it is probably true that most climate scientists who study this would settle on 3 degrees C as the best estimate, but with wide range around that. The lower end of the range, however, is not as great as the larger end of the range, and the upper end of the range probably has what is called a “fat tail.” This would mean that while 3 degrees C is the best guess, the probability of it being way higher, like 4 or 5, is perhaps one in ten. (This all depends on which model or scientist you query.) The point here is that while it might be 3, there is a non-trivial chance (one in ten is not small for an extreme event) that it would be a value that would be really bad for us.
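A toy illustration of what a “fat tail” means in practice: a right-skewed (lognormal) distribution whose parameters below are my own illustrative choices, picked only to match a median of 3 degrees C and roughly one-in-ten odds above 4.5, not taken from any particular study:

```python
from statistics import NormalDist
import math

# Illustrative right-skewed distribution for sensitivity S:
# ln(S) is normal with median 3 degrees C and spread parameter 0.3.
MEDIAN, SIGMA = 3.0, 0.3

def prob_above(threshold):
    """P(S > threshold) when ln(S) ~ Normal(ln(MEDIAN), SIGMA)."""
    z = (math.log(threshold) - math.log(MEDIAN)) / SIGMA
    return 1.0 - NormalDist().cdf(z)

print(round(prob_above(4.5), 2))  # → 0.09, i.e. roughly one in ten
```

The asymmetry is the point: with these parameters the chance of the value being 1.5 degrees or more *below* the median is far smaller than the chance of it being 1.5 degrees or more above it, which is what makes the upper tail “fat” and hard to ignore.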

Anyway, Dana Nuccitelli has a recent post in The Guardian that looks at climate sensitivity in relation to “The Single Study Syndrome.”

There have been a few recent studies using what’s called an “energy balance model” approach, combining simple climate models with recent observational data, concluding that climate sensitivity is on the low end of IPCC estimates. However, subsequent research has identified some potentially serious flaws in this approach.

These types of studies have nevertheless been the focus of disproportionate attention. For example, in recent testimony before the US House of Representatives Committee on Science, Space and Technology, contrarian climate scientist Judith Curry said,

Recent data and research supports the importance of natural climate variability and calls into question the conclusion that humans are the dominant cause of recent climate change: … Reduced estimates of the sensitivity of climate to carbon dioxide

Curry referenced just one paper (using the energy balance model approach) to support that argument – the very definition of single study syndrome …

…As Andrew Dessler told me,

There certainly is some evidence that climate sensitivity may be below 2°C. But if you look at all of the evidence, it’s hard to reconcile with such a low climate sensitivity. I think our best estimate is still around 3°C for doubled CO2.

So there is no new information suggesting a higher climate sensitivity, or a quicker realization of it, but there is a continuation of the consensus that the value is not low, despite efforts by so-called lukewarmers and denialists to throw cold water on this hot topic.

Important Carbon Sink May Be Limited.

A study just out in Nature Geoscience suggests that one of the possible factors that may mitigate global warming, the terrestrial carbon sink, is limited in its ability to do so. The idea here is that as CO2 increases, some biological activities at the Earth’s surface increase and store some of the carbon in solid form as biomass. Essentially, the CO2 acts as plant fertilizer, and some of that carbon is trapped in the detritus of that system, or in living tissue. This recent study suggests that this sink is smaller than previously suspected.

Terrestrial carbon storage is dependent on the availability of nitrogen for plant growth… Widespread phosphorus limitation in terrestrial ecosystems may also strongly regulate the global carbon cycle… Here we use global state-of-the-art coupled carbon–climate model projections of terrestrial net primary productivity and carbon storage from 1860–2100; estimates of annual new nutrient inputs from deposition, nitrogen fixation, and weathering; and estimates of carbon allocation and stoichiometry to evaluate how simulated CO2 fertilization effects could be constrained by nutrient availability. We find that the nutrients required for the projected increases in net primary productivity greatly exceed estimated nutrient supply rates, suggesting that projected productivity increases may be unrealistically high. … We conclude that potential effects of nutrient limitation must be considered in estimates of the terrestrial carbon sink strength through the twenty-first century.

Relatedly, the Amazon carbon sink is also showing a long-term decline in its effectiveness.

Permafrost Feedback

From Andy Skuce writing at Skeptical Science:

We have good reason to be concerned about the potential for nasty climate feedbacks from thawing permafrost in the Arctic…. [Does recent] research bring good news or bad? [From recent work on this topic we may conclude that] although the permafrost feedback is unlikely to cause abrupt climate change in the near future, the feedback is going to make climate change worse over the second half of this century and beyond. The emissions quantities are still uncertain, but the central estimate would be like adding an additional country with unmitigated emissions the current size of the United States’ for at least the rest of the century. This will not cause a climate catastrophe by itself, but it will make preventing dangerous climate change that much more difficult. As if it wasn’t hard enough already.

Expect More Extreme Weather

Michael D. Lemonick at Climate Central writes:

disasters were happening long before humans started pumping heat-trapping greenhouse gases into the atmosphere, but global warming has tipped the odds in their favor. A devastating heat wave like the one that killed 35,000 people in Europe in 2003, for example, is now more than 10 times more likely than it used to be…. But that’s just a single event in a single place, which doesn’t say much about the world as a whole. A new analysis in Nature Climate Change, however, takes a much broader view. About 18 percent of heavy precipitation events worldwide and 75 percent of hot temperature extremes — defined as events that come only once in every thousand days, on average — can already be attributed to human activity, says the study. And as the world continues to warm, the frequency of those events is expected to double by 2100.

Melting Glaciers Are Melting

This topic would require an entire blog post in itself. I’ll give just an overview here. Over the last year or so, scientists have realized that more of the Antarctic glaciers are melting, and melting faster, than previously thought, and a few big chunks of ice have actually floated away or become less stable. There is more fresh water flowing from glacial melt into the Gulf of Alaska than previously thought. Related to this, as well as to changes in currents and increasing sea temperatures, sea level rise is spiking sharply.

The Shifting Climate

I mentioned earlier that the general upward trend of surface temperature has a certain amount of natural variation superimposed on it. Recent work strongly suggests that a multi-decade variation, an up and down squiggle, which has mostly been in its down phase over recent years, is about to turn upward. This is a pretty convincing study that underscores the currently observed month-by-month warming, which has been going on for over a year now. It is not clear that the current acceleration in warming is the beginning of this long-term change; that will be known only after a few years have gone by. But it is important to remember that nothing new has to happen, no new scientific finding has to occur, for us to understand right now that the upward march of global surface temperatures is going to be greater on average than the last decade or so has suggested. We have been warming all along, but lately much of that warming has been in the oceans. Expect surface temperatures to catch up soon.

Mann's False Hope Graphic Presentified

I needed a copy of the “False Hope Graph” that Michael Mann painstakingly created for his Scientific American piece “Earth Will Cross the Climate Danger Threshold by 2036” for a presentation I’m doing, but it had to be simpler, leave some stuff off, and be readable across the room on a screen. The original graphic looks like this:
[Image: Mann’s original graph from “Earth Will Cross the Climate Danger Threshold by 2036”]

It is a major contribution showing the relationship between climate sensitivity and climate change in the future depending on various important factors. The graphic I made from it is here (click on it to get the big giant version):

[Image: the simplified 2014 version of the Mann graphic]

You’ll notice I left only one sensitivity + aerosol forcing line on it because in my talk I’ll use that as the most likely. Some of you might find it helpful.

2036 and Climate Change

After 16 minutes, Michael Mann on climate change, climate sensitivity, etc.

Why does Joan of Arc look so worried? The fire hasn't even touched her!
Mann uses the analogy of a person jumping (or being thrown?) off a tall building who, as he passes the third floor, notes that everything is fine. Another analogy that might be helpful is being burned at the stake. After they tie you to the stake and pile up the wood, you’re fine. Then they light the wood on fire and you’re still fine. For a while.

The climate sensitivity graph above is from here.

Unsure of Climate Science's Predictions? Do it yourself!

Well before mid-century we will probably pass a threshold beyond which we’ll really regret not having curtailed the release of fossil carbon into the atmosphere in the form of carbon dioxide. The best case scenario for “business as usual” release of the greenhouse gas is that some of the carbon, or some of the heat (from sunlight), gets taken out of the main arena (the atmosphere and sea surface) and buried or reflected somewhere for a while, so that this all happens on a slightly delayed time scale.

The reason we know this is a little thing called science. And, more exactly, physics. And physics is math embedded in reality (or reality draped on math if you like), so there’s also math. And here is the formula:

[Image: the formula]

For instructions as to how to use this formula to understand the statements in the first paragraph of this post, including the data you need to do the calculations, visit this new item on Scientific American’s web site, Why Global Warming Will Cross a Dangerous Threshold in 2036, by climate scientist Michael Mann. He’s also got an article in the print edition of Scientific American, which I’ve not seen because I let my subscription lapse.
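Mann’s own calculation is more sophisticated than this (it uses an actual energy balance model and emissions scenarios), but a back-of-envelope sketch in the same spirit shows why a date in the 2030s falls out. The constant CO2 growth rate and the use of equilibrium rather than transient warming here are my simplifying assumptions, not Mann’s method:

```python
def crossing_year(threshold_dC=2.0, sensitivity=3.0,
                  start_year=2014, start_ppm=400.0, growth_ppm_per_yr=2.5):
    """Year when equilibrium warming over the 280 ppm pre-industrial
    baseline first exceeds threshold_dC, assuming CO2 rises linearly.
    A rough sketch, not Mann's actual calculation."""
    # Invert dT = S * log2(C / 280) to find the crossing concentration.
    c_cross = 280.0 * 2.0 ** (threshold_dC / sensitivity)
    return start_year + (c_cross - start_ppm) / growth_ppm_per_yr

print(round(crossing_year()))  # lands in the early 2030s
```

Even this crude version puts the 2 degree C threshold only a couple of decades out, which is the qualitative point of the article.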

A New Fake Report On Climate Change.

Who What When Where

Nic Lewis, an unaffiliated, self-described climate scientist, and Marcel Crok, a journalist, also unaffiliated, are known climate science denialists. The two of them have an objection to the Intergovernmental Panel on Climate Change (IPCC) conclusions regarding an important thing called “climate sensitivity.” Perhaps unable to get their work into the peer reviewed literature, the two of them wrote “a report” titled “OVERSENSITIVE: How the IPCC hid the good news on global warming,” which is available here. They make a claim which is totally incorrect, but which, if it were correct, would be important. But it’s not. Either.

Imagine a Spherical Earth

Climate sensitivity is a term that refers to more than one thing, but the basic idea is this: if CO2 concentrations in the atmosphere were to double, how much would global surface temperatures rise? It is usually considered from a “baseline” of 280 parts per million (ppm), the pre-industrial level. We are currently at 400 ppm and we are heading for 560, the doubling, with little apparent serious effort (in my opinion) to curtail the rise. Climate sensitivity is expressed in degrees Celsius. So if someone says “climate sensitivity is 2,” then they mean that we can expect global surface temperatures to reach 2 degrees above baseline given 560 ppm of CO2 in the atmosphere.
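To put a rough timescale on “heading for 560”: assuming CO2 keeps rising at something like its recent rate (the 2.5 ppm per year figure is my approximate assumption; the actual growth rate has been accelerating, so the real doubling would likely come sooner):

```python
def years_to_doubling(current_ppm=400.0, target_ppm=560.0,
                      growth_ppm_per_yr=2.5):
    """Years until CO2 reaches the doubled (560 ppm) concentration,
    assuming a constant growth rate; accelerating real-world growth
    would get there sooner than this."""
    return (target_ppm - current_ppm) / growth_ppm_per_yr

print(years_to_doubling())  # 64.0 years at a steady 2.5 ppm/yr
```

In other words, on a straight-line extrapolation the doubling arrives well within the lifetime of people already born.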

Imagine a spherical earth. Imagine no water vapor in the atmosphere, and just to keep things simple, let’s have only land surface and no ocean. But the amount of air and its overall composition minus the water vapor is like our actual earth. On this imaginary earth, climate sensitivity is about 1.2. That’s apparently pretty easy to figure out because it is a matter of how CO2 operates as a greenhouse gas and how much energy the sun supplies, etc.

However, there could be negative and positive feedbacks that would make this work out differently. These are things that make some of the sun’s energy have either less of an effect or more of an effect. Aerosols (dust) in the atmosphere, such as volcanic dust, can reflect sunlight away before it hits the earth’s surface, so it contributes less to heating the planet (which sunlight does mainly at the surface, where it converts to infrared radiation). Ice and snow also reflect sunlight away (that’s called albedo). Water vapor in the atmosphere generally acts like a greenhouse gas and causes more warming by, to oversimplify a bit, interfering with the process of infrared heat leaving the atmosphere. Increased CO2 ultimately leads to more water vapor in the atmosphere, thus significantly amplifying warming. Warming can cause the release of methane into the atmosphere, another greenhouse gas, which in turn causes more warming until it oxidizes into CO2 and water. Water vapor can also get organized as clouds distributed in such a way as to add to albedo, reflecting away sunlight and decreasing warming.

With all these (and other) effects tugging this way and that on the temperature of the earth’s surface (by which we mean the atmosphere and the upper layer of the seas), how is one to figure out what actual climate sensitivity is?

Well, it is hard, and there has been a lot of work on it. There are papers coming out all the time on this topic. The IPCC spent a lot of effort on it. And, there are two answers to the question “what is the sensitivity of the climate?”

(Before giving you the answers, I want to point something out that is very important. The Earth’s surface does not warm up instantly as CO2 is added. It takes time. In fact, the changes that happen after CO2 is added to the atmosphere will continue for something like thousands of years. But the initial change, which involves the air heating up and weather systems changing and all that, would be observable over decades and would reach a rough short-term stability sooner, on a time scale of many decades to centuries. So there are two “climate sensitivities,” long-term equilibrium and transient, the latter being what is generally talked about, with the idea of a multi-decade time scale. So, the question we are asking is: what will the earth be like at the end of the century, given a doubling of CO2 in the atmosphere?)
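The transient-versus-equilibrium distinction can be illustrated with a toy one-box response that relaxes exponentially toward equilibrium, T(t) = S(1 − e^(−t/τ)). The 30-year e-folding timescale is a made-up illustrative value; the real system mixes a fast atmospheric response with a very slow deep-ocean one:

```python
import math

def transient_warming(t_years, sensitivity=3.0, tau_years=30.0):
    """Warming t years after an instantaneous CO2 doubling, in a toy
    one-box model that relaxes exponentially toward the equilibrium
    sensitivity with e-folding time tau_years."""
    return sensitivity * (1.0 - math.exp(-t_years / tau_years))

for yr in (10, 30, 100, 1000):
    print(yr, round(transient_warming(yr), 2))
# Early decades realize only part of the eventual warming;
# the full ~3 deg C equilibrium is approached over centuries.
```

This is why a transient sensitivity quoted over decades is always smaller than the equilibrium number, even though both describe the same doubling.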

So, back to the answers. One answer is the simple answer, and it is 3. This is the number that climate scientists seem to settle on when you hold them down and say “shut up with the mumbo jumbo, just give me a number.” The other answer is about 1.5 to 4.5 but possibly higher at the higher end.

Some who wish to minimize the importance of climate change will say things like “1.5. That’s a small number, what are you worried about?” Those people are boneheaded idiots and they are hoping you are too. Is 1.5 a small number? A large number? It depends. If I take 1.5 pennies from you it is a small number. If I kill you 1.5 times, it is a large number. Suffice it to say that 1.5 is a big enough number that we should be worried about it. Also, it is a low ball estimate of climate sensitivity. Almost nobody believes it. By one reckoning, there is something like a 5% chance that the sensitivity is actually around 6. Holy crap. That would probably melt almost every single drop of glacial ice on the planet and the map of the United States would look like this, in a couple/few centuries:

Extreme_Sea_Level_Rise_Scenario

It would matter if there was a 20% chance that this is the map of the US your great grand children get to live with. They would actually have to remove stars from the US flag. If there is a US.

Below I supply a list of web pages you can check out to learn all about climate sensitivity.

But what about this report? Well, it’s a doozy. First, it has a foreword extolling the virtues of Lewis and Crok. That’s nice. But the foreword is written by climate science denialist Judith Curry. That does not bode well. Following this, the report is mainly a journey through a cherry orchard.

The adventures of Lewis and Crok

The report cherry picks a subset of scientific results that show lower sensitivity estimates and does a poor job of ruling out the other results that give higher estimates. They criticize the IPCC report, which summarized sensitivity studies, for leaving out the “good news” that climate sensitivity is actually very very low, by reporting a wide range of research indicating that it is not low. In other words, and I know this seems confusing but I think this is the point, Lewis and Crok are saying that the IPCC report is wrong because it reported all of the relevant scientific findings rather than just the ones Lewis and Crok would like to have seen noted.

DOES THE IPCC NOT KNOW ABOUT CHERRY PICKING YOU MAY ASK???

Sorry for shouting.

The authors suggest that the teams of scientists working on the IPCC report did not understand basic statistics, and that this contributed to their alleged overestimate of climate sensitivity. That part made me laugh.

Lewis and Crok put a lot of weight on what they term the observational record, which, as you might guess if you have been following the denialists’ literature, is one of the best places to pick cherries. Also, astonishingly and, really, laughably, they rely on Lewis’ prior publications suggesting lowball estimates of climate sensitivity. Yes, some guys have been pushing a particular, scientifically difficult to support position; the world’s scientists in a major international effort produced a summary of countless hours of research and dozens of peer reviewed papers that disagree with those guys; and those guys write a report about how what they’ve been saying all along, which differs from the established science, must be right because they’ve been saying it all along!

Yes, that’s about what this report amounts to. It’s a bunch of hooey.

For further reading on climate sensitivity I recommend the following:

NEW: GWPF optimism on climate sensitivity is ill-founded

“On Sensitivity” at Real Climate

“A Bit More Sensitive” on Real Climate

Climate-Change Deniers Must Stop Distorting the Evidence

How sensitive is our climate? at Skeptical Science


Also of interest: In Search of Sungudogo: A novel of adventure and mystery, which is also an alternative history of the Skeptics Movement.

Matt Ridley Wrong, John Abraham Right

Formerly respected writer Matt Ridley has lately been making a fool of himself with absurd and scientifically unsupported commentary on climate change. Most recently he wrote something for the Wall Street Journal, “Dialing Back the Alarm on Climate Change,” that serves as an example of this.

Professor John Abraham has also provided an item for the Wall Street Journal that addresses Ridley’s goof. As Abraham puts it, “Matt Ridley states that a forthcoming major climate change report will lower the expected temperature rise we will experience in the future (“A Reprieve From Climate Doom,” Review, Sept. 14). He also claims that the temperature rise will be beneficial. I was an expert reviewer of the report.”

Read John’s full letter to the WSJ here. In it you’ll find the link to Ridley’s piece.