Tag Archives: Climate Change

Comparing models and empirical estimates of noise in the climate system

This is Part I of a two part treatment of new research on climate change. Part II is here.

There is a new paper out, Comparing the model-simulated global warming signal to observations using empirical estimates of unforced noise, by Patrick T. Brown, Wenhong Li, Eugene C. Cordero & Steven A. Mauget. It is potentially important for two reasons. One is because of what it says about how to interpret the available data on global warming caused by human generated greenhouse gas pollution. The other is because of the way in which the results are being interpreted, willfully or through misunderstanding, by climate science contrarians.

I will have a more detailed post on this in a few days, after I’ve gotten responses back from Patrick Brown, lead author, to a number of questions. For now I wanted to make a few preliminary remarks.

These two features … the part about how to interpret the record vs. the part about climate-contrarian commentary … are entirely unrelated. First, a few comments about the science.

Science is often about measuring, describing, and explaining variation. There are usually two main sources of variation. One is variation in a natural system. The other is variation in measurement or some other aspect that amounts to error or noise. Some of that noise is actually part of the system, but not the part you are trying to study. In climate science, the error, the noise, and the part of the variation that is not of direct relevance to understanding climate change can all be thought of as “unforced variation.” Forced variation is the change in the Earth’s temperature (say, at the surface) caused by variation in the sun’s output, the effects of greenhouse gases, the cooling effects of aerosols (dust, etc.), and so on. Unforced variation is a large part of the cause of the wiggles we see in global temperature measurements over time.

The question is, when we see an uptick or downtick, fast or slow or of any particular configuration, in the march of global surface temperature over time, what does that variation mean? Does it mean that the climate system is responding to a forcing (greenhouse gas, the sun, aerosols, etc.), or does it mean that there is random-esque noisy stuff going on?
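
To make this concrete, here is a minimal toy sketch (my own illustration, not anything from the paper): a steady forced warming trend plus persistent "unforced" noise. The trend, persistence, and noise amplitude below are invented values, chosen only to show how noise alone makes decadal trends wander.

```python
# Toy model: observed temperature = forced trend + AR(1) unforced noise.
import numpy as np

rng = np.random.default_rng(42)

years = np.arange(1900, 2015)
forced = 0.008 * (years - years[0])   # idealized forced warming, deg C per year

# AR(1) noise: part of each year's unforced anomaly persists into the next.
phi, sigma = 0.6, 0.1                 # assumed persistence and noise amplitude
noise = np.zeros(len(years))
for t in range(1, len(years)):
    noise[t] = phi * noise[t - 1] + rng.normal(0, sigma)

observed = forced + noise

# Decadal trends of the *observed* series wander even though the forced
# trend is perfectly constant; these are the "wiggles."
for start in range(0, len(years) - 10, 10):
    s = slice(start, start + 10)
    slope = np.polyfit(years[s], observed[s], 1)[0]
    print(years[start], f"trend: {slope * 10:+.2f} deg C per decade")
```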

Climate scientists have some standard ways of handling variation, and look very closely at these wiggles in temperature curves to try to understand them. This study, by Brown et al, takes a somewhat different (but not outrageously different) look at forced and unforced variation.

Here, I will parse out the abstract of the paper for you:

The comparison of observed global mean surface air temperature (GMT) change to the mean change simulated by climate models has received much public and scientific attention.

Models have predicted a certain set of upward curves of global temperatures (varying across different models or model assumptions). While the actual upward trend of surface temperatures has been as predicted overall, the actual curve is never right where the models say it will be. This is expected. Over the last decade or two, the actual curve has been lower than model predictions. The curve has at present (since early 2014) been shooting upwards, and we may expect to see the actual curve move to the other side of (above) the center of the predictions. This reflects expected and manageable differences between modeled projections and reality.

For a given global warming signal produced by a climate model ensemble, there exists an envelope of GMT values representing the range of possible unforced states of the climate system (the Envelope of Unforced Noise; EUN).

The models predict not just a central line but a range of values, an envelope. The envelope (with upper and lower bounds) is the noise around the central, meaningful projected change.
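
Continuing the toy example above (again my illustration, not the paper's actual EUN construction), one way to picture such an envelope is to lay many simulated noise realizations over the forced signal and take quantiles at each point in time:

```python
# Toy envelope: forced signal plus many AR(1) noise realizations,
# bounded by the 2.5th and 97.5th percentiles at each time step.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2015)
forced = 0.008 * (years - years[0])

def ar1(n, phi=0.6, sigma=0.1):
    """One realization of AR(1) noise of length n."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0, sigma)
    return x

sims = forced + np.array([ar1(len(years)) for _ in range(1000)])
lower, upper = np.percentile(sims, [2.5, 97.5], axis=0)

print(f"envelope half-width in {years[-1]}: ~{(upper[-1] - lower[-1]) / 2:.2f} deg C")
```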

Typically, the EUN is derived from climate models themselves, but climate models might not accurately simulate the correct characteristics of unforced GMT variability.

This gets to the basic problem of variation. How do we measure and characterize it? Most people who do modeling that I’ve spoken to don’t think the models do a poor job of estimating the noise, and I think Brown et al do not think they do a bad job either. But Brown et al take a look at unforced variation (EUN) in the climate system from a different view, by looking at actual data to compare with modeled data.

Here, we simulate a new, empirical, EUN that is based on instrumental and reconstructed surface temperature records.

So this new measure will produce the same sort of measurement previously used by climate scientists, but using instrumental (thermometers and satellites) measurements for recent years and proxy indicators (like corals that indicate temperature changes over time, etc.) for longer periods, over the last 1,000 years.

We compare the forced GMT signal produced by climate models to observations while noting the range of GMT values provided by the empirical EUN. We find that the empirical EUN is wide enough so that the interdecadal variability in the rate of global warming over the 20th century does not necessarily require corresponding variability in the rate-of-increase of the forced signal. The empirical EUN also indicates that the reduced GMT warming over the past decade or so is still consistent with a middle emission scenario’s forced signal, but is likely inconsistent with the steepest emission scenario’s forced signal.

Brown et al found that more traditional methods do a good job of estimating the track of warming and its variation, but may underestimate the degree to which the global warming signal can wander in one direction or another (“wiggles”) when looking at temperature change over decades-long periods. Brown notes, “Our model shows these wiggles can be big enough that they could have accounted for a reasonable portion of the accelerated warming we experienced from 1975 to 2000, as well as the reduced rate in warming that occurred from 2002 to 2013.”

Global warming has a long term pattern, with warming from the 1910s-1940s, a hiatus from the 1940s-1970s, and resumed warming from the 1970s-2000s. The question is, is this pattern (or other shorter term patterns) mostly the meaningful result of forced changes in climate (from greenhouse gasses and aerosols, mainly), mainly random noise, or about an even mix? Most climate scientists would probably say these longer term changes are mainly forced with a good dose of noise, while Brown et al might say that noise can explain more than previously thought.

The main criticism I’ve heard from colleagues about Brown et al is that there is too much variation in the empirical data (especially the proxies), and that they may have increased the variation that seems to come from errors by compounding errors. I’m not sure if I agree with that or not. Still thinking about it.

So that’s the science. To me this is very interesting because, as a scientist, I’ve been especially interested in variation and how to observe and explain it. But in a sense, while potentially important, this paper is mostly a matter of dotting the i’s and crossing the t’s. Important i’s and t’s, to be sure. But from the outside looking in, from the point of view of the average person, this is not a change in how we think about global warming. It is a refinement.

But, Brown et al also looked at another thing. They looked at how the data behave under different starting assumptions about the course of global warming, called Representative Concentration Pathways (RCPs). RCPs are different projected scenarios of greenhouse gas forcings. Basically, a bunch of scientists put their heads together and came up with a set of different what-ifs, which differ from one another on the basis of how much greenhouse gas goes into the atmosphere and over what time period.

RCPs are important because the amount of greenhouse gas that is released matters, and the timing of that release matters. As well, the presumed decrease in release, and when that happens, matters. It also matters that some greenhouse gas goes away, converts to some other greenhouse gas, etc. So the whole thing actually turns out to be mind-numbingly complicated and numerically difficult to manage. The different RCPs are actually big giant piles of pre-calculated numbers, based on specified assumptions, that climate modelers can download and use in their models.
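
To see why the pathway matters, here is a toy sketch (mine, with idealized stand-in forcing numbers, not the real RCP data): drive a very simple zero-dimensional energy-balance model with a forcing that keeps rising versus one that levels off.

```python
# Toy energy-balance model: dT/dt = (F - lam * T) / C, stepped yearly.
import numpy as np

years = np.arange(2000, 2101)
forcing_high = np.linspace(2.0, 8.5, len(years))                  # keeps rising
forcing_mid = np.minimum(np.linspace(2.0, 8.5, len(years)), 6.0)  # levels off

def respond(F, lam=1.25, C=8.0):
    """Integrate the toy model; lam and C are assumed, illustrative values."""
    T = np.zeros(len(F))
    for t in range(1, len(F)):
        T[t] = T[t - 1] + (F[t] - lam * T[t - 1]) / C
    return T

print("2100 warming, rising pathway:  ", round(respond(forcing_high)[-1], 2), "C")
print("2100 warming, leveling pathway:", round(respond(forcing_mid)[-1], 2), "C")
```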

Here is the bottom line with respect to RCPs. Brown et al, if correct, show that unforced variation — noise — will behave differently under different RCPs. Specifically,

We also find that recently observed GMT values, as well as trends, are near the lower bounds of the EUN for a forced signal corresponding to the RCP 8.5 emissions scenario but that observations are not inconsistent with a forced signal corresponding to the RCP 6.0 emissions scenario.

This is the part of Brown et al that will get the most criticism from climate scientists, and that is most abused by denialists. If you just look at the words, it looks like Brown et al are saying that RCP 8.5, the most extreme of the scenarios, is less likely than RCP 6.0. But they can’t say that. We don’t get to say which RCP is most likely. This, rather, is something we do. The different RCPs are scenarios of what humans (and volcanoes, I suppose) do, not how the climate responds to what we do. In short, Brown et al have noted that the interaction between the internal workings of climate change and what we see as the result (in terms of surface warming) produces a range of different patterns of the actual temperature wiggling around an ideal projected line. That is rather esoteric. It is useful. But it does not say that one or another scenario is more or less likely … because it does not address that question. Brown et al also say nothing about the validity of climate models … they did not look at that. Rather, they have provided an insight, if their work holds up, on how random wanderings of reality from projections, in both directions (cooler than expected, warmer than expected) will emerge depending on what we do with our greenhouse gases.

If you drive faster, when you skid on the ice, you will skid farther. Doesn’t change where the road is.

Part II is here.

ADDED: As I suspected, more and more contrarian misuse of this work is happening. Even Rush Limbaugh has misquoted the research, as did the Daily Mail (though there, mainly in their headlines and bullet points … their actual “reporting” is mostly cut and paste from the press release, so that’s lazy journalists and incompetent, mean-spirited editors!). Anyway, you can read all about it here in excellent coverage by Media Matters.

How To Demolish Climate Denial

John Cook, of the University of Queensland, and his colleagues, have created a MOOC … Massive Open Online Course … called “Making Sense of Climate Science Denial.”

Why does this matter? How does it work? What can you do?

All of these questions are answered here: University offering free online course to demolish climate denial.

Fight sticky myths with even stickier facts.

Do go check it out. See you in class!


Check out: The First Earth Day, an epoch journey into politics, explosions, folk music, and old boats floating on stinking rivers.
___________________

An Improved Classification And Explanation For El Nino (New Research)

A new study seems to provide a better way to categorize El Nino climate events, and offers an explanation for how different kinds of El Nino events emerge.

El Nino is part of a large scale, very important climate phenomenon in the Pacific Ocean, generally referred to as the El Nino Southern Oscillation (ENSO). Over time (years) wind and water currents move heat into the upper levels of the Equatorial Pacific (La Nina). Then, over time (months) the heat comes back out – that is an El Nino. The effects can be dramatic. During El Nino years, trade winds and monsoons may behave differently than normal. How much precipitation falls and where it falls can change over large regions. Deserts become lakes, good croplands are drought stricken, sea levels change across large portions of the coast.

It is interesting to contemplate the following thought experiment (sorry, a bit of a digression). Imagine if all of the conditions associated with El Nino happened all the time, and had been happening for centuries. An El Nino that is always there is not really an El Nino. It is normal. Those parts that are dry would be dry; plants and animals, including people, would be dry adapted there in physiology, ecology, and behavior. Same for wet places. It wouldn’t be a desert covered with a lake, it would just be a lake. It wouldn’t be a drought, but just a desert. Etc. The point of this is to underscore the real meaning of El Nino: change. It isn’t so much what it does, but rather, that this climate event’s effects are sudden, dramatic, and occasional.

El Nino has been off and on in the news over the last year or so because it looked like there was going to be a really big one in 2014, but it never materialized. (Even without an El Nino, which warms the surface of the Earth, 2014 was still a record breaking warm year.) Now, El Nino is in the news again because finally we kinda sorta are having one, and a future (this year and next) super double El Nino is being predicted.

Why is El Nino prediction so difficult, and why, when an El Nino happens, may it be very different from some other El Nino that happened before, in its overall intensity and in the details of what it causes to happen elsewhere in the world?

You can hear them screaming. The climatologists. “Why? Why? WHY?!?!?” Because this is really a big thing and it would be really nice to be better at predicting it.

A new study has taken an important step in understanding, and ultimately, predicting El Nino. “Strong influence of westerly wind bursts on El Nino diversity” by Chen et al, published in Nature Geoscience, makes two related points. First, the authors presupposed the existence of three kinds of El Ninos. It has long been thought that El Ninos can be classified into different categories, but the number and nature of those categories varies across groups of researchers. I asked the author if they tried using other a priori numbers for the El Nino categories. “Yes, we did try using other cluster numbers,” Dr. Chen told me. “If it’s set to 2, we would have the extreme El Nino and a broad cluster that include both the canonical and the warm-pool El Nino. If it’s set to 4, we would still see the 3 types we identified but with a 4th type that’s not well separated from the canonical El Nino. In any case we had only one type of La Nina. These discussions will be included in a long paper to be submitted to Journal of Climate.”

Chen et al used a method of modeling El Nino that is different than what is usually done and with this method successfully classified all of the El Nino events over a 50 year time period into these three categories. Second, Chen et al show that the main variable that determines what kind of El Nino happens is the intensity and location of westerly wind bursts (WWBs). I also asked if other variables used in their model (discussed below) were changed to see what would happen. Dr. Chen noted, “we did play with different model settings and parameters, and the outcome turned out to be fairly robust. We are very confident with our results.”

First, a brief note on the method. The usual way of managing the complex phenomenon of El Nino … of measuring stuff and stuffing the measurements into a mathematical model … is called empirical orthogonal function (EOF) analysis. This involves measuring key variables across a grid covering the Pacific Equatorial region. Then you take the measurements, simplify how they are organized, and turn a multidimensional time-space problem into one with fewer dimensions. There are different ways to do this, but they all fall into the widely used methodology that includes principal component analysis and other things you may be familiar with. You take something really complicated and derive simplified (somewhat) data that is more usable for characterizing a phenomenon or predicting the phenomenon’s behavior, while at the same time not throwing away too much of the meaningful variation in the system. This method, however, is fairly linear and deterministic. A bunch of variables are thought to cause outcome A (which has variants), and this bunch of variables are combined so you have only X and Y causing A.
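
Roughly speaking (and simplifying a great deal), EOF analysis amounts to principal component analysis applied to a time-by-gridpoint matrix of anomalies. A bare-bones sketch, with random numbers standing in for real sea surface temperature fields:

```python
# Sketch of EOF analysis: SVD of a (time x gridpoint) anomaly matrix.
import numpy as np

rng = np.random.default_rng(1)
n_time, n_grid = 600, 200                  # e.g. 50 years of months, a small grid
field = rng.normal(size=(n_time, n_grid))  # stand-in for gridded SST data

anom = field - field.mean(axis=0)          # remove the time mean at each point

# Rows of Vt are the EOFs (spatial patterns); U * S gives the
# principal-component time series (how strongly each pattern is expressed).
U, S, Vt = np.linalg.svd(anom, full_matrices=False)

eofs = Vt[:3]                              # leading three spatial patterns
pcs = U[:, :3] * S[:3]                     # their amplitudes through time
explained = (S**2 / (S**2).sum())[:3]
print("fraction of variance in leading EOFs:", np.round(explained, 3))
```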

Chen et al applied a different (but well established) technique that presumes less about the linear nature of the model’s components and allows complex interrelationships, which may vary across conditions, to remain. It is called the fuzzy clustering method. In this method, the data are allowed to decide on their own (more or less) how they should be organized, and (this is the fuzzy part) individual bits of data are actually allowed to occupy more than one cluster. For many systems, the two methods would result in similar outcomes, but when a system is less linear the second method may be more realistic.
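
Here is a bare-bones sketch of fuzzy c-means clustering (the generic algorithm, not Chen et al's actual code; the "event" feature vectors below are random stand-ins). The key point is that each data point carries a graded membership in every cluster rather than a single hard label.

```python
# Minimal fuzzy c-means: alternate between weighted centroids and
# distance-based membership updates for a fixed number of iterations.
import numpy as np

def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(iters):
        w = u ** m                             # fuzzified membership weights
        centers = (w.T @ X) / w.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / d ** (2 / (m - 1))           # closer clusters get more weight
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# Three loose blobs of 2-D "event features" as stand-in data:
X = np.vstack([np.random.default_rng(i).normal(i, 0.3, (20, 2)) for i in range(3)])
centers, u = fuzzy_cmeans(X, c=3)
print("memberships of the first event (can straddle clusters):", np.round(u[0], 2))
```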

When this method is used, the role of WWBs turns out to be very important. This is not entirely new, because we already knew that westerly winds across the Equatorial Pacific were important in ENSO cycles. The ENSO cycle involves, to simplify a bit, heat at the surface of the Pacific moving westward and then into the deep (but not too deep) ocean, where it builds up. This process is maintained by currents and winds moving from east to west. It is a little like the air near your ceiling growing ever hotter if you burn wood in your stove; in that case, the property of warm air rising causes the upper few feet of your living room to get much hotter than the floor. The Western Pacific gets hotter over time because winds and currents push the heat there.

This buildup of heat (along with other factors) eventually causes a change in the movement of heat, and we see warm water moving east, surfacing, and transferring heat energy into the air. That is an El Nino.

From the abstract of the paper:

We propose a unified perspective on El Niño diversity as well as its causes, and support our view with a fuzzy clustering analysis and model experiments. Specifically, the interannual variability of sea surface temperatures in the tropical Pacific Ocean can generally be classified into three warm patterns and one cold pattern, which together constitute a canonical cycle of El Niño/La Niña and its different flavours. Although the genesis of the canonical cycle can be readily explained by classic theories, we suggest that the asymmetry, irregularity and extremes of El Niño result from westerly wind bursts, a type of state-dependent atmospheric perturbation in the equatorial Pacific. Westerly wind bursts strongly affect El Niño but not La Niña because of their unidirectional nature. We conclude that properly accounting for the interplay between the canonical cycle and westerly wind bursts may improve El Niño prediction.

The authors demonstrate that accounting for WWBs does a better job of retroactively predicting the different kinds of El Nino events that have happened over the last fifty years. They conclude that El Nino may result from a combination of, on one hand, the built-in see-saw effect of the buildup of ocean heat in the west and the reversal of the movement of warm water, and, on the other, WWB perturbations, with the pattern of westerly winds affected by the oscillation itself. (I am oversimplifying ENSO here; see below for resources on how it works.) Whether or not an El Nino happens is predicted by the classic oscillation model, but which kind of El Nino results is better predicted by the WWBs. From the study:

Such a scenario is appealing because it reconciles hotly debated issues related to the classification and genesis of various El Niño events, by killing three birds — diversity, asymmetry and extremes — with one stone. But one must not dwell on the simplicity of the picture painted here. Our intention is to emphasize the strong influence of WWBs on El Niño diversity, but not to downplay other processes that may play significant roles in El Niño dynamics and thus contribute to the complexity of its diversity.

The research reported here does not address, but may relate to, a set of questions that have been on my mind as I’ve watched El Nino and the discussion surrounding it develop over the last year or so. Is El Nino (or ENSO, more broadly) changing because of climate change? Since El Nino was already hard to predict, we can chalk up this last round of lousy predictions as El Nino being El Nino. But we might also ask: as more surface and upper ocean heat enters the system, are there changes? Chen et al actually do note that “… real-time El Niño forecasting remains an elusive and formidable goal. This is probably because predictability estimates were mainly based on models dominated by a single mode of El Niño variability or on hindcast skills of relatively large El Niño events, whereas in reality El Niño has a variety of flavours, especially in the past decade” (emphasis added). So, these folks, referring to other research, note that El Nino has changed. Is this random variation with no important linear time dimension, or is it a “new normal” for the already normal-defying El Nino, or, perhaps, is it the first part of a period during which ENSO changes dramatically to a climate controlling phenomenon that acts differently in important ways?

Michael Tobis, an expert on atmospheric and ocean systems, suggested to me that “…the real action in climate change is where the warm water goes, not what the wind does. The wind will respond, and may reinforce or mitigate what the ocean does, true. But as the ocean water mass gets further from its recent near-equilibrium, eventually wind stress coupling becomes a smaller deal and the water will go where it will go.” Tobis also notes, and Chen et al acknowledge this may be important, that “the most salient feature in the oceans right now is the large and persistent warm blob in the eastern North Pacific.” This implies (i.e., causes me to speculate or, really, guess) that a warming ocean may shift the balance of what is important in driving, or resulting from, ENSO dynamics. That does not detract from Chen et al’s apparent ability to both classify and explain the differences between El Nino events with WWBs being the key factor.

I asked Dr. Chen to go out on a limb a bit to discuss what the future may hold as climate changes.

Question: With global warming, ocean heat (both at some depth and SST) has increased. Since heat in the ocean is a key variable in ENSO cycles, is it possible that El Nino dynamics would change in some important way? For example, of the three flavors of El Nino, could the relative likelihood of each flavor manifesting change? Is there evidence that such a change may have already occurred, thus the dismal level of predictability of 2014/5? Or, would you expect such a change in the future? My gut feeling is that El Nino dynamics is a barely stable metastable system that is in sufficiently weak equilibrium that it could change to a different equilibrium if important inputs are changed a lot.

Answer: I think your gut feeling is right, in the sense that El Nino is changing under global warming. The questions are how and why. Observations over the last 15 years seem to indicate that the system is now dominated by the warm-pool El Nino, while some people use IPCC model projections to argue that in the future the extreme El Nino will become more frequent. These are still open questions.

Second version of the same question: I’ve heard El Nino/ENSO described as a quasi equilibrium. The essential feature of this system is shifting back and forth between recharge and discharge of ocean heat. Is a different system imaginable where this is not a cyclic system, but rather a steady state system (such as we see with the Atlantic Conveyor or other climate systems), with heat going in (somewhere) and coming out (somewhere else) more or less steadily? Since we are entering global temperature levels not seen in a long time (and thus only represented in ancient paleo records of lower quality), it seems like it can’t be ruled out (other than it being a rather extreme idea).

Answer: In the recent history and perhaps also during many periods in the past, ENSO did behave like a self-sustaining oscillation. However, it is quite possible that the system might enter a steady state — a permanent El Nino or La Nina state — when external forcing changes.

Question: Figure 4 (see top of post) seems to show something rather astonishing (aside from that figure’s use to demonstrate WWB and WWV association with different kinds of El Nino): WWB in 2014 was very high and uniquely so. Why? Other than the apparent fact that this WWB was not followed by a strong El Nino (a key point of your paper) is there anything else interesting about this?

Answer: It all depends on the interplay between the WWB and the basic cycle (measured by WWV). Only when the former occurs at the right phase of the latter will a large El Nino take place. It is true that WWBs were very strong in 2014, but only in the early year, not over the entire spring season. Further experiments will be needed to clarify whether or not the relationship between WWB and WWV will change under global warming.

It will certainly be interesting to see, over the next 24 months or so, if we end up having a strong El Nino, a double El Nino, or if we have lapsed into an extended period of what some are calling El Annoyingo.


For more information about El Nino:

Is a Powerful El Niño Brewing in the Pacific Ocean?

Fishing in pink waters: How scientists unraveled the El Niño mystery

El Niño/Southern Oscillation (ENSO) Technical Discussion

El Niño: How it works, how we observe it


Check out: The First Earth Day, an epoch journey into politics, explosions, folk music, and old boats floating on stinking rivers.

Finally, TV Meteorologists On Board with Climate Change

There was a time when I picked which local TV news station to watch based on the way the TV meteorologist addressed global warming. There were two stations in the running. One of them had a guy who frequently disparaged climate science, and the other had Paul Douglas, who no longer does TV meteorology (I no longer watch local TV news) but who has become a major spokesperson for reason and science (see: Paul Douglas on Climate Change and A Q&A with Paul Douglas, the evangelical Christian Republican poster boy for climate change). Paul and I have become colleagues and friends.

In 2011, George Mason University produced a survey of TV meteorologists demonstrating that more than 50% did not understand or accept that climate change was happening as a result of human caused greenhouse gas pollution. Just now, George Mason University has looked at this again and their results demonstrate a dramatic shift. Today, something close to 9 in 10 TV meteorologists in the US are on board with the science.

The dismal results of the 2011 survey resulted in the development of the Forecast the Facts project. Today, Forecast the Facts’ Deputy Director Emily Southard released the following statement:

Forecast the Facts is excited to learn that the number of meteorologists who accept that humans plays a role in climate change has increased from 50% to nearly 90% according to a recent GMU study. With viewers facing unprecedented climate-change induced heat waves, droughts, and flooding – it’s more important than ever that meteorologists, as some of the most trusted communicators on climate, accept the facts and present them to their audiences accordingly. We hope all meteorologists will follow suit and commit to broadcasting the truth on climate change.

The 2011 study is here (PDF), and this graph summarizes the results:
[Graph: 2011 GMU study results, TV meteorologists and climate change]

I’ve not seen the new study yet, just the press release, but if I get a copy of it I’ll post a link or show some pretty pictures or something …

ADDED: I’m still trying to get a copy of the report (there are technical problems at the site) but I did find this graphic summarizing it:

[Graphic: summary of the new GMU survey results]

Meanwhile, here is an interview I did with Paul Douglas a while back, demonstrating that TV meteorologists can have some very important things to say about climate change!

March 2015 Was A Very Warm Month

The last 12 months have been the warmest one-year period in the NASA database since records began in 1880. According to the just-released NASA GISS Global Temperature Data, March 2015 is estimated to have been the fifth warmest month on record. Here are the top 20 months in rank order:

Year  Month  Anomaly (0.01 °C)
2007  JAN    93
2002  MAR    88
2010  MAR    87
1998  FEB    86
2015  MAR    84
2010  APR    82
2014  SEP    81
2015  FEB    78
2014  MAY    78
2014  OCT    77
2005  OCT    76
2015  JAN    75
2013  NOV    75
2010  NOV    75
1998  JUN    75
1995  FEB    75
2010  FEB    74
2006  DEC    74
2014  DEC    73
2014  AUG    73

Here is the monthly data covering the entire period of the instrumental record (1880 – present):

[Graph: monthly GISTEMP data, 1880 – present]

And, most importantly, here is the 12 month running mean (showing only since 1940 to make it easier to read):

[Graph: 12-month running mean of the GISTEMP data, 1940 – present]
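
For the curious, a 12-month running mean is simply the average of each successive 12-month window. A minimal sketch (the sample values are made up; GISS reports anomalies in hundredths of a degree C, as in the table above):

```python
# 12-month running mean via convolution with a flat 12-point window.
import numpy as np

# Stand-in monthly anomalies in 0.01 deg C; in practice, load the
# full GISTEMP monthly series instead.
monthly = np.array([75, 78, 84, 70, 68, 72, 74, 77, 81, 73, 75, 78, 80, 84])

window = np.ones(12) / 12
running = np.convolve(monthly, window, mode="valid")  # one value per 12-month span

print(np.round(running / 100, 3))  # converted to whole degrees C
```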

And a couple more graphics showing the last few months up close:
[Graphs: the last few months up close]

And this is just looking at January through March for each year in the database:

[Graph: January through March for each year in the database]

The rise of Skeptical Science

The site, not the thing. From the YouTube site:

Everyone at Skeptical Science spends a lot of their time reading the scientific literature and listening to experts. Without that we wouldn’t be able to write all the material that’s published on Skeptical Science. It’s a lot of work, especially when you do this with a critical eye. Our goal, after all, is to ensure that what we write reflects the scientific literature on the subject as accurately as possible.

The materials created by Skeptical Science are used by teachers, politicians, and of course by users on the internet to rebut climate myths. Thanks to this a lot of people have seen materials produced by us, even though they might not know that they have.

The website Skeptical Science wasn’t created overnight, nor was the team behind it assembled instantly. It started small with John Cook starting the website and publishing the first rebuttals to climate myths. As I wasn’t familiar with the story of how Skeptical Science evolved to the website it is today I had the idea to interview John about this. Despite John constantly saying “I’m just not that interesting” I eventually managed to get him in front of the camera to tell the story behind Skeptical Science.

The article released with this video can be found here:

The transcript, used resources, and citations for this video can be found here:

You can support me and the content that I create through Patreon.
https://www.patreon.com/collinmaessen

Interviews filmed in collaboration with University of Queensland, Skeptical Science, and Peter Sinclair. Full interviews available April 2015 in ‘Making Sense of Climate Science Denial‘.

With Global Warming, Will Cold Outbreaks Be Less Common?

Maybe, maybe not. There is a new paper that looks at what climate scientists call “synoptic midlatitude temperature variability” and the rest of us call “cold snaps” and “heat waves.” The term “synoptic” simply means over a reasonably large area, like you might expect a cold snap or heat wave to be. Specifically, the paper (Physics of Changes in Synoptic Midlatitude Temperature Variability, by Tapio Schneider, Tobias Bischoff and Hanna Plotka, published in Journal of Climate) concludes that as human-caused greenhouse gas pollution increases, the frequency of cold snaps in the northern hemisphere will go down. Naturally, as temperatures warm up we would expect the highs to get higher, the averages to be higher, and the lows to be higher as well (and thus fewer cold spells). But the new research actually argues that the cold spells (the cold extremes at synoptic spatial scales) will become even less common. This is potentially controversial and conflicts with other recently published research.

The paper is rather technical, so I’ll give you the abstract so you can go take a class in climate science, then come back and read it:

This paper examines the physical processes controlling how synoptic midlatitude temperature variability near the surface changes with climate. Because synoptic temperature variability is primarily generated by advection, it can be related to mean potential temperature gradients and mixing lengths near the surface. Scaling arguments show that the reduction of meridional potential temperature gradients that accompanies polar amplification of global warming leads to a reduction of the synoptic temperature variance near the surface. This is confirmed in simulations of a wide range of climates with an idealized GCM. In comprehensive climate simulations (CMIP5), Arctic amplification of global warming similarly entails a large-scale reduction of the near-surface temperature variance in Northern Hemisphere midlatitudes, especially in winter. The probability density functions of synoptic near-surface temperature variations in midlatitudes are statistically indistinguishable from Gaussian, both in reanalysis data and in a range of climates simulated with idealized and comprehensive GCMs. This indicates that changes in mean values and variances suffice to account for changes even in extreme synoptic temperature variations. Taken together, the results indicate that Arctic amplification of global warming leads to even less frequent cold outbreaks in Northern Hemisphere winter than a shift toward a warmer mean climate implies by itself.
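
In rough outline, the core scaling argument runs like this (my paraphrase, using the notation from the figure caption at the end of this post):

```latex
% Synoptic temperature variance generated by advection scales with a
% mixing length L' and the mean meridional potential temperature gradient:
\[
  \overline{\theta'^{2}} \;\sim\; L'^{2}\,\bigl(\partial_{y}\bar{\theta}\bigr)^{2}
\]
% Arctic amplification weakens the equator-to-pole gradient
% \(\partial_{y}\bar{\theta}\), so for a given mixing length the variance
% shrinks, and with it the frequency of extreme cold excursions.
```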

Why is this controversial? Because we have seen research in recent years indicating that with Arctic Amplification (the Arctic getting relatively warmer than the rest of the planet as global warming commences) the manner in which warm air is redistributed from sun-facing Equatorial regions towards the poles changes, which in turn changes the behavior of the Polar jet stream. Rather than being relatively straight as it rushes around the globe, separating temperate and sub-polar regions (and defining the boundaries of trade winds, and moving storms along), it is thought that the jet stream has more often become very curvy, forming what are called Rossby waves. These waves, recent research has suggested, can become stationary, and the wind within the waves moves relatively slowly. A curvy jet stream forms meteorological features such as the “ridiculously resilient ridge” which has brought California nearly continuous dry conditions for at least two years now, resulting in an unprecedented drought. A curvy jet stream also forms meteorological features called “troughs” such as the excursion known last year (incorrectly) as the Polar Vortex, which also returned in less severe form this year; a bend in the jet stream that brings polar air farther south than usual, causing a synoptic cold spell of extensive duration. These changes in the jet stream also seem to have brought some unusual winter weather to the American Southeast last year, and have been implicated in steering Super Storm Sandy into the US Northeast a few years ago. And that flood in Boulder, and the flood in Calgary, and the June Of All Rain here in Minnesota last year, and so on. This is the main global warming caused change in weather systems responsible for what has been termed “Weather Whiplash” and may rank up there with increased sea surface temperatures as factors underlying the observable, day to day effects of human caused climate disruption.

I’ve talked about jet streams, Rossby waves, and such in a few places:

More recent still is a paper by Dim Coumou, Jascha Lehmann, and Johanna Beckmann, “The weakening summer circulation in the Northern Hemisphere mid-latitudes,” which argued:

Rapid warming in the Arctic could influence mid-latitude circulation by reducing the poleward temperature gradient. The largest changes are generally expected in autumn or winter but whether significant changes have occurred is debated. Here we report significant weakening of summer circulation detected in three key dynamical quantities: (i) the zonal-mean zonal wind, (ii) the eddy kinetic energy (EKE) and (iii) the amplitude of fast-moving Rossby waves. Weakening of the zonal wind is explained by a reduction in poleward temperature gradient. Changes in Rossby waves and EKE are consistent with regression analyses of climate model projections and changes over the seasonal cycle. Monthly heat extremes are associated with low EKE and thus the observed weakening might have contributed to more persistent heat waves in recent summers.

Coumou notes that “when the great air streams in the sky above us get disturbed by climate change, this can have severe effects on the ground. While you might expect reduced storm activity to be something good, it turns out that this reduction leads to a greater persistence of weather systems in the Northern hemisphere mid-latitudes. In summer, storms transport moist and cool air from the oceans to the continents bringing relief after periods of oppressive heat. Slack periods, in contrast, make warm weather conditions endure, resulting in the buildup of heat and drought.” Co-author Jascha Lehmann adds, “Unabated climate change will probably further weaken summer circulation patterns which could thus aggravate the risk of heat waves. Remarkably, climate simulations for the next decades, the CMIP5, show the same link that we found in observations. So the warm temperature extremes we’ve experienced in recent years might be just a beginning.”

These seem to be conflicting views.

So, how do the scientists who have published the recent paper that stands in stark contrast with these other recent findings explain the difference? I asked lead author Tapio Schneider to comment.

He told me that yes, there is a tension between the other work (the Comou et al paper) and his work, but there is also overlap and similarity. “Coumou et al. state that amplified warming of the Arctic should lead to reduced zonal jet speeds at fixed levels in the troposphere. This is an uncontroversial and well known consequence of thermal wind balance. Then they say that the reduced zonal jet speeds may lead to reductions in eddy kinetic energy (EKE), which is a measure of Rossby wave amplitude. That this can happen is likewise well documented. What affects eddy kinetic energies is a quantity known as the mean available potential energy (MAPE), which depends on temperature gradients (which also affect jet speeds) and other quantities, such as the vertical temperature stratification. Coumou et al. focus only on one factor influencing the EKE, the temperature gradient.”

The tension, he told me, is in what the other researchers (Coumou et al) draw from their results. “They show that warm summer months usually are associated with low EKE in the current climate, consistent with common knowledge: unusually warm conditions are associated with relatively stagnant air. They use this correlation in the current climate to suggest that reduced EKE in a future climate may also imply more (monthly) heat waves. While intuitive, this is not necessarily so. They say their suggestion is not in contradiction with our results because we considered temperature variability on shorter timescales (up to about two weeks), while their suggestion for more heat waves is made for monthly timescales. However, why the longer timescales should behave so differently is not made clear.”

As an onlooker, I take the following from this. First, there may be differences in time (and maybe space) scales of the analyses that might make them less comparable than ideal. Second, Schneider and Bischoff seem to be emphasizing synoptic cold outbreaks specifically. Schneider told me that they did look at temperature variability over longer time scales, but that did not make it into the paper. He said, “Even on monthly timescales, midlatitude temperature variance generally decreases as the climate warms, with a few regional exceptions (e.g., over Europe).”

Also, note that Schneider, Bischoff and Plotka, in this paper, do not address the specific problem of stationary Rossby waves, which probably has more to do with rainfall (lacking or heavy) than temperature, but is an important part of current changes in weather.

There has been some additional criticism of Schneider’s work on social media, etc., and perhaps the most significant one is this: Schneider, Bischoff and Plotka may have oversimplified the conditions in at least one of their models by leaving out continents. Also, Schneider et al have been picked up by a few of the usual suspects as saying that climate change will result in milder winters or less severe storms. This is not actually what the paper says. When people think “milder winter” they usually mean fewer severe storms, but various lines of evidence suggest that the northeastern US will experience more storms. For example, see “Changes in U.S. East Coast Cyclone Dynamics with Climate Change” and “Global Warming Changing Weather in the US Northeast.”

UPDATE: I’ve received a comment from Dim Coumou pertaining to the differences between Schneider et al and Coumou et al:

I see mostly overlap between the two studies, whereby ours really is an observational study analyzing data over the last 35 years (focusing on summers), and theirs is a theoretical and modeling study (focusing on winters). Interestingly both studies report similar dynamical changes but indeed come to somewhat different conclusions as to what this means for surface weather extremes.

We show that circulation (and notably EKE) has weakened in summer, that this has made weather more persistent and therefore favored the occurrence of prolonged heat waves in recent years. (We’re really focusing on present day climate and only show future projections for comparison with observations). As also discussed in our paper a drop in EKE leads to a reduction in weather variability on short timescales (less than a week) so this is consistent with the findings by Schneider et al. So indeed the issue of timescales is very important, and in my opinion prolonged extremes lasting several weeks are more important from an impact point of view.

Schneider, Bischoff and Plotka are well respected scientists and they are using methods that are generally accepted within climate science, yet have come to a conclusion different from what some of their colleagues have proposed. This is, in my opinion, a very good thing, and, certainly, interesting. I would worry if every climate scientist came up with the same result every time they tried something slightly different. The patterning (or changes in patterning) of air and sea currents under global warming has been the subject of a great deal of recent research, and there is strong evidence that changes are happening (such as in sea currents in the North Atlantic, and the jet stream effects discussed here) that have not been directly observed before. Because of the high level of internal (natural) variability, climate science works best when chunks of time 20 or 30 years long are considered. If we are seeing changes now that have really started to take off only five or ten years ago, and that are still dynamically reorganizing, how can the more ponderous, long term and large scale, thinking of climate science adjust and address those rapid changes? Well, we are seeing that process now in the climate change literature, and this paper is one example of it. I look forward to an honest, fair, and vigorous discussion in the peer reviewed literature.


Caption for the figure at the top of the post: FIG. 6. CMIP5 multimodel median values of 850-hPa potential temperature statistics for (left) DJF and (right) JJA. (a) Synoptic potential temperature variance θ′² for the years 1980–99 of the historical simulations. (b) Percentage change of the synoptic potential temperature variance θ′² in the years 2080–99 of the RCP8.5 simulations relative to the years 1980–99 of the historical simulations shown in (a). (c) Percentage change of the squared meridional potential temperature gradient (∂yθ)² in the years 2080–99 of the RCP8.5 simulations relative to the years 1980–99 of the historical simulations. (To calculate the gradients, mean potential temperatures were smoothed with a spherical harmonics filter that damped spherical wavenumbers greater than 6 and completely filtered out wavenumbers greater than 10.) (d) Percentage change of the squared mixing length L′² = θ′²/(∂yθ)² implied by the variance and meridional potential temperature gradient, in the years 2080–99 of the RCP8.5 simulations relative to the years 1980–99 of the historical simulations. Synoptic potential temperature variations are bandpass filtered to 3–15 days. In the dark gray regions, topography extends above the mean 850-hPa isobar. The light gray bar blocks out the equatorial region, where potential temperature gradients are weak and their percentage changes become large.

How Sea Floor Ecosystems Are Damaged By, And Recover From, Abrupt Climate Change

A new study by Sarah Moffitt, Tessa Hill, Peter Roopnarine, and James Kennett (Response of seafloor ecosystems to abrupt global climate change) gets a handle on the effects of relatively rapid warming, and associated oxygen loss in the sea, on invertebrate communities. The study looked at a recent warming event (the end of the last glacial) in order to understand the present warming event, which is the result of human-caused greenhouse gas pollution.

Here is what is unique about the study. A 30-foot-deep core, representing the time period from 3,400 to 16,100 years ago, was raised from a site in the Pacific, and the researchers tried to identify and characterize all of the complex invertebrate remains in the core. That is not usually how it is done. Typically a limited number of species, and usually only microscopic surface invertebrates (Foraminifera), are identified and counted. There are good reasons it is done that way. But the new study looks instead at non-single-celled invertebrates (i.e., clams and such) typically found at the bottom, not top, of the water column. This study identified over 5,400 fossils and trace fossils from Mollusca, Echinodermata, Arthropoda, and Annelida (clams, worms, etc.).

Complex invertebrates are important because of their high degree of connectivity in an ecosystem. In the sea, a clam, crab, or sea cucumber may be the canary in the proverbial coal mine. Study co-author Peter Roopnarine says, “The complexity and diversity of a community depends on how much energy is available. To truly understand the health of an ecosystem and the food webs within, we have to look at the simple and small as well as the complex. In this case, marine invertebrates give us a better understanding of the health of ecosystems as a whole.”

The most important finding of the study is this: the marine ecosystem sampled by this core underwent dramatic changes, including local extinctions, and took up to something like 1,000 years to recover from that. The amount of change in bottom ecosystems under these conditions was previously not well known, and the recovery rate was previously assumed to be much shorter, on the order of a century.

From the abstract of the paper:

Anthropogenic climate change is predicted to decrease oceanic oxygen (O2) concentrations, with potentially significant effects on marine ecosystems. Geologically recent episodes of abrupt climatic warming provide opportunities to assess the effects of changing oxygenation on marine communities. Thus far, this knowledge has been largely restricted to investigations using Foraminifera, with little being known about ecosystem-scale responses to abrupt, climate-forced deoxygenation. We here present high-resolution records based on the first comprehensive quantitative analysis, to our knowledge, of changes in marine metazoans … in response to the global warming associated with the last glacial to interglacial episode. The molluscan archive is dominated by extremophile taxa, including those containing endosymbiotic sulfur-oxidizing bacteria (Lucinoma aequizonatum) and those that graze on filamentous sulfur-oxidizing benthic bacterial mats (Alia permodesta). This record … demonstrates that seafloor invertebrate communities are subject to major turnover in response to relatively minor inferred changes in oxygenation (>1.5 to <0.5 mL·L⁻¹ [O2]) associated with abrupt (<100 y) warming of the eastern Pacific. The biotic turnover and recovery events within the record expand known rates of marine biological recovery by an order of magnitude, from <100 to >1,000 y, and illustrate the crucial role of climate and oceanographic change in driving long-term successional changes in ocean ecosystems.

Lead author Sarah Moffitt, of the UC Davis Bodega Marine Laboratory and Coastal and Marine Sciences Institute notes, “In this study, we used the past to forecast the future. Tracing changes in marine biodiversity during historical episodes of warming and cooling tells us what might happen in years to come. We don’t want to hear that ecosystems need thousands of years to recover from disruption, but it’s critical that we understand the global need to combat modern climate impacts.”

There is a video:


Caption from the figure at the top of the post: Fig. 1. Core MV0811–15JC’s (SBB; 418 m water depth; 9.2 m core length; 34.37°N, 120.13°W) oxygen isotopic, foraminiferal, and metazoan deglacial record of the latest Quaternary. Timescale (ka) is in thousands of years before present, and major climatic events include the Last Glacial Maximum (LGM), the Bølling and Allerød (B/A), the Younger Dryas (YD), and the Holocene. (A) GISP2 ice core δ18O values (46). (B) Planktonic Foraminifera Globigerina bulloides δ18O values for core MV0811–15JC, which reflects both deglacial temperature changes in Eastern Pacific surface waters and changes in global ice volume. (C) Benthic foraminiferal density (individuals/cm3). (D) Relative frequency (%) of benthic Foraminifera with faunal oxygen-tolerance categories including oxic–mildly hypoxic (>1.5 mL·L⁻¹ O2; N. labradorica, Quinqueloculina spp., Pyrgo spp.), intermediate hypoxia (1.5–0.5 mL·L⁻¹ O2; Epistominella spp., Bolivina spp., Uvigerina spp.), and severe hypoxia (<0.5 mL·L⁻¹ O2; N. stella, B. tumida) (19). (E) Log mollusc density (individuals/cm3). (F) Ophiuroids (brittle star) presence (presence = 1, absence = 0, 5-cm moving average). (G) Ostracod valve density (circles, valves/cm3) and 5-cm moving average.

Pine Beetle-Caused Forest Death, And Climate Change

There is some interesting new work carried out by researchers at Dartmouth College and the USDA Forest Service on the relationship between the Mountain Pine Beetle, major die-offs of forests in North America, and climate change.

The Mountain Pine Beetle (Dendroctonus ponderosae) is a kind of “bark beetle” (they don’t bark, they live in bark) native to western North America. They inhabit a very wide range of habitats and are found from British Columbia all the way south to Mexico. In British Columbia alone, the pine beetle, through a fairly complex process, has managed to destroy 16 of 55 million acres of forest. This epidemic of tree death is seen in mountain forest regions all across the western United States. The beetles affect a number of species of pine trees.

The beetle lays its eggs under the pine tree bark, and in so doing, introduces a fungus that penetrates adjoining wood. This fungus has the effect of suppressing the tree’s response to the Pine Beetle’s larvae, which proceed to eat part of the tree. This suppressive effect, which blocks water and nutrient transport, together with the larvae eating part of the tree, quickly kills the host tree. The process can take just a few weeks. It takes longer for the tree to actually look dead (note that the evergreen tree you cut and put in your living room for Christmas is dead the whole time it is looking nice and green and cheery). By the time the tree looks dead, i.e., the needles turn brown and fall off, it has been a dead-tree-standing for months and the Pine Beetles have moved on to find other victims.

It has long been thought that climate change has contributed to the western epidemic of Pine Beetles, as well as a similar epidemic in the Southeastern US (involving different species of beetles). The primary mechanism would be increasing winter extreme low temperatures. Very low temperatures kill off the larvae, removing the threat of the beetle’s spread locally after that winter. Extreme winter temperatures have warmed by around 4 degrees C since 1960 across much of the beetle’s range. The lack of killing cold itself does not cause a beetle epidemic, but simply allows it, or produces a “demographic release.” If the beetles are already there, they have the opportunity to spread. (See the toy sketch below.)
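
As a toy illustration of demographic release (my own sketch; the growth rate, lethal threshold, and temperature numbers are invented, not taken from the study):

```python
# Toy beetle model: populations grow each summer and are knocked down
# only in winters that reach an assumed lethal minimum temperature.
import numpy as np

rng = np.random.default_rng(7)

def simulate(mean_winter_min, years=30, lethal=-35.0, growth=1.8):
    pop = 1.0
    for _ in range(years):
        pop *= growth                            # summer reproduction
        if rng.normal(mean_winter_min, 4.0) < lethal:
            pop *= 0.01                          # killing cold: near-total loss
    return pop

print(f"colder winters (-33 C mean min): {simulate(-33.0):.1e}")
print(f"warmer winters (-29 C mean min): {simulate(-29.0):.1e}")
```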

A recent study, just out (see reference below), confirms this basic model but also adds a considerable degree of complexity. The study shows that the correlation between winter temperatures rising above typical killing levels and the spread of the beetle is not as strong as expected. The study indicates that demographic release from an increase in extreme winter lows is part of the equation, but the situation is more complex: warming in general likely enhances beetle spread and reproduction during the summer part of its lifecycle, and may weaken the trees, making them more vulnerable to attack. In addition, other non-climate related factors probably play a role.

The study looked at several regions and assembled data on beetle frequency and spread over time, and various climate related data. From the abstract:

We used climate data to analyze the history of minimum air temperatures and reconstruct physiological effects of cold on D. ponderosae. We evaluated relations between winter temperatures and beetle abundance using aerial detection survey data… At the broadest scale, D. ponderosae population dynamics between 1997 and 2010 were unrelated to variation in minimum temperatures, but relations between cold and D. ponderosae dynamics varied among regions. In the 11 coldest ecoregions, lethal winter temperatures have become less frequent since the 1980s and beetle-caused tree mortality increased—consistent with the climatic release hypothesis. However, in the 12 warmer regions, recent epidemics cannot be attributed to warming winters because earlier winters were not cold enough to kill D. ponderosae…There has been pronounced warming of winter temperatures throughout the western US, and this has reduced previous constraints on D. ponderosae abundance in some regions. However, other considerations are necessary to understand the broad extent of recent D. ponderosae epidemics in the western US.

“This amount of warming could be the difference between pests surviving in areas that were historically unfavorable and could permit more severe and prolonged pest outbreaks in regions where historical outbreaks were halted by more frequent cold bouts,” says first author Aaron Weed, an ecologist at the National Park Service.

In the 11 coldest regions, winter temperatures cold enough to be lethal to D. ponderosae have become less frequent since the 1980s, and this is associated with an increase in tree mortality, confirming the link between warming conditions and increased parasite-caused tree death. However, in the 12 regions with the warmest climate, recent epidemics are not clearly linked to warming winters, simply because the earlier, colder winters were already not cold enough to repress the tree-killing mountain pine beetle. This suggests that other factors may play a role in the epidemics in the western United States.

Even so, the pattern of warming (including the increase of minimum winter temperatures) correlates with the demographic release of the mountain pine beetle. The authors note that “warming year-round temperatures that influence generation time and adult emergence synchrony … and drought effects that can weaken tree defenses …” are plausible explanations, but further note that a single simple explanation is not likely to be sufficient to explain the overall phenomenon. The link between warmer years, an added number of generations per year, and the epidemic is explored here.

This is, in a sense, a numbers game. A cold winter does not kill off all of the beetles, but no winter, however cold, can wipe out beetles that are not there to begin with. So demographic release, which makes an outbreak possible but does not cause one, can produce an abundance of beetles across a much larger area, where, whatever natural suppression may occur, they will then become more abundant over time. A toy simulation of the idea follows.
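To make the numbers game concrete, here is a minimal toy simulation of demographic release. This is my own illustration, not the model used by Weed et al. (2015), and every parameter value (growth rate, kill fraction, temperature threshold) is made up for the sake of the sketch: the population grows each summer and crashes only in winters whose extreme low falls below a lethal threshold.

```python
import random

def simulate_beetles(years=30, growth=2.0, kill_fraction=0.95,
                     lethal_low=-35.0, mean_low=-34.0, sd=4.0, seed=42):
    """Toy beetle model: summer reproduction, winter kill below a
    threshold. All parameter values are illustrative only; this is
    not the model from Weed et al. (2015)."""
    random.seed(seed)
    pop = 100.0
    for _ in range(years):
        pop *= growth                               # summer reproduction
        if random.gauss(mean_low, sd) <= lethal_low:
            pop *= (1.0 - kill_fraction)            # a killing-cold winter
        pop = min(pop, 1e6)                         # crude carrying capacity
    return pop

print(simulate_beetles(mean_low=-34.0))  # historical-style winters: held in check
print(simulate_beetles(mean_low=-30.0))  # ~4 C warmer winters: demographic release
```

Note that in the warm-region scenario from the study, mean_low would already sit well above lethal_low, so further winter warming changes nothing there, which is exactly the study’s point about the warmer ecoregions.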

As noted, the trees themselves matter. We can safely assume that, in general, changes in climate will mean that plant communities adapted to a given region may lose that adaptive edge and become subject to a number of problems, which a potentially spreading parasite can then exploit. These changes in the viability of plant communities are not all climate-change related. Forest management, disturbance, and forest demographics (as forests age, they tend to change in character) are also factors in this complex set of ecological relationships.

The bottom line: this study confirms the effects of warming, especially the increase in winter low temperatures, on the potential for D. ponderosae to spread rapidly, locally and regionally. The study also calls into question the simplistic model in which that is all that is needed to explain the widespread epidemic of this beetle. Other factors, including other aspects of global warming, also contribute to the epidemics. In addition, and importantly, the study demonstrates a high degree of regional variability in the ecological outcomes of climate change.

This epidemic is probably the largest observed kill-off of forests caused by a parasite. So far its effects have been much more severe than those of forest fires, but over the medium to long term we will probably see increased frequency and severity of forest fires because of the abundance of fuel provided by the die-off.

Source:
Weed, A. S., Bentz, B. J., Ayres, M. P., & Holmes, T. P. (2015). Geographically variable response of Dendroctonus ponderosae to winter warming in the western United States. Landscape Ecology. doi:10.1007/s10980-015-0170-z

Text for the image at the top of the post, from the USDA:

The Mountain Pine Beetle is at epidemic levels throughout the western United States, including here in the Rocky Mountain Region … Forests affected here include several in Colorado, Wyoming, South Dakota and Nebraska. In northern Colorado and southeastern Wyoming, Mountain Pine Beetles have impacted more than 4 million acres since the first signs of outbreak in 1996. The majority of outbreaks have occurred in three forests: Arapaho-Roosevelt, White River and Medicine Bow/Routt.

Should the Smithsonian and Other Museums Blow Off Big Fossil?

Let me start off by saying something you may not know. The big corporations and the 1%ers you have learned to hate fund many of the projects you’ve learned to love. I have not checked lately, but for several years running, Murdoch’s FOX corporation funded virtually all of the National Geographic specials produced, at the 50% or 60% level. Major museums known for their great exhibits are often funded by the very corporations or individuals that the people who love those exhibits are (often justifiably) suspicious of. The great importance of private corporate or individual funding is also a factor for art museums, cultural entities like the opera or the symphony, and of course, sports teams.

This is also true of educational institutions. You see this most obviously at schools of business or management. Say you want to visit the Carlson School of Management at the University of Minnesota. It is named after Curtis Carlson, who was Chair of the Carlson Companies (Radisson). Curt also owned TGI Fridays. You might park in the Toyota Parking lot. Perhaps you are going to a meeting at the Medtronic Dining Room followed by a lecture at the Honeywell Lecture Hall. Later, for entertainment you might catch a game at Target Field, or Target Center, or the Xcel Energy Center. Or perhaps you’ll visit the Opera or Symphony. While you are there, be sure to check out the Wall of Donors to see the numerous large companies (mostly Minnesota based) or wealthy individuals who make big donations there.

Well, OK, you probably already knew that large corporations and wealthy individuals are footing the bill for many of the trappings of our civilization, including educational enterprises, and ranging from academics to high culture to sports.

Lately there has been concern that the mix of large donors and missions of various institutions represents a conflict of interest, especially with regards to climate change and global warming.

We’ve seen the Harvard-Smithsonian Center for Astrophysics serve as a conduit for moving money from Big Fossil (large corporations that depend, we presume, on the rejection of climate change science) to scientists who produce roundly criticized work, used by climate change denialists in Congress (via the mechanism of Congressional testimony) to avoid implementing scientifically sound energy and environmental policies.

It has been argued that the David Koch human evolution exhibit at the Smithsonian inappropriately downplays the critical role of human caused climate change as a problem facing our species. The exhibit does mention future challenges, and a warming planet, but conveniently leaves off the anthropogenic part.

A couple of years back, the University of Minnesota bailed out of showing a documentary on the Mississippi River, which included quite a bit of material on pollution of the river caused by agriculture, allegedly because Big Ag interests pressured the administration. It has been suggested that was only one of several examples of The U bending to the agricultural industry.

Recently there has been a move to ask natural history museums to reduce or eliminate funding from Big Fossil, and to ask folks like the Kochs to not be on their boards of directors. This makes sense because of the potential conflict of interest, but it could also be a form of institutional suicide if the funding from those sources is both very important and irreplaceable.

How much of the science done by major academic institutions is influenced by funding? It makes sense, for example, for Big Ag to fund laboratories, graduate fellowships, and research at these institutions because they benefit from the training and research. But it might also make sense for Big Ag to influence what research is done, perhaps who gets the results, and most importantly perhaps, what research (or results) is NOT funded, or repressed. Same with Big Fossil. Same with Big Pharm. Same with Big Whatever.

And, of course, the same can be said of large museums. I can name one large museum (but I won’t) that totally avoids human evolution (but not necessarily evolution in general) because there are private donors who don’t think humans evolved. The aforementioned human evolution exhibit funded by Koch is probably a mild example of bias. I’ve seen a lot of human evolution exhibits, and so far the few that are quite willing to challenge visitors’ religious or other anti-science beliefs were entirely state funded, as far as I know.

I think it is appropriate to ask the Smithsonian to dump the Kochs and their ilk as donors and board members, because such a stark request can form the core of an activist approach that could cause positive change. But I also think we need to recognize the difficult position these institutions are in. We need not only to tell them to change how they do things, but also to suggest alternative approaches and facilitate those approaches. Big educational exhibits at museums should routinely be funded by public money, as many already are. Perhaps private donations should be funneled through third parties devoid of nefarious intentions and shady ties. One approach in the US might be to tie tax benefits to such a thing. You can get a tax benefit from donating to a museum to produce an exhibit, but you would get a better tax benefit by donating to NSF or NIH museum-exhibit and educational endowments, which would in turn be distributed via the usual mechanism of carefully developed requests for proposals with peer review. That would let the Kochs have part of their cake while we (the citizens) get to eat the other part.

The way research, education, and public engagement is funded has become a problem. What do you think? How should we solve this problem?

Antarctic Ice Shelves Melting at Accelerating Rate

Antarctica is pretty much covered with glaciers. Glaciers are dynamic entities that, unless they are in full melt, tend to grow near their thickest parts (that’s why those are the thickest parts) and mush outwards towards the edges, where the liminal areas either melt (usually seasonally) in situ or drop off into the sea.

Antarctica’s glaciers are surrounded by a number of floating ice shelves. The ice shelves are really the distal reaches of the moving glaciers, floating over the ocean. This is one of the places, probably the most important place at present, where melting accelerated by human caused greenhouse gas pollution occurs. The ice shelves are fixed in place along their margins (they typically cover linear, fjord-like valleys) and at a grounding point underneath the shelf, some distance from the ice margin but below sea level.

The collapse or disintegration of an ice shelf is thought to lead to more rapid movement of the corresponding glacial mass towards the sea, and increased melting. This is the big problem right now with estimating the rate of glacial melting in the Antarctic: it is not a steady and regular process, because rapid disintegration of an ice shelf is possible. Most likely, Antarctic glacial melting over the coming decades will involve the occasional catastrophic disintegration of an ice shelf, followed by more rapid glacial melting at that point.

Unfortunately, the ice shelves are generally becoming more vulnerable to this sort of process, a new study just out in Science shows. From the abstract:

The floating ice shelves surrounding the Antarctic Ice Sheet restrain the grounded ice-sheet flow. Thinning of an ice shelf reduces this effect, leading to an increase in ice discharge to the ocean. Using eighteen years of continuous satellite radar altimeter observations we have computed decadal-scale changes in ice-shelf thickness around the Antarctic continent. Overall, average ice-shelf volume change accelerated from negligible loss at 25 ± 64 km3 per year for 1994-2003 to rapid loss of 310 ± 74 km3 per year for 2003-2012. West Antarctic losses increased by 70% in the last decade, and earlier volume gain by East Antarctic ice shelves ceased. In the Amundsen and Bellingshausen regions, some ice shelves have lost up to 18% of their thickness in less than two decades.

This is one of many reasons that even the most extreme of the IPCC estimates of ice loss (generally) and its contribution to sea level rise have to be seen as a lower limit. This is a substantial change, and it is very recent. It isn’t just that the ice shelves have gotten thinner; the rate of thinning at these margins is increasing.
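A quick back-of-the-envelope pass over the rates quoted in the abstract helps show how stark the acceleration is. This is my arithmetic only; the rates and uncertainties are the authors’ figures.

```python
# Ice-shelf volume change rates quoted in the abstract (km^3 per year).
early_rate, early_err = 25, 64    # 1994-2003: smaller than its own error bar
late_rate,  late_err  = 310, 74   # 2003-2012: unambiguously significant

span = 9  # each interval covers roughly nine years
print(f"1994-2003 total: ~{early_rate * span} km^3 (not significant, ±{early_err}/yr)")
print(f"2003-2012 total: ~{late_rate * span} km^3 (significant, ±{late_err}/yr)")
print(f"Rate increase: roughly {late_rate / early_rate:.0f}-fold")
```

The point of the exercise: the early-period rate is smaller than its own uncertainty, so the acceleration is from statistically nothing to an unambiguous loss of roughly three thousand cubic kilometers per decade.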

Caption to figure: Fig. 1 Eighteen years of change in thickness and volume of Antarctic ice shelves.
Rates of thickness change (m/decade) are color-coded from -25 (thinning) to +10 (thickening). Circles represent percentage of thickness lost (red) or gained (blue) in 18 years. Only significant values at the 95% confidence level are plotted (see Table S1). Lower left corner shows time series and polynomial fit of average volume change (km3) from 1994 to 2012 for the West (in red) and East (in blue) Antarctic ice shelves. Black curve is polynomial fit for All Antarctic ice shelves. We divided Antarctica into eight regions (Fig. 3), which are labeled and delimited by line segments in black. Ice-shelf perimeters are shown as a thin black line. The central circle demarcates the area not surveyed by the satellites (south of 81.5°S). Original data were interpolated for mapping purposes (see Table S1 for percentage area surveyed of each ice shelf). Background is the Landsat Image Mosaic of Antarctica (LIMA).

Science Museums: Cut Ties to Big Carbon, Kick Out The Kochs!

There is a letter signed by top scientists demanding that science museums cut all their ties to Big Fossil, and where appropriate, kick the Koch Brothers off their boards.

The letter says, in part,

As members of the scientific community we devote our lives to understanding the world, and sharing this understanding with the public. We are deeply concerned by the links between museums of science and natural history with those who profit from fossil fuels or fund lobby groups that misrepresent climate science.

Museums are trusted sources of scientific information, some of our most important resources for educating children and shaping public understanding.

We are concerned that the integrity of these institutions is compromised by association with special interests who obfuscate climate science, fight environmental regulation, oppose clean energy legislation, and seek to ease limits on industrial pollution.

You can read the entire letter and see the signers here.

For example, Big Fossilman David Koch is a major donor to the Smithsonian and sits on its Board. That’s the same Smithsonian that harbors science denialist Willie Soon and at least one other denier.

“It is one thing for David Koch to give money to Lincoln Center or Carnegie Hall, but it is quite another to support a science/natural history museum that has a role to play in doing research on, and helping educate the public about, climate change, the greatest threat ever to confront humanity”, said signer and Nobel laureate Eric Chivian. “The philanthropy serves to silence any criticism of the practices of the donor, and even, possibly, any critical discussion of the issue.”

“Energy companies and the Koch brothers gain social license from their association with these scientific institutions. It gives them cultural capital and credibility as supporters of science, yet they fund scientists and lobby groups that spread climate science disinformation and block action on climate change.” says Beka Economopoulos, director of The Natural History Museum, which organized the letter.

Climate scientist Michael Mann, who also signed, noted that “corporate polluters are embedding themselves in these spaces that communicate science to the public. Cloaked in the garb of civic-mindedness, they launder their image while simultaneously and covertly influencing the content offered by those institutions. It’s a public relations move of the highest order. David Koch sits on the board of our nation’s largest and most respected natural history museums, while he bankrolls groups that deny climate science. There is a clear contradiction between the mission of these museums and the politics of their patron.”

There is a petition that goes along with the letter that you might want to sign. It is HERE.

Drought in California and Climate Change: They are linked

A paper just out now in PNAS by Noah Diffenbaugh, Daniel Swain, and Danielle Touma shows that “Anthropogenic warming has increased drought risk in California.” From the abstract:

… We find that although there has not been a substantial change in the probability of either negative or moderately negative precipitation anomalies in recent decades, the occurrence of drought years has been greater in the past two decades than in the preceding century. In addition, the probability that precipitation deficits co-occur with warm conditions and the probability that precipitation deficits produce drought have both increased. Climate model experiments with and without anthropogenic forcings reveal that human activities have increased the probability that dry precipitation years are also warm. Further, a large ensemble of climate model realizations reveals that additional global warming over the next few decades is very likely to create ~100% probability that any annual-scale dry period is also extremely warm. We therefore conclude that anthropogenic warming is increasing the probability of co-occurring warm–dry conditions like those that have created the acute human and ecosystem impacts associated with the “exceptional” 2012–2014 drought in California.

Michael Mann and Peter Gleick have written a commentary for PNAS to accompany that research. The graphic at the top of the post is from that study. They note:

California is experiencing extreme drought. Measured both by precipitation and by runoff in the Sacramento and San Joaquin river basins, 10 of the past 14 y have been below normal, and the past 3 y have been the driest and hottest in the full instrumental record. A plot of temperature and precipitation anomalies over the full instrumental record from 1895 through November 2014 shows that the 3-y period ending in 2014 was by far the hottest and driest on record (Fig. 1). As of the publication of this commentary, the state appears headed into a fourth consecutive year of water shortfall, leading to massive groundwater overdraft, cutbacks to farmers, reductions in hydroelectricity generation, and a range of voluntary and mandatory urban water restrictions.

A number of studies have examined the California drought to try to determine if it was “caused by” (or otherwise affected by) human greenhouse gas pollution. These studies vary in their level of attribution, but increasingly it is becoming clear that anthropogenic global warming has a very big hand in this.

Mann and Gleick tackle the problem of defining drought. There are multiple ways to do so, and they relate to different causes. The plethora of definitions and relevant variables allows for a given study to miss any global warming effect by picking certain factors and ignoring others. Studies that look mainly at inputs to the hydrological system (i.e., rainfall) tend to miss the output part of the equation, including evaporation, which is exacerbated by a warming climate. Mann and Gleick point out that the Diffenbaugh study adds significant weight to the idea that anthropogenic climate change has increased the frequency, magnitude, and duration of California’s droughts. Perhaps more importantly, the Diffenbaugh study suggests “the emergence of a climatic regime in which all future dry years coincide with warmer conditions.”
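The core of this kind of analysis is conditional-probability counting: of the dry years, what fraction were also warm, and is that fraction rising? Here is a minimal sketch with synthetic data (not the paper’s data or its actual method), in which precipitation is stationary but temperature trends upward:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # synthetic "years"

# Synthetic anomalies: precipitation has no trend; temperature warms over time.
precip = rng.normal(0.0, 1.0, n)
temp   = rng.normal(0.0, 1.0, n) + np.linspace(-1.0, 1.0, n)

dry  = precip < np.percentile(precip, 20)  # driest fifth of years
warm = temp > np.median(temp)              # warmer-than-median years

def p_warm_given_dry(sl):
    d, w = dry[sl], warm[sl]
    return (d & w).sum() / max(d.sum(), 1)

print("P(warm | dry), first 60 years:", round(p_warm_given_dry(slice(0, 60)), 2))
print("P(warm | dry), last 60 years :", round(p_warm_given_dry(slice(60, None)), 2))
```

Even with zero trend in precipitation, the warming trend alone pushes P(warm | dry) toward 1, which is the regime the paper describes: a future in which essentially all dry years coincide with warm conditions.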

Gleick told me, “The scientific evidence showing the growing influence of climate changes on extreme events around the world, including the ongoing California drought, continues to pile up. The clearest piece of this is the record high, and increasing, temperatures, which directly influence the availability and demand for water, but there is also growing evidence that climate change is influencing pressure dynamics and atmospheric circulation patterns that either bring, or divert, water from the west coast of the United States.”

So the current drought in California is linked to human-induced climate change, and according to the best available science, in the future this will be a more common phenomenon than it has been in the past. But what about other effects of climate change? I asked Michael Mann about the relationship between the California drought and his recent study showing that we should soon be entering a period (over the next couple of decades) during which heat that has been hiding in the oceans will leave its watery milieu and join us up here on the surface. He told me, “Here is the linkage I think is most relevant: the “faux pause”, in our recent study, was closely tied to the predominance of La Nina-like conditions in the tropical Pacific for a large part of the past decade, and these same conditions are closely linked with California drought (La Nina years tend to be drought years in California, while El Nino years tend to be wet years—though this doesn’t necessarily hold true for every single event). So one might imagine that a return to a greater tendency for El Nino-like conditions in the tropical Pacific over the next decade or two (which would spell an end to the “Faux Pause”) could actually be a mitigating effect as far as California drought is concerned. A bit counter-intuitive, but that’s my best assessment here.”

New Research Shows Exceptional Slowdown In Major Atlantic Ocean Currents (UPDATED)

Climate scientists have noticed a disturbing pattern in the North Atlantic. This is the relative cooling of surface waters in the area fed by the Gulf Stream. This pattern has emerged over recent decades, and may portend very rapid and potentially disruptive climate change in the upcoming decades. The cooling is not subtle at all, and looks like this:

Map based on NASA GISS data of warming 1901-2013

So what does this mean? A paper out just today describes, explains, and discusses this odd anomaly and its potential consequences. First, a bit of context.

The Earth’s climate follows certain patterns. Most obviously it is warmer at the equator, colder at the poles. Less obvious if you’ve not looked into this is the presence of a very wet band around the middle of the earth, flanked to the north and south by irregular dry bands (that’s where most of the deserts are), with these flanked by the temperate zone, where you have more moisture and highly seasonal temperatures, and so on.

This pattern emerges as a complex response to two major inputs: first, the Earth is spinning; second, the Earth is heated more at the equator than at the poles, so heat must move through air and water currents towards the north and south.

One of the major systems that moves heat away from the equator is sometimes known as the Atlantic Conveyor, which is really part of a larger system of sea currents that includes the Gulf Stream. Notice that the Indian Ocean sits mostly in the Southern Hemisphere, bordered along the west by Africa and to the north by Asia. Extra warm water in the Indian Ocean tends to make its way around the southern tip of Africa and up the Atlantic, which is a roundabout route. This water eventually reaches the North Atlantic, where it cools and, owing to evaporation, becomes extra salty. This drives the formerly warm surface water into the depths of the ocean, where it flows south along the bottom of the Atlantic, eventually returning (I oversimplify a bit) to the Indian Ocean and elsewhere.

This system is also known as the AMOC (Atlantic Meridional Overturning Circulation) and is part of the global “Thermohaline Circulation” system.

Meanwhile, a smaller but similar aspect of this system starts in the Gulf of Mexico. That water becomes quite warm from the sun, but is blocked from moving directly north by the presence of North America, with Florida adding to the captive nature of those waters. But the water does make its way around Florida and flows north along the East Coast of the US, eventually also reaching the North Atlantic, where it similarly contributes to the saline deep currents.

Because salinity partly, even largely, drives this system, adding fresh water to the North Atlantic may interfere with this system of currents. How do you get enough fresh water to do this? In the past, huge volumes of fresh water probably entered the North Atlantic every now and then as large outflows of giant inland lakes, formed by melting glaciers, broke through barriers of ice or sediment. There is some evidence that in the past this sort of thing may have partly, or even completely, shut down the Atlantic Conveyor system, which would have had huge impacts on climate.
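To see the physics in numbers: seawater density rises when water cools or gets saltier, and a freshwater pulse undoes the salt part. Here is a rough sketch using a linearized equation of state; the coefficients below are typical textbook values, an approximation rather than the full seawater equation of state.

```python
# Linearized equation of state:
#   rho = rho0 * (1 - alpha*(T - T0) + beta*(S - S0))
RHO0  = 1027.0       # kg/m^3, reference density
ALPHA = 2e-4         # 1/K, thermal expansion coefficient (typical value)
BETA  = 8e-4         # kg/g, haline contraction coefficient (typical value)
T0, S0 = 10.0, 35.0  # reference temperature (C) and salinity (g/kg)

def density(temp_c, salinity):
    return RHO0 * (1 - ALPHA * (temp_c - T0) + BETA * (salinity - S0))

print(density(15.0, 36.0))  # warm, salty surface water arriving from the south
print(density(3.0, 36.0))   # same water after cooling in the North Atlantic: denser, sinks
print(density(3.0, 34.0))   # cold water freshened by meltwater: notably less dense
```

The third value dropping below the second is the whole story: with enough meltwater, the cooled surface water no longer becomes dense enough to sink, throttling the conveyor.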

Today there seem to be two main sources of extra fresh water in the area. One operates during years (or decades) when a larger than usual number of icebergs floats into the North Atlantic from the Arctic. The other, potentially, is the melting of Greenland’s fast-moving glaciers, a process that has recently sped up because of human caused greenhouse gas pollution warming the Earth.

By now you may recognize this scenario as the basis for the Hollywood disaster movie “The Day After Tomorrow.” In that movie the thermohaline circulation system shut down and an ice age instantly gripped the planet. Giant frozen tornadoes came plummeting down from the stratosphere. One of them hit the helicopter the British Royal Family was escaping in. Everybody in the US ended up in Mexico.

Every one who survived, that is.

The thing is, now, this can’t happen. Well, that particular scenario can’t ever really happen. But yes, the shutting down of this system can theoretically cause the onset of an ice age, or at least a mini-ice age, and has done so in the past. But no, it can’t now because our planet has warmed too much from human greenhouse gas pollution to allow that to happen. That may be the one good thing about global warming.

The new research does suggest, though, that this major pattern of circulation appears to be slowing down. This will have a number of effects. It will likely change the weather in Europe a bit. It will likely cause an increase in sea level along the US East Coast, because the current (and former) system piles up water towards the east and lowers it in the west, within the North Atlantic. That could be worth a few inches.

According to lead author Stefan Rahmstorf, “It is conspicuous that one specific area in the North Atlantic has been cooling in the past hundred years while the rest of the world heats up. Now we have detected strong evidence that the global conveyor has indeed been weakening in the past hundred years, particularly since 1970. If the slowdown of the Atlantic overturning continues, the impacts might be substantial. Disturbing the circulation will likely have a negative effect on the ocean ecosystem, and thereby fisheries and the associated livelihoods of many people in coastal areas. A slowdown also adds to the regional sea-level rise affecting cities like New York and Boston. Finally, temperature changes in that region can also influence weather systems on both sides of the Atlantic, in North America as well as Europe.”

The researchers used a combination of sea surface, atmospheric, and proxy (mainly coral) indicators of temperature to indirectly measure changes in ocean currents over time.
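The index itself (defined in the figure caption further down) is conceptually simple: the subpolar-gyre temperature anomaly minus the Northern Hemisphere anomaly, decadally smoothed; a falling index means the gyre is cooling relative to the hemisphere, the fingerprint of a weakening AMOC. A minimal sketch with placeholder series (the real inputs are the proxy reconstructions and HadCRUT4 data, not the made-up trends below):

```python
import numpy as np

def decadal_smooth(x, window=10):
    """Simple running mean, standing in for the paper's decadal smoothing."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

# Placeholder annual anomaly series; real inputs would be the subpolar-gyre
# average and the Northern Hemisphere average temperature anomalies.
years = np.arange(1900, 2014)
noise = np.random.default_rng(1).normal(0, 0.1, years.size)
nh    = 0.008 * (years - 1900) + noise   # hemisphere warms over the century
gyre  = nh - 0.004 * (years - 1900)      # gyre lags further and further behind

amoc_index = decadal_smooth(gyre) - decadal_smooth(nh)
print("index, start of record:", round(float(amoc_index[0]), 2))
print("index, end of record  :", round(float(amoc_index[-1]), 2))  # more negative = weaker AMOC
```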

According to climate scientist Jason Box, “Now freshwater coming off the melting Greenland ice sheet is likely disturbing the circulation. So the human-caused mass loss of the Greenland ice sheet appears to be slowing down the Atlantic overturning – and this effect might increase if temperatures are allowed to rise further.” Michael Mann, another author of the paper, adds, “Common climate models are underestimating the change we’re facing, either because the Atlantic overturning is too stable in the models or because they don’t properly account for Greenland ice sheet melt, or both. That is another example where observations suggest that climate model predictions are in some respects still overly conservative when it comes to the pace at which certain aspects of climate change are proceeding.”

What happens if the system actually turns off completely? It was formerly thought that the chances of this happening were small, but this research, conforming to a growing body of expert opinion, suggests that the chances may be higher than previously thought. Were this to happen, the main characteristic of the effects would be rapidity. Whatever happens would happen fast, and rapidly changing climate is generally regarded as bad no matter what the change itself really is.

UPDATE ADDED:

A criticism of this work has emerged, suggesting that another study indicates there is no long-term slowdown of the Atlantic Meridional Overturning Circulation (as suggested by the research covered here). That criticism is incorrect. Michael Mann, one of the AMOC study’s authors, has written a clarification on his Facebook page. He begins:

Some critics have tried to make hay over a previous article from last year by URI Graduate School of Oceanography scientist Tom Rossby (see: http://www.gso.uri.edu/b…/rossby-gulf-stream-is-not-slowing/) they claim contradicts our recent Nature Climate Change study finding evidence for a long-term slowdown of the Atlantic Meridional Overturning Circulation (“AMOC”). …

Rossby employs direct measurement of Gulf Stream transport using a ship-board acoustic Doppler current profiler (ADCP) over the interval 1993-2012. I have no reason at all to doubt Rossby’s findings. And they do *not* conflict with our own findings (though some have misleadingly sought to assert they do) for two fundamental reasons:

Mann’s entire post is HERE and you should go read it.

Additional Resources:

The article:

Rahmstorf, S., Box, J., Feulner, G., Mann, M., Robinson, A., Rutherford, S., Schaffernicht, E. (2015): Evidence for an exceptional 20th-Century slowdown in Atlantic Ocean overturning. Nature Climate Change (online) [DOI: 10.1038/nclimate2554 ]

Stefan Rahmstorf, lead author, has this blog post at RealClimate: What’s going on in the North Atlantic?

Figure caption from the original article, goes with the graphic at the top of the post:

Figure 3. Surface temperature time series for different regions. Data from the proxy reconstructions of Mann et al., including estimated 2σ uncertainty bands, and from the HadCRUT4 instrumental data. The latter are shown in darker colours and from 1922 onwards, as from this time on data from more than half of all subpolar-gyre grid cells exist in every month (except for a few months during World War II). The orange/red curves are averaged over the subpolar gyre, as indicated on Fig. 1. The grey/black curves are averaged over the Northern Hemisphere, offset by 3 K to avoid overlap. The blue curves in the bottom panel show our AMOC index, namely the difference between subpolar gyre and Northern Hemisphere temperature anomalies (that is, orange/red curves minus grey/black curves). Proxy and instrumental data are decadally smoothed.

A neat video of the thermohaline circulation system.

A movie produced by Peter Sinclair, that goes along with THIS blog post.

Coverage by Chris Mooney at the Washington Post: Global warming is now slowing down the circulation of the oceans — with potentially dire consequences


Other posts of interest:

    – [Important new meta-study of sea level rise in the US.](http://scienceblogs.com/gregladen/2014/09/05/important-new-meta-study-of-sea-level-rise-in-the-us/)
    – [Whatever you thought about sea level rise, it’s worse than you were thinking.](http://scienceblogs.com/gregladen/2013/12/05/whatever-you-thought-about-sea-level-rise-its-worse-than-you-were-thinking/)
    – [How high can the sea level rise if all the glacial ice melted?](http://scienceblogs.com/gregladen/2013/06/18/how-high-can-the-sea-level-rise-if-all-the-glacial-ice-melted/)
    – [Bangladesh and Sea Level Rise](http://scienceblogs.com/gregladen/2013/04/29/bangladesh-and-sea-level-rise/)

Also of interest: In Search of Sungudogo: A novel of adventure and mystery, set in the Congo.

Shut. Up. NOAA says February 2nd Warmest on Record

And by “Shut. Up.” I mean shut up about whether or not global warming is real.

The National Climatic Data Center of NOAA has just released the number representing the Earth’s surface temperature for February, and it is a shocking 0.82 degrees C above the 20th century average. This is the second warmest February on record, and I’m pretty sure it is the third warmest month on record, in that database.

Roughly speaking, the coldest conditions we have experienced, at the Last Glacial Maximum, were about 6 degrees C colder than the present. It is generally thought that we need to keep the global surface temperature below about 2 degrees C above the preindustrial level. Note that the 20th century average used by NOAA includes decades already warmed considerably by human caused greenhouse gas pollution; the actual difference between the best estimate (BEST, 1750-1850) of the preindustrial temperature and the present is about 1.5 degrees C. A quick sketch of the baseline arithmetic follows.
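The baseline conversion is simple addition. A hedged sketch: the ~0.7 C offset between NOAA’s 20th-century baseline and a 1750-1850 preindustrial baseline is my rough inference from the figures in this post, not an official constant.

```python
# Converting an anomaly from one baseline to another just means adding the
# offset between the two baselines.
ANOMALY_VS_20TH_C = 0.82   # deg C, NOAA's February figure
BASELINE_OFFSET   = 0.7    # deg C, 20th-century mean minus preindustrial (assumed)

print(f"~{ANOMALY_VS_20TH_C + BASELINE_OFFSET:.1f} C above preindustrial")  # ~1.5 C
```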

This high value is not surprising, since the estimate of the Earth’s surface temperature from NASA GISS, which came out a few days ago, also places February very high in temperature.

March, by the way, has been pretty warm. We’ve had quite a trend of very warm months, and it appears to not be letting up.

For a comparative perspective, see also this.