Monthly Archives: May 2015

Denial 101x Week 3

A few notes from Week 3 of Denial 101x: Making Sense of Climate Science Denial. These notes are mainly about the science rather than the denialism (unlike my last post, which focused more on the course's central theme, denialism).

The Carbon Cycle

Atmospheric CO2 concentrations have gone up by about 40%. Simple explanation: Humans are releasing Carbon into the atmosphere by burning fossil fuel. More complex explanation: Humans are affecting the Carbon Cycle in a number of ways, releasing Carbon (burning fossil fuels) as well as affecting natural Carbon sinks.

This has been known from direct observation since about 1958. About half of the CO2 we release stays in the atmosphere as extra CO2. Myth: since the amount of Carbon that cycles naturally through the system is so large, human influence must be negligible. The course offers a good analogy using a bank account. Nature has been a net Carbon sink over the last 50 years, yet atmospheric CO2 is still going up.
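The bank account analogy can be put in rough numbers. This is a toy illustration, not a set of measured fluxes; the values below are ballpark round numbers assumed purely for the sketch:

```python
# Toy "bank account" carbon budget (GtC/yr; illustrative round numbers,
# not measured fluxes). The natural deposits and withdrawals are huge
# but nearly balanced; the human deposit is small but one-directional.
natural_emitted  = 770.0  # oceans, plants, soils breathing carbon out
natural_absorbed = 775.0  # the same sinks taking slightly more back in
human_emitted    = 10.0   # fossil fuels and land use change

# Net carbon added to the atmosphere each year:
net_added = natural_emitted - natural_absorbed + human_emitted

# Fraction of the human release that stays airborne:
airborne_fraction = net_added / human_emitted

print(net_added)          # 5.0 GtC/yr accumulates in the air
print(airborne_fraction)  # 0.5 -- "about half", as in the lectures
```

The point of the arithmetic: the natural flows dwarf the human one, but because they nearly cancel, the small human deposit is what moves the balance.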

A video by Gavin Cawley:

Andy Skuce discussed the role of volcanoes in comparison to human effects, a great discussion of the big picture. Did you know that most volcanoes are under the sea? On balance, though, the way undersea volcanoes work actually results in a lot of CO2 being consumed there. Mt Etna is one of the more prolific volcanic CO2 producers, yet it still emits less than human fossil fuel burning in nearby Sicily alone. Also, a third of land-based volcanic CO2 is offset by weathering of volcanic surfaces. Overall, humans release 60–100 times more Carbon than volcanoes do.

Gavin Cawley talks more about the Carbon Cycle, asking how long it would take the Carbon Cycle to return CO2 to pre-industrial levels if humans got out of the game today. Myth: CO2 has a short lifespan in the air. A given CO2 molecule may have a short lifespan, but the total amount of CO2 does not change quickly. Jelly beans are invoked in a helpful analogy. Answer: it is a slow process; much of the excess will be drawn down over 50–200 years, but the total adjustment time is long, on the order of thousands of years. Also, Gavin Cawley and his wife need to talk more about their checkbook and their private jelly bean stash.
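The distinction between a molecule's residence time and the adjustment time of the total can be sketched numerically. This is a toy calculation, not a real carbon cycle model; the stock, gross exchange flux, and net uptake rate below are round numbers assumed only for illustration:

```python
# Toy numbers (GtC and GtC/yr), chosen only to illustrate the distinction.
ATMOSPHERE     = 850.0   # total carbon in the air
GROSS_EXCHANGE = 210.0   # carbon swapped each year with ocean and biosphere
EXCESS         = 240.0   # human-added excess above pre-industrial
NET_UPTAKE     = 0.005   # fraction of the excess removed per year (assumed)

# Residence time of one molecule: how soon it swaps out (and is replaced).
residence_time = ATMOSPHERE / GROSS_EXCHANGE

# Adjustment time of the excess: swapping molecules doesn't shrink the
# total; only the small net uptake does, so the decline is slow.
years, excess = 0, EXCESS
while excess > EXCESS / 2:          # years for half the excess to go
    excess *= 1.0 - NET_UPTAKE
    years += 1

print(round(residence_time))   # ~4 years for any given molecule
print(years)                   # ~139 years for half the excess
```

So a molecule cycles out in a few years (the "short lifespan" the myth seizes on), while halving the excess takes on the order of a century even with these generous assumptions, and the tail stretches far longer.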

Dr Joanna House, Corinne Le Quéré, Professor Tim Osborn, Professor Dan Lunt, Professor Lonnie Thompson, Professor Pierre Friedlingstein and Professor Mauri Pelto talk about the Carbon Cycle. Seasonal cycles, longer term cycles. Plants, ocean, etc. Natural fluxes are roughly in balance; human emissions provide a rapid positive flux. We have not had this CO2 level in 800 thousand years. NOTE: That does not mean that CO2 stood at 400ppm 800K years ago. It is just that we have a good ice core record 800K years long. To reach 400ppm you have to go back much farther in time, millions of years. The time scale for ocean mixing is thousands of years. About 65–80% of the human-released CO2 would go away (if humans stopped emitting) within 2 to 200 years. The remaining 20–35 percent would take thousands of years.

Greenhouse Effect

Mark Richardson: Big picture, i.e., different planets. Good description of how the greenhouse effect works, with a cool infrared camera demonstration and a pyrgeometer.

Mark Richardson: Looking at the first ever greenhouse effect myth, dating back to 1900. “Knut Ångström and his assistant did an experiment.” It seemed to show that the effects of CO2 could saturate. It can, but not at levels actually relevant in our atmosphere.

Sarah Green on reinforcing feedbacks. Chicken-and-egg problems, causality. What do ice cores say about changes in the Carbon Cycle over long periods of time? Does warming increase CO2, or does increased CO2 cause warming? Yes. The false dichotomy myth. This is pretty important and a bit complex, so I’ll just put the video here:

From the experts, Professor Naomi Oreskes, Dr Ed Hawkins, Professor Mike Mann, Professor Simon Donner, Professor Richard Alley, Professor Eric Rignot, Professor Jonathan Bamber and Professor Lonnie Thompson talk about the greenhouse effect, the history of research on the greenhouse effect, etc. Excellent video to show Uncle Bob:

Fingerprints

Mark Richardson examines the fingerprint of changing structure of heat distribution in the atmosphere (including the famous Tropical Hotspot, which is not a real fingerprint, though maybe it is like a partial print). (See this recent post on a related topic.)

Sarah Green on Satellite measurements of outgoing radiation. This is a key fingerprint of changes in energy balance.

Are you taking the course? You should check it out, here.

Secular Americans are more numerous than Catholic Americans

The “Nones” are rising, at the expense of the Nuns.

From “Openly Secular”:

Recently, two studies have been released that affirm the number of nonreligious Americans is rising. For several years, the percentage of secular Americans has been increasing rapidly and the media has been reporting on it. But dig a little deeper, and it becomes abundantly clear that this new data is fundamentally different and demonstrates a significant shift in the hearts and mind of American citizens, and critically, American voters.

These new studies, one conducted by the Pew Research Center on Religion & Public Life, the other by the Public Religion Research Institute (PRRI) reveal that not only are there more religiously unaffiliated Americans (often called “nones”), but more of these people are calling themselves atheists or agnostics. At the same time, the number of Americans who identify as Christians is shrinking.

Since 1993, the percentage of Americans who claim no formal religious affiliation has grown from 9% to 22% according to the PRRI study. No other group has risen as sharply. Not only is the overall number of unaffiliated Americans surging, a greater proportion of these “nones” are identifying as secular. The Pew study shows that 31% of the “nones”—representing 17 million Americans—self identify as atheist or agnostic, up from 25% in 2007. An additional 39% of the “nones” say that religion is not important to them, which means 15.8% of the total population of the United States is atheist, agnostic, or secular.

Almost sixteen percent of potential voters carries a lot of political clout. A group of voters this large who believe in the separation of church and state simply cannot be ignored.

“This data is particularly meaningful as the 2016 presidential election approaches,” says Todd Stiefel, Chair of Openly Secular. “For the first time, politicians will not be able to ignore the substantial voting bloc of secular Americans. We have real clout and we vote. As the number of secular people continues to rise, more and more Americans will come to realize that many of their loved ones, next-door neighbors and coworkers do not believe in God, and that they are still moral, good people.”

Seepage: Climate change denial and its effect on the scientific community

The title of this post is also the title of a new peer reviewed paper by Stephan Lewandowsky, Naomi Oreskes, James Risbey, Ben Newell and Michael Smithson, published in Global Environmental Change. The article is Open Access, available here. Stephan Lewandowsky has a blog post on it, in which he notes,

… we examine the effect of contrarian talking points that arise out of uncertainty on the scientific community itself. We show that although scientists are trained in dealing with uncertainty, there are several psychological and cognitive reasons why scientists may nevertheless be susceptible to uncertainty-based argumentation, even when scientists recognize those arguments as false and are actively rebutting them….

We highlight three well-known psychological mechanisms that may facilitate the seepage of contrarian memes into scientific discourse and thinking: ‘stereotype threat’, ‘pluralistic ignorance’ and the ‘third-person effect’.

Stereotype threat refers to the emotional and behavioural responses when a person is reminded of an adverse stereotype against a group to which they belong. Thus, when scientists are stereotyped as ‘alarmists’, a predicted response would be for them to try to avoid seeming alarmist by downplaying the degree of threat. There are now several studies that highlight this tendency by scientists to avoid highlighting risks, lest they be seen as ‘alarmist.’

Pluralistic ignorance describes the phenomenon which arises when a minority opinion is given disproportionate prominence in public debate, resulting in the majority of people incorrectly assuming their opinion is marginalized. Thus, a public discourse that asserts that the IPCC has exaggerated the threat of climate change may cause scientists who disagree to think their views are in the minority, and they may therefore feel inhibited from speaking out in public.

Finally, research shows that people generally believe that persuasive communications exert a stronger effect on others than on themselves: this is known as the third-person effect. However, in actual fact, people tend to be more affected by persuasive messages than they think. This suggests the scientific community may be susceptible to arguments against climate change even when they know them to be false.

I have little to add beyond Stephan’s overview (Sou has this, go check it out), and you can read the paper itself. I do think these questions are part of an even larger issue: the influence of systematic, well funded, constant denialism on both the science and the implementation of science as public policy. Imagine if all the effort spent addressing contrarian claims were spent on doing more science, or on translating science into policy. One could argue that questioning the science strengthens it, and in many cases that may be true. But the denialism about climate science does not play that role. Contrarian arguments are not valid questions about the science, but rather, little more than self indulgent contrived nefarious sophistic yammering. That is not helpful; it does not strengthen the science. Rather, denialism has served to slow down the implementation of sensible energy policies, and has probably slowed down our collective effort to the extent that had the denialism been ignored, or had it not existed to begin with, we would be decades ahead of where we are now in addressing the existential issue of our time.

Close Call: Amtrak Train May Have Nearly Hit Oil Train

Oil train derailments are becoming more common, mainly because of the very large number of oil trains, often with over 100 tank cars, taking oil out of the Bakken fields and bringing it to coastal refineries or storage facilities.

You are certainly aware of the recent Amtrak derailment in Pennsylvania. From Reuters:

An Amtrak train in Philadelphia was traveling at more than 100 miles per hour, over twice the speed limit, when it entered a curve in the tracks and derailed, killing seven people and injuring more than 200, federal investigators said on Wednesday.

Now, Patrick Kerkstra at Philadelphia Magazine (Citified) is reporting that the train may have come dangerously close to colliding with an oil train.

There’s a terrifying sight in the background of some photos from Tuesday night’s horrific Amtrak derailment: a series of black rail cars, the color and shape of Tootsie Rolls.

They look like standard rail tanker cars, and while we don’t yet know for certain what was in them, it probably wasn’t corn syrup. In fact, there’s a good chance those tankers were filled with crude oil.

(The image is at the top of the post.)

Kerkstra notes that between 45 and 80 of these trains pass through Philadelphia weekly.

Another photograph at Philadelphia Magazine shows the derailed Amtrak train crossing a track with at least one oil car on it, a few hundred feet up the track, and other oil trains on adjoining tracks that were not overrun by the Amtrak cars. Looking at that photo, it appears to be dumb luck that no oil train cars happened to be right where the Amtrak cars eventually came to rest.

Another photograph, tweeted, shows what looks like the lead Amtrak car, which travelled much farther from the tracks, just a few feet from an oil train.

This is Conrail’s train yard. When asked what was in the tanker cars, Conrail is reported to have claimed that this information is confidential. The Philadelphia Magazine report indicates that an unattributed NTSB person was told that the oil cars were empty. Empty oil cars are, of course, not necessarily empty of explosive fumes and residue, and this report is not confirmed.

The fact is, it didn’t happen. But another fact is, apparently, that it could have.

A Global Warming Expectation Confirmed: Upper Troposphere Warming

Note: The original title of this post was “A Global Warming Fingerprint Confirmed: Upper Troposphere Warming” because I was thinking that upper troposphere warming was a fingerprint. John Cook contacted me to let me know that he didn’t think it was. The reason it is not is that more than one thing can cause upper tropospheric warming, not just AGW. However, it does turn out to be more complicated than that. Various people claiming that a lack of UT warming was evidence of no warming have now been shown wrong, but even a lack of warming is not, if you will, an anti-AGW fingerprint. In the end it turns out to be a very complicated phenomenon, and it would probably take me five blog posts to adequately relate the conversation I’ve had over the last 24 hours with various climate scientists about the details.

John Abraham will be having something on this in the Guardian very soon; I’ll put a link here. And now, back to the original post:

Global warming is real, and caused by the release of human generated greenhouse gas pollution. We can measure the greenhouse gas concentrations (mainly CO2) and we can measure surface warming and upper ocean warming. But global warming should have a number of additional indicators, predicted by modeling or other aspects of climate science, and identifying those indicators both confirms the overall idea of global warming and helps us understand its effects.

The troposphere is the lower layer of the Earth’s atmosphere, where about three quarters of the air and most of the water vapor resides. It is about 7 to 20 kilometers thick, thickest in the tropics and thinnest near the poles. Climate models and thermodynamic calculations predict that the upper troposphere in the tropics should experience warming. Temperatures at this altitude are measured using radiosonde technology, a package of instruments sent aloft on a weather balloon. But there are a lot of problems with the data. Instruments that measure temperature are ideally situated in one spot and properly enclosed. Such instruments dangling off the end of a balloon flying through the air are subject to heating and cooling from various uncontrolled effects, such as sunlight (vs. not), air movements, etc. Given the low quality of the data, it has been hard to observe upper troposphere temperatures.

Satellite measurements of temperature variation in the troposphere are not as useful as one might hope, because those instruments tend to average temperatures over a much larger swath of the atmosphere than is suitable for detecting the expected warming.

The new study, Atmospheric changes through 2012 as shown by iteratively homogenized radiosonde temperature and wind data (IUKv2), by Steven Sherwood and Nidhi Nishant, takes a new approach. From the Abstract:

We present an updated version of the radiosonde dataset homogenized by Iterative Universal Kriging (IUKv2)… This method, in effect, performs a multiple linear regression of the data onto a structural model that includes both natural variability, trends, and time-changing instrument biases, thereby avoiding estimation biases inherent in traditional homogenization methods. One modification now enables homogenized winds to be provided for the first time. …Temperature trends in the updated data show three noteworthy features. First, tropical warming is equally strong over both the 1959–2012 and 1979–2012 periods, increasing smoothly and almost moist-adiabatically from the surface (where it is roughly 0.14 K/decade) to 300 hPa (where it is about 0.25 K/decade over both periods), a pattern very close to that in climate model predictions. This contradicts suggestions that atmospheric warming has slowed in recent decades or that it has not kept up with that at the surface. Second, as shown in previous studies, tropospheric warming does not reach quite as high in the tropics and subtropics as predicted in typical models. Third, cooling has slackened in the stratosphere such that linear trends since 1979 are about half as strong as reported earlier for shorter periods.

This graphic shows the relative warming of the upper troposphere in the tropics:

Upper_Troposphere_Warming_In_Tropics

This is not the first study showing this, but rather, a significant update clarifying the observations.

Several modifications have been introduced, which have not had a large effect on estimated long-term trends in temperature but have enabled us to present a homogenized wind dataset in addition to that for temperature.

The warming patterns shown in the revised dataset are similar to those shown in the original study except that expected patterns now appear somewhat more clearly. These include a near-moist-adiabatic profile of tropical warming with a peak warming rate of 0.25–0.3 K/decade near 300 hPa since either 1959 or 1979. This is interesting given that (a) many studies have reported less-than-expected tropospheric warming, and (b) there has been a slowing of ocean surface warming in the last 15 years in the tropics. We support the findings of other recent studies … that reports of weak tropospheric warming have likely been due to flaws in calibration and other problems and that warming patterns have proceeded in the way expected from models. Moreover our data do not show any slowdown of tropical atmospheric warming since 1998/99, an interesting finding that deserves further scrutiny using other datasets

Another outcome of this research is the observation of increased winds in the Southern Hemisphere. This increase may be related to Ozone depletion at the southern end of the planet, a phenomenon that may also account for increasing extent of winter sea ice around Antarctica. Quoted in PhysOrg, author Steve Sherwood notes, “I am very interested in these wind speed increases and whether they may have also played some role in slowing down the warming at the surface of the ocean.”

John Abraham has also written up this research, here.

Since we were talking about “fingerprints,” I thought you might like to watch this video by John Cook discussing global warming fingerprints. The video is from the MOOC “Denial101x: Making Sense of Climate Science Denial.”

Bjorn Lomborg’s WSJ Response to Nixing of Australian Project

Bjorn Lomborg has written an Op Ed in the Wall Street Journal lamenting the decision of the University of Western Australia (UWA) to nix previously developed plans to accept a $4 million payment from the conservative Australian government, to be matched by university money, to implement a version of Lomborg’s Copenhagen Consensus Center there, to be known as Australia Consensus.

See: Bjorn Lomborg Is Wrong About Bangladesh And Sea Level Rise

See: Bjørn Lomborg WSJ Op Ed Is Stunningly Wrong

See: Are electric cars any good? Lomborg says no, but he’s wrong.

Lomborg’s scholarship in the area of climate and energy related policy has been repeatedly criticized and often described as far less than adequate. A typical Bjorn Lomborg missive on climate or energy policy seems to include instance after instance of inaccuracies, often taking the form of a statement of fact with a citation where that fact or assertion is not actually found in the citation. Many regard his policies as “lukewarm.” From the highly regarded Skeptical Science website:

…examples of Luckwarmers include Matt Ridley, Nic Lewis, and Bjorn Lomborg. The University of Western Australia has been caught up in a major Luckwarmer controversy, having taken federal funds to set up a center from which Lomborg was expected to argue that the government’s money would be better spent on issues other than curbing global warming. In a sign that even Stage 3 climate denial is starting to become untenable, the resulting uproar forced the university to cancel plans for the center.

The UWA project received a great deal of criticism, and was seen by many as a move by Big Fossil to water down academic and government response to the critical issue of climate change. Graham Readfearn, writing for The Guardian, notes:

Danish political scientist and climate change contrarian Bjørn Lomborg says the poorest countries in the world need coal and climate change just isn’t as big a problem as some people make out.

Australia’s Prime Minister Tony Abbott says “coal is good for humanity” and there are more pressing problems in the world than climate change, which he once described as “crap” but now says he accepts.

So it’s not surprising then that the latter should furnish the former with $4 million of taxpayer funds to start an Australian arm of Lomborg’s Copenhagen Consensus Centre (CCC) at the University of Western Australia’s business school.

The Australian project was shut down after severe criticism from the global academic community as well as students and faculty within UWA. Predictably, Lomborg has characterized this as an attack on free debate. From the Op Ed: “Opponents of free debate are celebrating. Last week…the University of Western Australia canceled its contract to host a planned research center, Australia Consensus, intended to apply economic cost-benefit analysis to development projects—giving policy makers a tool to ensure their aid budgets are spent wisely.”

While Lomborg blames “activists” for shutting down the center, it is more widely believed that the project was criticized because, based on prior work done by Lomborg, any ensuing “cost-benefit analyses” would be academically weak and policy-irrelevant.

Central to the difference in overall approach (aside from allegations of poor scholarship) between Lomborg and many others is how poor or developing nations should proceed over coming decades. Lomborg seems to advocate that these nations go through the same economic and technological evolution as developed nations, building an energy infrastructure based mainly on fossil fuels, in order to industrialize and reach the standard of living presumed desired by those who live in those nations. The alternative, of course, is that development in these regions be done with lessons learned from the industrialized and developed world. We don’t ask rural Kenyans to install a wire-based analog phone system before using modern digital cell phone systems. With respect to energy, developing regions should implement clean energy with smart distribution rather than building hulking coal plants and committing for centuries to come to expensive and extensive electric grid systems that are now generally regarded as outdated.

Lomborg says enough about mitigating climate change effects, and developing green energy technologies, to be able to suggest that he supports these ideas when he is pushed up against the wall, as with the nixing of the Australian project. But his regular statements on specific policy points, frequent and well documented, tell a different story.

Lomborg claims that much of the policy development of the Copenhagen Consensus Center is not even about climate change. To the extent that this is true, it may be part of the problem. As development occurs, energy is key. With development of energy technologies, climate change is key. Lomborg’s point that the Copenhagen projects are mostly not about climate change is not an argument that he is doing something right. It is evidence that he is doing something wrong, and at the same time, is apparently unaware of this.

It is very important to remember, as this conversation unfolds, that the objections to Lomborg’s work, and to spending vast sums of money to support it, are only partly because of differences in approach. These objections also come from two other things. One is a sense that Lomborg is detached from scholarship and good analysis.

Graham Readfearn has documented academic response to Lomborg’s work. Here is one example:

Dr Frank Jotzo, director of the Centre for Climate Economics and Policy at the Australian National University, was invited in 2008 to write a paper for Lomborg’s centre; the paper was sharply critical of how the costs of the impacts of climate change were treated.

He told me:

Within the research community, particularly within the economics community, the Bjorn Lomborg enterprise has no academic credibility. It is seen as an outreach activity that is driven by specific set of objectives in terms of bringing particular messages into the public debate and in some cases making relatively extreme positions seem more acceptable in the public debate.

And, regarding energy policy vis-a-vis the Big Fossil,

…we had a look at Lomborg’s claims that the world’s poorest were crying out for more fossil fuels which, Lomborg argued, were the only real way they could drag themselves out of poverty…the positions Lomborg takes on these issues are underpinned by a nasty habit of picking the lowest available estimates of the costs of climate change impacts.

Last year, when Lomborg spoke to a coal company-sponsored event in Brisbane in the shadow of the G20 talks, Lomborg suggested that because the International Energy Agency (IEA) had developed one future scenario that saw growth in the burning of coal in poor countries, in particular in sub-Saharan Africa, that this somehow meant that fossil fuels were just what they needed.

Yet Lomborg ignored an important rejoinder to that assessment, which had come from the IEA itself, and which I pointed out at the time.

The IEA said its assessment for Africa was consistent with global warming of between 3C and 6C for the continent by the end of this century.

Lomborg’s prior written works could be, and actually have been (I am told), used in coursework on analytical approaches to policy as bad, not good, examples. And, although Lomborg often associates himself with Nobel Prize Winners (and rarely fails to note that) he is not known as a high powered, influential scholar in his area. A recent citation analysis of Lomborg’s work backs up that concern:

…I combed through his Google Scholar entries and dumped all the duplicates, I ignored all the magazine and newspaper articles (e.g., you can’t count opinion editorials in The Wall Street Journal as evidence of an academic track record), I cut out all non-articles (things Lomborg hadn’t actually written), omitted any website diatribes (e.g., blog posts and the like) and calculated his citation profile.

Based on my analysis, Lomborg’s Google Scholar h-index is 4 for his peer-reviewed articles. If I was being particularly generous and included all of Lomborg’s books, which have by far the most citations, then his h-index climbs to 9. However, none of his books is peer-reviewed, and in the case of his most infamous book, The Skeptical Environmentalist, it has been entirely discredited. As such, any reasonable academic selection committee would omit any metrics based on opinion-based books.

So, the best-case scenario is that Lomborg’s h-index is no more than 4. Given his appointment to Level D (Associate Professor) at a world-class university, the suggestion that he earned it on academic merit is not only laughable, it’s completely fraudulent. There is no way that his academic credentials had anything to do with the appointment.

Even a fresh-out-of-the-PhD postdoc with an h-index of only 3 or 4 would have trouble finding a job. As a rule of thumb, the h-index of a Level D appointment should be in the 20–30 range (this would vary among disciplines). Despite this variation, Lomborg’s h-index is so far off the mark that even accounting for uncertainty and difference of opinion, it’s nowhere near a senior academic appointment.
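For readers unfamiliar with the metric used in the quoted analysis: the h-index is the largest h such that h of a researcher's items each have at least h citations. A minimal sketch, with entirely hypothetical citation counts (not Lomborg's actual numbers), shows how a profile with a few highly cited items but a thin tail stays low:

```python
def h_index(citations):
    """Largest h such that h items have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank        # this many items clear the bar so far
        else:
            break
    return h

# Hypothetical profiles, for illustration only:
papers_only = [10, 8, 5, 4, 2, 1, 1]      # a thin peer-reviewed tail
with_books  = papers_only + [300, 200]    # add two heavily cited books

print(h_index(papers_only))   # 4 -- the fourth-ranked item has 4 citations
print(h_index(with_books))    # 5 -- even huge outliers barely lift it
```

This is why a couple of best-selling books cannot rescue an h-index: the metric rewards a deep, consistently cited body of work, not a few outliers.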

The other problem people see with Lomborg’s efforts is the sense that the Copenhagen Institute is a bit of a sham, and that Lomborg is not selling informed expertise, but rather, snake oil. From a recent analysis of the status of the Copenhagen Consensus Center:

Copenhagen Consensus Center is a textbook example of what the IRS calls a “foreign conduit” and it frowns strongly on such things. It may also frown on governance and money flows like this…

CCCMoney2

…more than 60% went directly to Lomborg, travel and $853K promotion of his movie. According to Wikipedia it grossed $63K…

Even in a simple US charity, poor governance and obvious conflicts of interest are troublesome, but the foreign element invokes stringent extra rules. Legitimate US charities can send money to foreign charities, but from personal experience, even clearly reasonable cases like foreign universities require careful handling. It is unclear that Lomborg himself is a legitimate charity anywhere, but most of the money seems under his control. One might also wonder where income taxes are paid.

CCC seems to break many rules. Foreign citizen Lomborg is simultaneously CCC founder, president, and highest-paid employee. Most people are a little more subtle when trying to create conduits…

This is apparently the Copenhagen Consensus Center, Copenhagen Consensus Center USA, 262 Middlesex St, Lowell MA .
Both the flow of money and its sources matter when thinking about a nonprofit research or policy institution. From DeSmog Blog:

A billionaire “vulture capitalist” and major backer of the US Republican Party is a major funder of the think tank of Danish climate science contrarian and fossil fuels advocate Bjørn Lomborg, DeSmogBlog has found.

New York-based hedge fund manager Paul Singer’s charitable foundation gave $200,000 to Lomborg’s Copenhagen Consensus Center (CCC) in 2013, latest US tax disclosures reveal.

That was about a third of the CCC’s donations for the year 2013.

Lomborg, who claims not to be a climate skeptic, is the author of “The Skeptical Environmentalist” and the book and movie “Cool It.”

Global Surface Temperatures Continue To Rise

Global warming is typically measured at the surface, with data from thermometers all across the land areas and sea surface temperatures combined. That isn’t the whole story, of course. Much of the added heat, an effect of human generated greenhouse gas pollution, goes into the upper 2,000 meters or so of the ocean. But we use the surface measurements to track global warming because we have the data for a long period of time, and those data in turn have been linked to longer running but less precise paleo data.

Almost every month for way over a year now has been warm, and April 2015 is no exception. The official NASA GISS surface temperature data is out for the month. April came in at an anomaly value of 75. That’s 0.75 degrees C above the base period for that data set.

That makes April the fifteenth warmest month in the database, which starts in 1880. March was warmer (fifth) and I’m pretty sure May is going to be warmer too (maybe as warm as March, we’ll see).

If we start in April and work backwards 12 months, we can get a one year long average. And, we can go back to the beginning of the data set to look at a 12-month running average for the entire thing. When we do that, we see that the current 12 month period, the one just ending, was the warmest ever in the data set, which makes it likely the warmest one year period in thousands and thousands of years. We have had several record breaking 12-month periods in a row. Here’s what that looks like:
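The 12-month running average is simple to compute. A minimal sketch, using made-up anomaly values in GISS's units of hundredths of a degree C (real input would be the monthly GISS series):

```python
def running_mean(values, window=12):
    """Trailing mean; first value appears once a full window exists."""
    return [sum(values[i - window:i]) / window
            for i in range(window, len(values) + 1)]

# Thirteen made-up monthly anomalies (hundredths of a degree C):
monthly = [68, 70, 75, 71, 69, 74, 77, 80, 76, 72, 79, 75, 83]

means = running_mean(monthly)
# Each entry averages the most recent 12 months, so a record-warm
# 12-month period shows up as a new maximum of this series. Dividing
# by 100 converts to degrees C (e.g., 75 -> 0.75 C).
print([round(m / 100.0, 3) for m in means])   # [0.738, 0.751]
```

Because each value shares eleven months with its neighbor, the running mean smooths out month-to-month weather noise, which is why a string of record 12-month periods is a stronger signal than any single record month.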

GlobalSurfaceTemperature12-MonthMovingAverage_2015_April

Another way of looking at this is for the calendar year only. This is a graph of the average temperature for the first four months of the year for the entire NASA GISS data set, to give you an idea of where our Year-To-Date stands:

[Figure: average January–April temperature by year, NASA GISS data set]

Will 2015 end up being a very warm year? Almost certainly. Unlike last year, we will see significant enhancements of surface heat from El Nino. This is probably already happening. Using only the year to date data to estimate 2015, it is a good sight warmer than any other calendar year on record. As noted, May will be warm (over 0.80) and subsequent months are likely to be even warmer if El Nino does pan out.

The following graphic indicates where the remaining months of the year will have to fall (on or above the horizontal red line) for 2015 to break the record set by 2014.

[Figure: how warm the remaining months of 2015 must be for the year to break the 2014 record]

Does Cell Phone Use Cause Cancer? No.

Josh Harkinson at Mother Jones recently posted an item called “Scores of Scientists Raise Alarm About the Long-Term Health Effects of Cellphones.” I like Josh’s work, but there are some problems with this article I want to point out, some of which parallel problems in the more general discussion of cell phone safety.

Before looking at the Mother Jones piece, here’s the bottom line: There is no known mechanism by which cell phone use can lead to cancer (usually, brain cancer is of concern). There have been many studies on this and related issues. They vary in quality and in what they look at. The studies that seem to indicate an increase in some kind of cancer with cell phone use would indicate a shift from a very very very unlikely chance of cancer to a very very unlikely chance of cancer. So if there is an effect reflected in this research, it is very small. The studies that seem to show a link are generally done by a limited group of researchers, use methodology that is not reliable and can not be used to attribute cause, and are situated within a literature that includes many studies that show no link. Different studies that may show a link between cell phone use and cancer often indicate a link to a different cancer. And, tellingly, brain cancer rates over recent decades are basically flat, cell phone use explosive. If cell phones increase the risk of brain cancer, it is a phenomenon that a lot of research has failed to clearly demonstrate, and if the effect is there, it is very small and entirely unexplained by physics or physiology.

Josh Harkinson discusses an important topic but does so in a way that is uncomfortably click-baity. (I assume this is in part the effect of the editors who chose the title and possibly the accompanying graphic.) The title implies that science raises a concern (an alarm) and the article is accompanied by a doctored photograph of a woman using a cell phone; she is wincing as though suffering a health effect and red cell phone cancer-kooties are seeping into her head right there in the picture. The subtitle invokes the children: “Children in particular may be vulnerable.” And the article begins with an appeal to latent distrust: “Are government officials doing enough to protect us?”

The article stems from a letter signed by “195 scientists from 39 countries” who “have collectively published more than 2,000 peer-reviewed papers on the subject.” How many scientists deal with the topic of non-ionizing radiation (the kind of kootie stuff that emanates from your cell phone) interacting with tissue (what your head is made out of)? I’m not sure, but a Google Scholar search on the term “biological health effects non-ionizing radiation” yields over 14,000 results. There are probably tens of thousands of scientists who work in the general area of radiation-cell interaction. This is a huge and important area of research. Various kinds of radiation have health consequences. Radiation interacting with tissues is a widespread form of therapy and imaging (everything from x-rays to MRI). The properties of various kinds of radiation and the activity of molecules in cells is part of a lot of basic research in a lot of fields. Here in Minnesota, there are probably way over 200 scientists who routinely engage in research either about or relying on the basic physics and physiology of radiation-cell interaction. It is a big area, only some of which directly addresses health effects of non-ionizing radiation, but even that small percentage involves a lot of work, many research labs, a large number of scientists, and a lot of publications.

The letter and information about it can be found here. Watch the video. Note that the “scientists” are actually “scientists and engineers,” an unintended dog-whistle indicating the padding of a consensus claim. The letter is not about people holding cell phones to their heads. It is about EMF in general (with a focus on cell phones), and suggests that ambient EMF, including power lines, is the problem. This borders on Chemtrail-like ideation. I strongly recommend you watch the video. Critically.

A letter with under 200 signers (across 39 countries) who claim to have published a couple of thousand papers on a topic is numerically weak. The reality and importance of anthropogenic global warming is a scientific consensus. Even so, climate science denialists have come up with lists and letters like this with much more impressive numbers, but they amount to nothing. There are a lot of scientists out there, about seven million of them. It is not hard to find a couple hundred who strongly believe something that many many more don’t accept as likely. Josh’s article does not address this context, and probably should.

That cell phones may cause cancer has been officially designated by the World Health Organization as “possible.” That sounds bad. But people need to understand, and Josh did not point this out, that the “possible” category includes anything where there is virtually any research indicating a possible link, even crappy research, and even if the research exists among a huge body of research that fails to indicate a link. There are many different categorizations of cancer risk, and different organizations maintain these definitions and lists. The International Agency for Research on Cancer, part of WHO, has these categories:

Group 1: Carcinogenic to humans
Group 2A: Probably carcinogenic to humans
Group 2B: Possibly carcinogenic to humans
Group 3: Unclassifiable as to carcinogenicity in humans
Group 4: Probably not carcinogenic to humans

Items in Group 1 are real problems. They cause cancer and include such things as silica dust, Radon, Soot, Tobacco, and Thorium. Group 2A (Probable) is pretty long and includes a lot of nasty stuff with multi-syllabic names, as well as ultraviolet radiation. Being a hairdresser is a probable cause of cancer because of exposure to chemicals, as is working in a petroleum refinery, or being a shift worker whose hours change on a regular basis. These are things we may want to worry about; people still argue about them, but, as they say, they probably are linked to cancer.

Group 2B, “possible,” the list cell phones are in, is very long, over 900 items, of which about a third are specifically considered possibly linked to human cancers (the others not linked to humans). This list also includes a lot of scary looking stuff, but for which there is insufficient research to actually make the link. Vinyl acetate is an example. It is a liquid precursor for a polymer used to make a lot of stuff. Wikipedia tells us, “On January 31, 2009, the Government of Canada’s final assessment concluded that exposure to vinyl acetate is not considered to be harmful to human health. This decision under the Canadian Environmental Protection Act (CEPA) was based on new information received during the public comment period, as well as more recent information from the risk assessment conducted by the European Union.” So that is an example of a scary sounding thing for which some research may have shown a cancer link but that was ultimately determined by at least one major agency to not be cancer causing. Potassium bromate. Used for a lot of things, it is in some of your food (baked goods, mainly). It is banned in many countries, not in the US. In theory, it is broken down during baking. Coffee. There has been some research indicating a link between coffee consumption and bladder cancer, but other studies show a reduced risk of intestinal cancer. Overall, the evidence for any of this is weak.

Group 2B listing is used when there is limited evidence of a cancer link and usually insufficient evidence for a link in lab animals. Let’s put a finer point on it by looking at what the IARC says about the 2A and 2B categories:

Group 2A: The agent is probably carcinogenic to humans.

This category is used when there is limited evidence of carcinogenicity in humans and sufficient evidence of carcinogenicity in experimental animals. In some cases, an agent may be classified in this category when there is inadequate evidence of carcinogenicity in humans and sufficient evidence of carcinogenicity in experimental animals and strong evidence that the carcinogenesis is mediated by a mechanism that also operates in humans. Exceptionally, an agent may be classified in this category solely on the basis of limited evidence of carcinogenicity in humans. An agent may be assigned to this category if it clearly belongs, based on mechanistic considerations, to a class of agents for which one or more members have been classified in Group 1 or Group 2A.

So, if you want to be careful, avoid Group 2A items. They may cause cancer, and you should worry about them.

Group 2B: The agent is possibly carcinogenic to humans.

This category is used for agents for which there is limited evidence of carcinogenicity in humans and less than sufficient evidence of carcinogenicity in experimental animals. It may also be used when there is inadequate evidence of carcinogenicity in humans but there is sufficient evidence of carcinogenicity in experimental animals. In some instances, an agent for which there is inadequate evidence of carcinogenicity in humans and less than sufficient evidence of carcinogenicity in experimental animals together with supporting evidence from mechanistic and other relevant data may be placed in this group. An agent may be classified in this category solely on the basis of strong evidence from mechanistic and other relevant data.

If you focus on the word “cancer” Group 2B may be scary to you, but many items on this long list are those for which we simply can not say there is no research project ever done that showed a possible link.

Josh notes that “For decades, some scientists have questioned the safety of EMF, but their concerns take on a heightened significance in the age of ubiquitous wifi routers, the Internet of Things, and the advent of wearable technologies like the Apple Watch and Fitbit devices, which remain in close contact with the body for extended periods.”

This points to a possibly unintended side effect of unnecessary concern over non-ionizing radiation. Non-ionizing radiation is radiation that does not alter matter at the subcellular level in a way that can lead to cancer or other negative effects. See this writeup for more detail on this important difference. The strength of any radiation drops off dramatically with distance; for a point source, it falls with the square of the distance. As your smart phone and your wi-fi router exchange information (when it is using that pathway to interact with the internet), the energy that comes out of the box across the room and the energy that comes out of the smart phone in your hand, measured at the point of, say, your nose (a proxy for your brain that allows us to discount the effects of your skin, skull, and dura mater reducing the signal), are many orders of magnitude apart. Conflating concern over a cell phone pressed to your head with concern for wi-fi routers is like conflating concern over drowning in a pool with concern over drowning in the vapor that evaporated from the pool, which gives you that dank feeling as you sit nearby drinking your iced coffee drink from a polyvinyl acetate cup.

Except there really is a demonstrable risk of drowning in a pool.
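The “many orders of magnitude” claim is just the inverse-square law at work. Here is a toy comparison; the transmit powers and distances are assumed round numbers (roughly 0.5 W for a phone held 2 cm from the head, 0.1 W for a router 3 m away), not measurements:

```python
import math

def power_density(watts, meters):
    """Free-space power density (W/m^2) at a given distance,
    treating the transmitter as an idealized isotropic point source."""
    return watts / (4 * math.pi * meters ** 2)

phone = power_density(0.5, 0.02)   # assumed: 0.5 W phone ~2 cm away
router = power_density(0.1, 3.0)   # assumed: 0.1 W router ~3 m away

# The phone at the ear delivers on the order of 100,000 times the
# power density of the router across the room.
print(f"phone/router exposure ratio: {phone / router:,.0f}x")
```

Real antennas are not isotropic and heads are not vacuum, but those refinements change the ratio by factors, not by the orders of magnitude that separate the two cases.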

This is a problem because there is a movement to remove wi-fi from all public spaces over health concerns. That is crazy talk. I wish Josh had noted that in his piece. The people who signed this letter are those same people … who want to remove wi-fi from your coffee shop.

A very very small number of researchers want to move cell phones from Group 2B to Group 2A, but even as they are asking for this, continued research on the cancer risk of cell phones a) fails to produce a mechanism by which this can happen despite a great deal of knowledge about radiation-tissue interaction and b) continues to show a possible link only in studies that are inherently flawed in their methodology. Such studies, mainly case-control studies, rely on people recalling their use of cell phones. People with brain cancer are asked to recall their cell phone use, and matched randomly chosen people without brain cancer are asked to do the same thing. (Not all studies are done just that way but key studies of relevance here were.) That is a great way to get a preliminary look at a possible health issue, but it is simply not how the actual connection between a substance, a technology or a behavior and a health effect is made.

We understand a lot about energy-tissue interaction. If non-ionizing radiation from cell phones caused cancer, we would have an inkling of the mechanism. We don’t. Cell phone use has exploded in recent decades, brain cancer has not. If cell phones caused brain cancer, it would be a visible epidemiological phenomenon. It is not.

I’ve been told (and some checking on the internet has indicated this is maybe important) that some of the material used to make cell phones comes naturally along with some radioactive isotopes. It is possible that these isotopes are not always removed properly. I do not know this is the case, but it is an interesting idea. A while back a shipment of cell phone cases that happened to be radioactive was located (and refused). Holding a radioactive cell phone case to your head several hours a day may be a health risk, though again, I don’t know this to be a fact, it probably depends on all sorts of things. The cell phone-cancer link is so weak that it may be a result of research bias, random effects, recall bias, or some effect related to the use of the cell phone but not to the non-ionizing radiation.

Smart phones are becoming so ubiquitous that they could almost be considered a key trait of our species. It is smart to be smart about smart phones. Worrying about the cancer link is probably not exactly stupid, but it isn’t particularly smart either.


This video addresses many of the topics I touch on here, and more:

Should I Wash My Dishes Before Putting Them In The Dishwasher?

As an anthropologist, I find the interface between technology and the larger culture in which it is embedded fascinating. You all know the old story of the family cook who habitually cuts the ends off the roast before slipping it in the oven. One day her child, hoping some day to be the family cook, asks why this is done. It turns out that nobody can remember, and the matter is dropped. But the question comes up again, at a later family dinner, this one attended by great grandma, who was the family cook a generation ago, and of course, she knows the answer.

“Back in the day,” she says, “It was the depression. We weren’t able to just go to the store and buy whatever we wanted, like people these days.”

Grandma always managed to work in a mention of how poor they were back in the depression. But this time it was relevant. “We had only one roasting pan,” she continued. “It was only 14 inches long and the roast was always a few inches longer. So I’d cut the ends off.”

And of course, ever since then, subsequent generations had learned to cut off the ends of the roast because that is how grandma did it, and there must have been some reason, though nobody knew what it was. And now, the roast, be-ended, sits small in the large stainless steel double handed Williams Sonoma roasting pan.

I think that is how some people load their dishwashers. Back in the day, dishwashers weren’t very good at washing dishes. They were really status symbols that did little more than rinse off the dishes that you’d already scraped and run under the faucet. You put dishes in the dishwasher that already looked pretty clean. The role of the dishwasher was to remove the few remaining cooties (or dog saliva for some households) and, if you kept up the supply of anti-spotting juice, to make sure that the glassware was shiny-clean.

Dishwashers have changed. A reasonably good dishwasher, not even the most expensive or fancy, does a much better job at washing dishes. Even cheap ones, probably. The difference in price between dishwashers is mostly a matter of bells and whistles and whether or not it has a stainless steel front, that sort of thing. Inside, the engineering of how to spray water on dishes from various angles for a very long period of time has been worked out. These days, you only need to remove the large parts, the parts that remain because people these days, unlike back in the depression when there was not enough food, have forgotten that they should finish the food on their plate. Even the chicken bones. Back in the depression, people ate the chicken bones.

When you wash dishes in the sink, you use water and energy. The energy is to heat the water, but also, the water itself requires energy to process and pump. When you wash dishes in the dishwasher, you use energy. Again, heating and getting water are factors, but also, the dishwasher has a pump and may have a water heating element, and of course, a drying element. More on the drying element later.

If you did a complete hand washing job on your dishes, then ran your dishes on a full cycle in the dishwasher, you would be using way more energy and water than required to actually get the dishes clean. But if you only hand wash the dishes a little — scrape the plates, then run them under the water — maybe you are using less energy and water. But the fact remains: if you just scrape the dishes minimally and then put them in the dishwasher straight away, with absolutely no rinsing, you will use a minimal amount of energy.

Some people claim that they do hand washing so efficiently that they are using less energy than a dishwasher would ever use. Such folk eschew the dishwashing machine entirely. However, dishwasher experts claim that this is only rarely the case. The dishwasher uses a small percentage of the water and energy you use in hand washing.

Chris Mooney has written up the current research on dishwashing efficiency. His Washington Post article cites research from the EPA, the Natural Resources Defense Council, and the American Council for an Energy Efficient Economy. The bottom line: Don’t pre-rinse the dishes. Just put the damn dishes in the dishwasher. Oh, and if you think your hand washing is efficient, do consider the possibility that you don’t really know that. You just think that because you want to. It is almost certainly the case that you can’t really prove it, and it is likely, though not certain, that it simply isn’t true. From Mooney’s Washington Post article:

… dishwashers just keep needing less and less water (and energy) because of improving appliance standards, even as they get better and better at using it.

“While it may be possible to use less water/energy by washing dishes by hand, it is extremely unlikely,” Jonah Schein, technical coordinator for homes and buildings in the EPA’s WaterSense program, said…

“In order to wash the same amount of dishes that can fit in a single load of a full size dishwasher and use less water, you would need to be able to wash eight full place settings and still limit the total amount of time that the faucet was running to less than two minutes,” he said.

“…modern dishwashers can outperform all but the most frugal hand washers,” adds the American Council for an Energy-Efficient Economy.

This applies to modern Energy Star rated dishwashers. Which, if your dishwasher is reasonably new, is probably your dishwasher. And by new, we mean up to several years old, because this has been true for a long time. Mooney’s story has further details on exactly what makes dishwashers more efficient.
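The two-minute faucet benchmark quoted above is easy to sanity-check with rough arithmetic. The flow rate and per-load figure below are assumptions (a typical kitchen faucet runs around 8 liters per minute, and a modern Energy Star machine uses very roughly 15 liters per load), not numbers from the article:

```python
# Rough water-use comparison, with assumed round numbers.
FAUCET_L_PER_MIN = 8.0         # typical kitchen faucet flow (assumed)
DISHWASHER_L_PER_LOAD = 15.0   # modern Energy Star machine (assumed)

def handwash_liters(minutes_running):
    """Water used hand washing with the faucet running this long."""
    return FAUCET_L_PER_MIN * minutes_running

# Two minutes of faucet time is already in the neighborhood of one
# machine load; ten minutes of running water is several loads' worth.
print(handwash_liters(2))
print(handwash_liters(10) / DISHWASHER_L_PER_LOAD)
```

The point is not the exact figures but the shape of the comparison: faucet use scales with every minute the tap runs, while the machine's cost is fixed per load.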

So, this is like cutting the ends off the roast. In the old days, you needed to wash your dishes before you washed your dishes. Now, you can just wash your dishes. But do you? Or are you still cutting the ends off the roast?

(It is unfortunate for the dogs that they lose in both cases.)

Mooney also talks about the drying element in dishwashers, and I have a word or two to say about that as well.

Consider the term “dishwasher safe.” In my household, everything is “dishwasher safe.” This is because I put everything in the dishwasher. If something is not dishwasher safe, it gets weeded out. Most things that are not dishwasher safe are subject to heat damage when the drying element comes on. I installed our present dishwasher about five years ago. The heating element has yet to come on. Well, it did by accident once and boy, did that smell bad. (If you don’t use the heating element, it tends to accumulate a layer of stuff that smells bad once you do turn it on.)

This is not to say that the only unsafe thing in a dishwasher, if you are a plate or a bowl or something, is the heating element. The water in a dishwasher is hot, and the chemicals are caustic. We have a number of coffee mugs that no longer say what they formerly said because the cheap printing process used to make them did not stand up to the slings and arrows of outrageous technology. Those coffee mugs that change on the outside when you put hot coffee in them? That works because of a layer of cheap plastic on the outside of the cup. My Doctor Who mug (where the Tardis disappears and reappears) lasted one day. I still have it, but it is a simple black mug with no evidence that the Doctor ever existed. And when I pop in “clean recyclables” like a peanut butter jar made of plastic, that stuff comes out distorted and half melted, but not really melted, and it isn’t a problem; it was on the way to the recycling bin anyway.

If you never turn on your heating element you will use a lot less electricity and many non-dishwasher safe items survive the dishwasher. I’m not making any promises, I’m just telling you what I do. Don’t worry, the dishes get dry. Modern dishwashers run some air through after the washing is finished on a full cycle, and if you open the door, physics, in the form of evaporation, will work very well.

This, of course, is a metaphor for many other things. Consider the culture of your use of technology. Do you let your car warm up for a long time on a cold winter morning? Do you leave it running when not actually driving because you heard it takes more energy to start it than to run it for a while? Do you leave fluorescent lights on in the office all day even when the rooms are empty because you heard that was more efficient? As usual, you are probably doing it all wrong. Not your fault, it is just how our brains, and our cultures, work. But you can change and help make a difference.

Ana is coming up the Atlantic

I remember joking with my friend Ana about how her name would be attached to the first named storm in the 2015 Atlantic Hurricane season. It turns out Ana is an exceptional individual. Both of them.

Ana Miller as Aisha Lefu in “The Recompense: A Star Wars Fan Film.”
Ana, my friend, is an actor and is currently engaged in a project I’ll be telling you more about later. But in the meantime, you can visit this page and find out about a new and very interesting Star Wars related crowd-funded production called The Recompense. Give them money.

Meanwhile, back in the Atlantic Ocean, Tropical Storm Ana has formed, nearly three weeks before the official start of the Atlantic Hurricane Season. A few days ago Ana was a disorganized disturbance (I’m talking about the storm here) and now Ana is a full on tropical storm tracking the very warm Gulf Stream. Winds are steady at 60 miles per hour, gusting to 70.

From the National Weather Service:

Deep convection has increased somewhat near the center of the storm, and SFMR observations from the Air Force Hurricane Hunters continue to support an intensity of 50 kt. Ana will be moving over the cooler waters to the northwest of the Gulf Stream later today, and water vapor imagery shows a belt of upper-level northerly flow advancing toward the tropical cyclone. The decreasing sea surface temperatures and increasing northerly shear should cause Ana to weaken as it nears the coast. The official intensity forecast is similar to that from the previous package, and very close to the latest intensity model consensus, IVCN.

Tropical Storm Ana, the first storm of the Atlantic Hurricane Season.
The initial motion estimate is 320/3. The track forecast reasoning remains basically unchanged from the past few advisories. Global models continue to predict that the blocking mid-level ridge to the north of Ana will shift eastward and weaken over the next couple of days. These models also show a broad trough moving from the central to the eastern U.S. over the next 72 hours or so. This should result in the cyclone turning northward and north-northeastward with a gradual increase in forward speed. The official track forecast is similar to the previous one and in good agreement with the latest dynamical model consensus, TVCN.

Hey, good news, the NWS is implementing the long-ago announced policy of GETTING RID OF ALL CAPS!!1!! Meanwhile, Ana the Storm is expected to strike the coast of South Carolina, and/or North Carolina, tonight. The storm, once over land, will turn northeast and make its way back out to sea off Delmarva, and eventually menace, a little, southern New England. The middle of the storm will probably be crossing the Carolina coast about 8:00 AM Sunday, and what is left of it will be re-joining the coast and the Atlantic early Monday.

Heretofore unknown deadly eddies flow across the Atlantic Ocean

Sometimes science sees something change – there is more of something, or less, or more importantly, there is a change in the rate of some phenomenon or in its pattern of variability. But sometimes science looks out there in the world and observes something that was probably there all along (though there may be changes in the past or future) but it just wasn’t noticed before.

There is a new study that describes and documents such a phenomenon. The thing we are talking about is over 100 kilometers across and several meters thick, moves at a few kilometers per day, and exists just below the surface of the ocean. It can potentially be detected in satellite data as a very small change in sea level. There are probably a lot of them, and they are continuously forming, moving across certain parts of the sea, and disappearing.

Oh, and they are Neptunic Grim Reapers. They are potentially fatal to any animal that ends up inside them, and they may explain previously observed mass die-offs around oceanic islands.

Is this some kind of fish? A red tide gone amok? Glowing blobs of nuclear waste spreading from Fukushima? No, but they do remind me of a 1960s era but still extant musical group that frequently toured with Frank Zappa.

The phenomenon is a dead zone that forms inside an eddy as the eddy flows across the ocean’s surface. An eddy is a vortex with a nonlinear pattern of kinetic energy, which causes properties that would normally mix freely across the water in and near the eddy to be broken into discrete areas. The eddy can have a very rich biota near its center, including organisms that respire, converting much of the dissolved Oxygen into CO2, but fresh dissolved Oxygen does not transfer across the outer boundary of the eddy. When this happens, the eddy loses most of its dissolved Oxygen and you get a big flat round traveling dead zone. The dead zone isn’t very thick, and the ocean layer above it has O2 because it is directly interacting with the atmosphere. But since the dead zone is caused by living, breathing organisms, it obviously exists in a three-dimensional space where living, breathing organisms (mostly plankton) would normally be. From the abstract of Open ocean dead zones in the tropical North Atlantic Ocean by Karstensen, Fiedler, Schütte, Brandt, Körtzinger, Fischer, Zantopp, Hahn, Visbeck, and Wallace:

Here we present first observations, from instrumentation installed on moorings and a float, of unexpectedly low … oxygen environments in the open waters of the tropical North Atlantic … The low-oxygen zones are created at shallow depth, just below the mixed layer, in the euphotic zone of cyclonic eddies and anticyclonic-modewater eddies. Both types of eddies are prone to high surface productivity. Net respiration rates for the eddies are found to be 3 to 5 times higher when compared with surrounding waters. Oxygen is lowest in the centre of the eddies, in a depth range where the swirl velocity, defining the transition between eddy and surroundings, has its maximum. It is assumed that the strong velocity at the outer rim of the eddies hampers the transport of properties across the eddies boundary and as such isolates their cores. This is supported by a remarkably stable hydrographic structure of the eddies core over periods of several months. The eddies propagate westward … from … the West African coast into the open ocean. High productivity and accompanying respiration, paired with sluggish exchange across the eddy boundary, create the “dead zone” inside the eddies, so far only reported for coastal areas or lakes. We observe a direct impact of the open ocean dead zones on the marine ecosystem as such that the diurnal vertical migration of zooplankton is suppressed inside the eddies.
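The mechanism in that abstract, high respiration inside a core that the rim velocity seals off, can be caricatured with a toy oxygen budget. All of the rates and the starting concentration below are invented for illustration; they are not values from the study:

```python
# Toy model: dissolved O2 inside an isolated eddy core versus
# surrounding water that is resupplied by mixing and the atmosphere.
# All numbers are illustrative, not taken from the paper.
def simulate_o2(days, respiration, resupply, o2_start=250.0):
    """March dissolved O2 (micromol/kg) forward one day at a time;
    concentration cannot drop below zero."""
    o2 = o2_start
    for _ in range(days):
        o2 = max(0.0, o2 - respiration + resupply)
    return o2

# Eddy core: respiration continues, but no exchange across the rim.
core = simulate_o2(days=60, respiration=3.0, resupply=0.0)
# Surrounding water: same respiration, but mixing replaces the loss.
outside = simulate_o2(days=60, respiration=3.0, resupply=3.0)

print(core, outside)  # the isolated core is drawn down; outside holds steady
```

The caricature captures why the paper's 3-to-5-fold respiration excess matters: with the boundary sealed, even a modest daily drawdown compounds over the months the eddy holds together.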

“The simple appearance of dead-zones in an ocean that typically has relative high oxygen concentrations is a local “extreme environment” – and extreme environments can always help us to better understand how for example ecosystems have reacted in the past, or how quickly they can develop strategies to adapt,” lead author Johannes Karstensen told me. “This is to us an exciting question for future research.”

I asked Karstensen why these eddies had not been discovered before. He said, “I think that open ocean dead-zones have been “overlooked” in the past.” He showed me a map of “all historical oxygen observation at 50 m depth (core of the dead-zone we observe) available in the year 2005 (but dating back to the early 1900). Large areas can be seen that virtually have no (or maybe 1) observation – easy to miss a dead-zone eddy with 100km diameter.” Here’s the map:

[Map: historical oxygen observations at 50 m depth in the tropical Atlantic]

There are an awful lot of dots on this map, but the spaces between them are huge. He added, “in the past oxygen was measured only a discrete depth and maybe at 24 points over a water depth of 4000m – say one data point every 200m – such a vertical resolution makes interpretation of the data challenging. Extreme anomalies resolved with one data point only have often be disregarded, as an ‘outlier.’ In fact, we also disregarded the data from the first observation in 2007 as an outlier and only after repeating observations considered the data to be real.” He also noted that this also happened with the eddies themselves. “before the satellite era it was not expected that the ocean is populated by eddies and that the ocean is such a “turbulent” place.”

Now, what about the death thing? I would like to know, ultimately, how much organic material is contributed to deeper water components of the carbon cycle, or deposited in the deep ocean, by these eddies ocean wide. Clearly more research has to be done to establish the distribution of the dead zone eddies, as well as variation in their occurrence across space and time. Meanwhile, there is the current but seemingly low-probability threat of a deadly eddy flowing into the near shore environments of oceanic islands. Since they are over 100 km across and move at a rate of a few km a day, such a dead zone could have a huge impact on the local marine ecology. From the paper:

Eddies were observed less than 100 km north of the Cabo Verde islands; thus a possible interaction of a dead-zone eddy with an island must be considered. Given the shallow depth of a few tens of metres where the lowest [dissolved oxygen] concentrations are found, a sudden flooding of coastal areas with [low oxygen] waters may occur. A dramatic impact on the local ecosystems and sudden fish or crustacean death may be the consequence. In retrospect, such eddy–island interactions may explain events that have been reported in the past.
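To get a rough sense of the scale involved (my back-of-the-envelope numbers, not the paper's): an eddy roughly 100 km across drifting at a few kilometres per day would take on the order of a month to pass a fixed point on a coastline, which is plenty of time to do damage.

```python
# Back-of-the-envelope estimate (illustrative numbers, not from the paper):
# how long would a ~100 km wide dead-zone eddy take to drift past a point?
eddy_diameter_km = 100       # approximate diameter reported for these eddies
drift_speed_km_per_day = 3   # "a few km a day" -- assumed midpoint

transit_days = eddy_diameter_km / drift_speed_km_per_day
print(f"Roughly {transit_days:.0f} days for the eddy to pass a fixed point")
```

On these assumed numbers that works out to about a month of low-oxygen water sitting over a stretch of coast.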

I look forward to more research into these eddies. Meanwhile, this should keep us entertained:

Bjorn Lomborg Pulled From Australian Consensus Centre Project

Over the last several weeks we’ve seen the University of Western Australia accept a $4 million Federal grant to develop a “Consensus Centre” in the mold of Bjorn Lomborg’s nonprofit, with Lomborg as a key player. Lomborg has been heavily criticized for his lack of scholarship and seemingly biased policy positions on climate change and related issues. There was heavy opposition in the Australian academic community to this project. Under pressure from peers and colleagues, Vice-Chancellor Paul Johnson has announced that the project is cancelled. In a message published on the UWA University News site, the Vice-Chancellor notes:

The mail drop in Massachusetts which apparently fronts for Lomborg’s Copenhagen Consensus Institute.

… it is with great regret and disappointment that I have formed the view that the events of the past few weeks places the Centre in an untenable position as it lacks the support needed across the University and the broader academic community to meet its contractual obligations and deliver value for money for Australian taxpayers.

I have today spoken to the Federal Government and Bjorn Lomborg advising them of the barriers that currently exist to the creation of the Centre and the University’s decision to cancel the contract and return the money to the government.

More here: University of Western Australia Cancels $4 Million Federal Government Contract For Bjorn Lomborg’s Consensus Center

The #FauxPause is Faux

The Earth’s climate is warming. The upper oceans are warming, sea surface temperatures are elevated, and the air in the lower Troposphere, where we live, is warming. This warming is caused almost entirely by the increase in human generated greenhouse gases and the positive (not positive in a good way) feedbacks that increase causes. The effects that increase the global heat imbalance (such as rising greenhouse gas concentrations) and the effects that decrease it (such as aerosols, i.e. dust, from volcanoes) vary over time, which causes some variation in the upward march of global surface and ocean temperatures. Meanwhile, heat moves back and forth between the surface (atmosphere and sea surface) and the deeper (but still upper) ocean, which causes either of these parts of the Earth system to wiggle up and down in temperature. There are other effects causing other wiggles.

See: The Ocean Is The Dog. Atmospheric Temperature Is The Tail

Every now and then the wiggling results in a seemingly rapid increase in temperature. Every now and then it results in a seeming slowdown in the rate of increase. When the latter happens, those who wish you to believe that climate change is not real jump up and down, giddy, point, and say, “See, there is a pause in global warming, therefore global warming is not happening!” They are wrong for the reasons just stated.

Dana Nuccitelli (author of this book), over at the Guardian, has written a blog post that nicely summarizes current scientific thinking on the so-called pause (or so-called hiatus), which witty climate scientists have dubbed the #FauxPause. Because it is faux.


The post is here, go read it: Pause needed in global warming optimism, new research shows

I’m pretty sure Dana and I were looking at the same information recently. While Dana focuses on the #FauxPause, I recently penned this: Global Warming Getting Worse

In Alberta, Pigs Do Fly

The Canadian Province of Alberta has been likened to the American State of Texas. Energy and cattle, energy barons and cowboys. But with mountains.

Yesterday a relatively liberal party, the New Democratic Party (NDP), won a surprise victory in the provincial election, ending the 44-year reign of the Progressive Conservatives. From an American point of view this is all very confusing, because the Canadian political system is very different. Alberta has a Premier, and the Premier will step down because of this election. The NDP previously never held many seats in the legislature, but now holds 55 out of 87, with the Progressive Conservatives ending up with an anemic 11.

This is relevant to topics often discussed here because Alberta is home to the famous Canadian Tar Sands, whose bitumen would be carried on the proposed Keystone XL Pipeline through the United States to points unknown. This raises two questions. First, did the left-leaning victory arise in part (small or large) from the fight over tar sands exploitation? Second, will this change in government influence the future exploitation of this relatively dirty source of Carbon-based fossil fuel?

People vote for a range of reasons. When a large and unexpected shift happens in American politics, it is more often than not (IMHO) because voters are upset with those in power and are “throwing the bums out.” I think it is much rarer to see a smaller coalition blossom into a majority over issues pushed by that coalition. Also, even though the NDP is left leaning, just how “left” (meaning, in the context of these major issues, Climate Hawkish) are they? All you Canadian Politics experts need to provide your analysis in the comments below. I’m especially interested in John Irving’s analysis. (John?)

It is said that this is like a Democratic sweep/Republican trounce in Texas. Is it? Will it last? Is this a game-changer, a sea change? Some other appropriate Canadian metaphor? (Ice-out? Turning of the maple leaf?)