Monthly Archives: February 2016

Republican Donors Might Run A Third Party Candidate

They even have a short list of candidates. Unfortunately, the only available copy of the secret internal report on running a third party candidate has the list blacked out (see above).

According to Scott Bland at Politico:

Conservative donors have engaged a major GOP consulting firm in Florida to research the feasibility of mounting a late, independent run for president amid growing fears that Donald Trump could win the Republican nomination.

“All this research has to happen before March 16, when inevitably Trump is the nominee, so that we have a plan in place,” a source familiar with the discussions said. March 16 is the day after the GOP primary in Florida…

The document, stamped “confidential,” was authored by staff at Data Targeting, a Republican firm based in Gainesville, Fla. The memo notes that “it is possible to mount an independent candidacy but [it] will require immediate action on the part of this core of key funding and strategic players.”

This, of course, would guarantee that both the Republican candidate, probably Trump, and the independent candidate, would lose. So, it is a kind of apoptosis.

Here’s the thing. Why would they do this? Why would the Republican Party build up a power base linked to a philosophy, then bail when the ultimate candidate, one who represents that philosophy of hate and fascism, finally comes to the fore?

One possibility is that the importance of corporate control of the President is the central guiding force for strategy. After all, we are talking about unspecified “donors.” Those donors are not concerned with the political philosophy of the candidate, just that the candidate be controlled.

It is interesting to compare this effect across the two parties. One could say that Clinton is more the corporate candidate and Sanders is not. But Democrats are not actually (despite pernicious rumors to the contrary) destroying Sanders or planning to put him down. A lot of Democrats, including many in power, like Sanders. But when an insurgency candidate (which, it seems, is defined as one who is not, or is less, bought and paid for) comes along in the Republican Party, the Programmed Party Death Button is seriously considered. The parties really are not the same.

I wouldn’t expect anything to come of this if it is a real effort to get a particular candidate to win. But if this really is an effort by the Republicans to put themselves down, then the chances of a third party run may be much higher, because it doesn’t have to work. It just has to break everything.

Whom Should I Vote For: Clinton or Sanders?

You may be asking yourself the same question, especially if, like me, you vote on Tuesday, March 1st.

For some of us, a related question is which of the two is likely to win the nomination.

If one of the two is highly likely to win the nomination, then it may be smart to vote for that candidate in order to add to the momentum effect and, frankly, to end the internecine fighting and eating of young within the party sooner. If, however, one of the two is only somewhat likely to win the nomination, and your preference is for the one slightly more likely to lose, then you better vote for the projected loser so they become the winner!

National polls of who is ahead have been unreliable, and relying on those polls also undermines the democratic process, so they should be considered but not used to drive one’s choice. However, a number of primaries have already happened, so there is some information from those contests to help estimate what might happen in the future. On the other hand, there have been only a few primaries so far. Making a choice based wholly or in part on who is likely to win is better left until after Super Tuesday, when there will be more data. But, circling back to the original question, that does not help those of us voting in two days, does it?

Let’s look at the primaries so far.

Overall, Sanders has done better than polls might have suggested weeks before the primaries started. This tells us that his insurgency is valid and should be paid attention to.

There has been a lot of talk about which candidate is electable vs. not, and about theoretical match-ups with Trump or other GOP candidates. If you look at ALL the match-ups, instead of the one cherry-picked match-up that a supporter of one candidate or the other might pick, both candidates do OK against the GOP. Also, such early theoretical match-ups are probably very unreliable. So, best to ignore them.

Iowa told us that the two candidates are roughly matched.

New Hampshire confirmed that the two candidates are roughly matched, given that Sanders has a partial “favorite son” effect going in the Granite State.

Nevada confirmed, again, that the two candidates are roughly matched, because the difference wasn’t great between the two.

So far, given those three races, in combination with exit polls, we can surmise that among White voters, the two candidates are roughly matched, but with Sanders doing better with younger voters, and Clinton doing better with older voters.

The good news for Sanders about younger voters is that he is bringing people into the process, which means more voters, and that is good. The bad news comes in two parts: 1) Younger voters are unreliable. They were supposed to elect Kerry, for example, but never showed up; and 2) Some (a small number, I hope) of Sanders’ younger voters claim that, if their candidate does not win, they will abandon the race or the Democrats, write in Sanders, vote for Trump, or do some other idiotic thing. So, if Clinton ends up being the nominee, thanks Bernie, but really, no thanks.

Then came South Carolina. Before South Carolina, we knew there were two likely outcomes down the road, starting with this first southern state. One was that expectations surrounding Clinton’s campaign would be confirmed, and she would do about 70-30 among African American voters, which in the end would give her a likely win in the primary. The other was that Sanders would close this ethnic gap, which, given his support among men and white voters, could allow him to win the primary.

What happened in South Carolina is that Clinton did way better than even those optimistic predictions suggested. This is not good for Sanders.

Some have claimed that South Carolina was an aberration. But that claim is being made only by Sanders supporters, and only after the fact. Also, the claim is largely bogus, because it suggests that Democratic, and especially African American Democratic, voters are somehow conservative southern yahoos, and that is why they voted so heavily in favor of Clinton. But really, there is no reason to suggest that Democratic African American voters aren’t reasonably well represented by South Carolina.

In addition to that, polling for other southern states conforms pretty closely to expectations based on the actual results for South Carolina.

I developed an ethnicity-based model for the Democratic primary (see this for an earlier version). The idea of the model is simple. Most of the variation we will ultimately observe among the states in voting patterns for the two candidates will be explained by the ethnic mix in each state. This is certainly an oversimplification, but it has a good chance of working because, before breaking out voters by ethnicity, we are subsetting them by party affiliation. So this is not how White, Black, and Hispanic people will vote across the states, but rather how White, Black, and Hispanic Democrats will vote across the states. I’m pretty confident that this is a useful model.

My model has two versions (chosen by me; there could be many other versions). One gives Sanders’ strategy a nod by having him do 10% better among white voters, while losing non-white voters only 60-40. The Clinton-favored version gives Clinton 50-50 among white voters and a strong advantage among African American voters, based on South Carolina’s results and polling, of 86-14. Clinton also has a small advantage among Hispanic voters (based mainly on polls), at 57-43.
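The scenario numbers above can be sketched as a small computation. In this sketch, the state’s electorate mix is hypothetical, and the “Sanders does 10% better among white voters” assumption is read as a 55-45 Sanders split among whites; neither figure comes from the actual spreadsheet behind the post.

```python
# Sketch of the two-scenario ethnicity-weighted model described above.
# Support shares are Clinton's fraction of the two-way vote in each group.
SCENARIOS = {
    "clinton_favored": {"white": 0.50, "black": 0.86, "hispanic": 0.57},
    # Assumes "Sanders 10% better among whites" means a 55-45 Sanders split,
    # with Clinton holding non-white groups 60-40.
    "sanders_favored": {"white": 0.45, "black": 0.60, "hispanic": 0.60},
}

def clinton_share(electorate_mix, scenario):
    """Weighted Clinton share, given a state's Democratic electorate mix."""
    support = SCENARIOS[scenario]
    return sum(frac * support[group] for group, frac in electorate_mix.items())

# Hypothetical state: 55% white, 30% black, 15% Hispanic Democrats.
mix = {"white": 0.55, "black": 0.30, "hispanic": 0.15}
print(clinton_share(mix, "clinton_favored"))  # weighted Clinton share
print(clinton_share(mix, "sanders_favored"))
```

The point of the sketch is that the entire state-by-state projection reduces to one weighted average per state per scenario.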

These are the numbers I’ve settled on today, after South Carolina. But, I will adjust these numbers after Super Tuesday, and at that point, I’ll have some real confidence in the model. But, at the moment, the model seems to be potentially useful, and I’ll be happy to tell you why.

First, let us dispose of some of the circular logic. Given both polls and South Carolina’s results, the model, based partly on South Carolina, predicts South Carolina pretty well using the Clinton-favored version (not the Sanders-favored version), with a predicted cf. actual outcome of 34:19 cf. 39:14. This is obviously not an independent prediction, but rather a calibration. The Sanders-favored model predicts a nearly even outcome of 27:26.

The following table shows the likely results for the Clinton-favored and Sanders-favored model in each state having a primary on Tuesday.
[Table: projected delegate counts for each Super Tuesday state under the Clinton-favored and Sanders-favored models, with polling-based estimates in the two right-hand columns]
The two columns on the right are estimates from polling where available. This is highly variable in quality and should be used cautiously. I highlighted the Clinton- or Sanders-favored model that most closely matches the polling. The matches are generally very close. This strongly suggests that the Clinton-favored version of the model essentially works, even given the limited information, and simplicity of the model.

Please note that in both the Clinton- and Sanders-favored model, Clinton wins the day on Tuesday, but only barely for the Sanders-favored model (note that territories are not considered here).

I applied the same model over the entire primary season (states only) to produce two graphs, shown below.

The Clinton-favored model has Clinton pulling ahead in committed delegates (I ignore Super Delegates, who are not committed) on Tuesday, then widening her lead over time and winning handily. The Sanders-favored model projects a horserace, in which the two candidates are ridiculously close for the entire election.


So, who am I going to vote for?

I like both candidates. The current model suggests I should vote for Clinton because she is going to pull ahead, and it is better to vote for the likely winner, since I like them both, so that person gets more momentum (a tiny fraction of momentum, given one vote, but still…). On the other hand, a Sanders insurgency would be revolutionary and change the world in interesting ways, and for that to happen, Sanders needs as many votes on Tuesday as possible.

It is quite possible, then, that I’ll vote for Sanders, then work hard for Hillary if Super Tuesday confirms the Clinton favored model. That is how I am leaning now, having made that decision while typing the first few words of this very paragraph.

Or I could change my mind.

Either way, I want to see people stop being so mean to the candidate they are not supporting. That is only going to hurt, and become something you regret, if your candidate is not the chosen one. Also, you are annoying the heck out of everyone else. So just stop, OK?

Current Status of California Drought, and other matters: Interview with Peter Gleick

The latest episode of Ikonokast, the science podcast Mike Haubrich and I do, is now up. This is an interview with the Pacific Institute’s Peter Gleick. We talk about the California drought (past, present, and future), Syria, virtual water, El Niño, and climate science denialism.


Are we witnessing an Arctic Sea meltdown, right now?

The Arctic Sea freezes over. The Arctic Sea melts. This happens every year. The average date for the maximum extent of Arctic Sea ice, based on the period 1981-2010, is March 12. The minimum extent is reached, on average, about September 15th.

Every year for the last several years, the minimum ice has been much lower than average in extent, and many years in a row have seen record minima. This is considered to be the result of global surface warming caused by human release of greenhouse gas pollution.

It is said that we can’t use the maximum ice cover to predict the minimum ice cover very accurately, because a lot of things can happen to affect the total ice cover during those many months of melting. However, the maximum ice amount for, say, 1979-1988 (the first ten years for which we have really good data on this) was high compared to the last ten year period, and correspondingly, the minimum extent was greater for that first ten year period than the most recent ten years, so there is a correlation. Still, the date of the maximum extent has tended to not move around much, and the same is true for the date of the minimum extent.

But maybe not this year. This year’s maximum Arctic Sea ice extent seems to have flatlined at a record low value, as shown in the graph above, from here. The current sea ice extent is that red line all by itself down near the bottom.

It may well be the case that the sea ice will start to re-freeze, and this line will go up again over the next two weeks or so, and max out near the historical average. The next week or so should be below freezing across much of the Arctic Sea, but there is a warm intrusion near Greenland and Europe, with above freezing air, expected to persist for that entire time. Overall, warm air and ice-breaking-up storms have invaded the Arctic repeatedly this winter. The sea ice extent may recover over the next several days, but I get the impression that most experts are quietly thinking it won’t.

This is not terribly surprising, given that the Earth’s surface temperatures are increasing, and sea ice is decreasing. This year, an El Niño is adding fuel to the fire, as it were, and making these conditions even more extreme.

A concerning possible outcome is this: The Arctic Sea ice helps cool the planet by reflecting away sunlight. It is a reasonable assumption that during summers with much less ice, there is much less cooling. This can have impacts on the longer-lived fast ice* that is also melting in the Arctic, on nearby glaciers in Greenland, and on the planet overall. This is what is known as a “positive feedback,” which is a somewhat misleading term, because this is not an especially “positive” event.

*CORRECTION: My friend and colleague Tenney Naumer, who watches both the weather and the Arctic very closely, contacted me to let me know that the “fast ice” is long gone. She told me, “In 2012, the ice in the channel between Ellesmere and Axel Heiberg Island (to the south of Ellesmere) melted out — that is the place where the ice had existed for more than 10,000 years. The Ward Hunt Ice Shelf broke off in 2002. The Ayles Ice Shelf broke off in 2005. In 2007, I watched (here) the ice break away from most of the Arctic side of the archipelago, and it has been all downhill since. It’s all gone now.”

The chilling effect of concealed carry law on the Texas classroom

Texas has adopted a law that allows students to bring a handgun to class, or to meetings with professors.

As a response to this policy, the president of the Faculty Senate, Jonathan Snow, gathered a group of faculty and gave a PowerPoint presentation that included the slide at the top of the post.

Snow’s presentation was not any sort of official university statement, but the slide does a good job of demonstrating the likely effect on faculty-student relationships under conditions where students are more likely to pull out a handgun and plug the professor.

The situation, and the context for this presentation, are written up in this post at the Chronicle of Higher Education. PZ Myers discusses it here. The PowerPoint presentation is available here.

A suitable response faculty may consider is here.

Who Will Win The Next Several Primaries: Clinton or Sanders?

I recently developed a model of how the primary race will play out between Democratic presidential hopefuls Hillary Clinton and Bernie Sanders.

That model made certain assumptions, and allowed me to produce two projections (well, many, but I picked two) depending on how each candidate actually fares with different ethnic groups (White, Black, Hispanic, since those are the groupings typically used).

The two different versions of this model were designed to favor each candidate differently. The Clinton-favored model started with the basic assumption that among white Democratic Party voters, both candidates are similar, and that Clinton has a strong lead among Hispanic voters and an even stronger lead among African American voters. The Sanders-favored model assumes that Sanders has a stronger position among White voters and less of a disadvantage among non-White voters.

The logic behind the equivalence among White voters is that this is how the two candidates did in Iowa, which is representative of the United States White vote, unadulterated by the favorite son effect seen in New Hampshire. Nevada gave no indication that this assumption should be changed.

The favoring of Clinton among non-White voters is based on national polling with respect to ethnic effects. The logic behind the Sanders-favored version is that Sanders’ strategy, to win, has to involve a large young, white, male turnout (evidenced in the polls) and a narrowing of the gap among African American and Hispanic voters.

In that model, presented here, I used statewide demographic data to establish the ethnic term. However, that is incorrect, because one’s chances of engaging in the Republican vs. Democratic process in one’s state are tied to ethnicity. More Whites are Republicans; more Blacks are Democrats. I knew that at the time I worked out the model, but sloth and laziness, combined with lack of time, caused me to simplify.

The newer version of the model adjusts for likely Democratic Party membership. The results are the same but less dramatic, with a much longer slog to the finish line and the two candidates doing about the same as each other for the entire primary season.
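The adjustment just described (reweighting a state’s overall ethnic mix by likely Democratic affiliation) might look something like the following. The affiliation rates and the census mix here are illustrative placeholders, not the figures actually used in the model.

```python
# Sketch: convert a statewide ethnic mix into an estimated mix of the
# Democratic-primary electorate. All numbers below are hypothetical.
AFFILIATION = {"white": 0.40, "black": 0.80, "hispanic": 0.60}  # P(Democrat | group)

def democratic_mix(state_mix):
    """Reweight a state's overall ethnic mix by Democratic affiliation,
    then renormalize so the shares sum to 1."""
    weighted = {g: frac * AFFILIATION[g] for g, frac in state_mix.items()}
    total = sum(weighted.values())
    return {g: w / total for g, w in weighted.items()}

state = {"white": 0.70, "black": 0.20, "hispanic": 0.10}  # hypothetical census mix
dem = democratic_mix(state)
# Note that the Black share of the Democratic electorate comes out larger
# than the statewide Black share, which is the whole point of the adjustment.
print(dem)
```

This is why the adjusted model is "the same but less dramatic": reweighting shrinks the White share of the Democratic electorate, pulling the two scenarios closer together in most states.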

The outcome of my modeling (reflected in the non-adjusted and adjusted versions, each with a Clinton- and Sanders-favored version) is different from the expectations of either campaign, as far as I can tell. Clinton boosters are claiming that the Democratic Party is mainly behind her, and these first primaries are aberrant. Sanders boosters are claiming the Sanders strategy of having a surge of support will carry him to victory. Both of these characterizations require that each candidate surge ahead pretty soon and then not look back. The opportunity to surge ahead is, certainly, Super Tuesday (March 1st).

The models I produced, with the assumptions listed above, show a close race all along, so either the campaigns are wrong or I am wrong.

The graphic at the top of the post represents how far ahead each candidate will be across the primary season, for each of their respective favored strategies.

So for Clinton, the ethnic gap is maintained as wide, and the blue line shows that she will surge nearly 40 committed delegates ahead of Sanders (a modest surge) and continue to develop a wider and wider gap past mid-March, and thereafter, maintain but not increase that gap, of about 80 committed delegates, until the end.

For Sanders (the orange line), the gap formed on Super Tuesday does not start out very large, but it steadily increases until the end of the primary season, ending at over 120 committed delegates.

So, that is the new model. But, it is a bogus model.

I’m trying to stick with empirical data that do not rely on polling. Why? Because everybody else is relying on polling, and this is an election season where the polling is not doing a good job of predicting outcomes. Also, my modeling gives credit to each campaign’s claims, which is at least interesting, if not valid, as a way of approaching this problem. If Clinton is right, she wins this way. If Sanders is right, he wins that way.

However, the data are insufficient to have much faith in this model. Super Tuesday will provide a lot more information, and with that information I can rework the model and have some confidence in it.

Who will win the South Carolina Primary, Clinton or Sanders?

While working this out, I naturally came up with predictions for what will happen in all of the future primaries. So let’s look at some of that.

In South Carolina, according to my model, if Clinton’s strategy holds, she will win 29 delegates, and Sanders will win 24 delegates. If the Sanders strategy pertains, they will tie, or possibly, Clinton will win one more delegate than Sanders.
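The delegate arithmetic here can be sketched with a toy proportional split. The 0.55 Clinton share is an assumption for illustration, and the simple statewide rounding is a simplification (actual Democratic rules allocate most delegates by district, with a 15% viability threshold).

```python
# Toy proportional delegate split. Assumes simple statewide proportional
# allocation, ignoring district-level allocation and the 15% threshold.

def split_delegates(total, clinton_share):
    """Split a state's pledged delegates between the two candidates."""
    clinton = round(total * clinton_share)
    return clinton, total - clinton

# South Carolina had 53 pledged delegates; an assumed 55% Clinton share
# reproduces the 29-24 split the Clinton-favored model predicts.
print(split_delegates(53, 0.55))  # -> (29, 24)
```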

Who will win the Super Tuesday primaries?

The following table shows the results predicted by this model, for both the Clinton-favored and Sanders-favored versions, for all the Super Tuesday state primaries or caucuses.


The Clinton-favored model suggests that Clinton will win six out of 11 primaries, and take the majority of committed delegates. The Sanders-favored model suggests that Sanders will take 9 out of 11 primaries, and win the majority of committed delegates.

Notice that I put Vermont in italics, because Sanders is likely to win big in Vermont no matter what happens. This underscores the nature of this model in an important way. I’m not using any data from the actual states, other than the ethnic mix from census data, with an adjustment applied to produce an estimate of Democratic Party membership across ethnic groups. That estimate is based on national data as well as data specifically from Virginia, to provide some empirical basis.

I suspect most people will have two responses to this table. First, they will say that a model that incorporates Clinton’s strategic expectations should have her winning more. Second, they will say that all the numbers, for all states and all models, are too close.

These are both legitimate complaints about my model, and they may explain why it will turn out to be totally wrong. Or, they are suppositions that are totally wrong, and when my model turns out to be uncannily accurate, those suppositions will have to be put aside for the rest of the primary season. (Or, some other outcome happens.)

I will restate this: I’m looking for Super Tuesday to provide the best empirical data to make this model work for the rest of the primary season. But, in the meantime, this seemed like an interesting result to let you know about.

What is the “pause” in global warming?

A new paper (commentary) on the so-called “pause in global warming” puts it all together.

First let’s establish this as a starting point. When climate science contrarians refer to a “pause” or “hiatus” in global warming, they usually mean that the process of warming of the Earth’s surface caused by the human release of greenhouse gas is not a thing. They are usually implying, or overtly claiming, that the link between CO2 and other greenhouse gas pollutants and surface warming was never there to begin with, and previous warming, warming before “the pause,” was natural variation. Many even go so far as to claim that the Earth’s surface temperature will go down to levels seen decades ago.

“The Pause” is not, in their minds, a slowdown in the rate of warming. It is a disconnect, either there all along or produced somehow recently, between the physics of the greenhouse effect and reality.

Also, the evidence adduced for this “pause” is often bogus. Sometimes badly calibrated satellite data is used to show a flat Earth surface. Er, flat Earth surface temperature. Sometimes a line is drawn from an unusually warm, even under conditions of global warming, El Niño year, to later years, lately excluding record breaking recent warm years, in order to make the line flatter. So, that part of the denier pause is a different kind of lie. See this post and this post for recent research on this issue.

Having said all that, there have been frequent slowdowns and speed-ups in the rate of the planet’s surface warming throughout the entire instrumental record. (Even though the instrumental record begins in 1850 or 1880, depending on which data set you use, greenhouse gas pollution started before that, so some greenhouse warming has been happening all along).

Prior to some date, 1970 perhaps, the up and down variations in surface temperature have been a combination of natural variation and human-caused variation, with both being strong factors. The human-caused variation includes particulate pollution (mainly from burning coal), which pushes the temperature down, and greenhouse gas release and its associated effects, which push the temperature up.

For the last third or so of the 20th century, through the present, while both natural and human-caused effects matter, the role of human effects has increased to be the dominant force in the overall trend. The natural variations continue to contribute to the shape of the curve, but this contribution is attenuated by the increased abundance of human generated greenhouse gas.

For the last few years, we have seen several research projects that look at the “pause.” Many of these projects helped to explain the slowdown by showing that it wasn’t really as much of a slowdown as previously thought. For example, some research showed that the surface warming in recent decades has been under-measured because the Arctic (and probably the interior of Africa) were getting warmer faster, compared to other regions, and those areas were under-sampled by the usual data sets. Also, heat has been moving in and out of the ocean all along, and that has had an effect on the surface temperatures.

But even after accounting for all of these effects, there is still a slowdown.

John C. Fyfe, Gerald A. Meehl, Matthew H. England, Michael E. Mann, Benjamin D. Santer, Gregory M. Flato, Ed Hawkins, Nathan P. Gillett, Shang-Ping Xie, Yu Kosaka and Neil C. Swart have just published a commentary in Nature Climate Change called “Making sense of the early-2000s warming slowdown” that looks at what caused this partial flattening out of the upward trend in global surface temperatures.

Part of this investigation compares the earlier part of the 20th century, when there was a much more significant slowdown in warming, with the more recent slowdown. Fyfe et al. note that there are two major contributors to variation in surface temperature aside from greenhouse gases. One is the abundance of aerosols, such as industrial pollution (more of a factor during the earlier hiatus) and the output of volcanoes (such as the 1991 eruption of Mount Pinatubo). The other is multi-decade scale variation in the interaction between the oceans and the atmosphere. The following figure compares the two periods of reduced rate of warming.

[Figure: departure of observed surface temperatures from expected greenhouse-driven warming during the early-20th-century and early-2000s slowdown periods]

As noted in the caption, the two periods are representations of how far off from expected (based on simple greenhouse warming) each period is. It happens that these two periods of slowdown in rate of warming are associated with the negative phase of a major ocean-atmosphere interaction, during which the ocean was eating up some of the extra heat, removing it from the atmosphere. The intervening period of increased rate of warming (from the mid 1970s to about 2000) is associated with a period when this system was in positive phase, putting heat out into the atmosphere. As I’ve noted before, the ocean, which takes up most of the global warming caused heat, is the dog, and the atmosphere is the tail. This is a graph of a dog wagging its tail.

It is not clear when the multi-decade scale ocean-atmosphere interactions will shift to a positive phase. If you look at just the raw numbers, it seems like this may have started a few years ago (around late 2013), but the index for this phenomenon varies enough (goes positive and negative and back over shorter time periods, briefly) that this is not certain. More recently, we have an El Niño causing the belching of heat from the ocean to the air, heating up the surface. This may or may not be related to the multi-decade pattern. Having said all that, we may be concerned that over the next ten years or so, starting about now, we will be in a positive phase during which the rate of warming will be accelerated. This may not be the case. Or it might be the case. No one is actually betting on this yet.

New Research On The Rising Sea

Human caused greenhouse gas pollution has locked us into a situation where the global sea level will rise, at an unknown rate, high enough to inundate most major coastal cities and vast areas of agricultural land in low-lying countries, and wipe out thousands of islands. Entire countries (small, low-lying ones, and Pacific Ocean nations) will either disappear entirely or be made very small. Even as we head towards a likely limit in global food production in relation to increasing demand, large productive agricultural areas will be destroyed. As far as I can tell, there is nothing to stop this from happening, though reducing our greenhouse gas pollution to zero over the next several decades may prevent the global ocean from rising to its absolute maximum.

So sea level rise is important.

The surface of the Earth comes in two forms: Ocean bottom and continent. They are totally different geologically, with the ocean bottom consisting of relatively heavy basaltic rock formed at the margins between spreading plates, and continents of lighter rock, generally formed from below.

The global ocean sits mainly on the oceanic plates, but at its edges (except in a very few special locations), it rests against those continents. Over time the sea rises and falls. When the sea is at its lowest point, with a good amount of its volume reduced because it is trapped in glacial ice, most of the continents are exposed. When the sea is at its highest point, vast areas of the continental margins are inundated. At present, the ocean is pretty high, covering much of the continental margin that it ever covers, but there is room to grow, with large areas of the coastline subject to future inundation.

Rising surface temperatures caused directly or indirectly by human release of greenhouse gas pollution melt glaciers and warm the ocean, both of which are causing the global sea level to rise. This is a long and complicated process. We add greenhouse gas, mainly CO2, to the atmosphere, and this causes warming, enhanced by various positive feedbacks that either cause an increase of additional greenhouse gases such as water vapor, methane, and more CO2, or reduce the ability of certain natural systems to absorb these gasses. The greenhouse gas causes warming, which causes more greenhouse gas, which causes more warming. Meanwhile, most of this extra heat is actually trapped in the ocean where it only contributes a little to melting glaciers, but does contribute to expanding the volume of the ocean. The ultimate amount of heating, and the ultimate amount of sea level rise, takes a long time to be realized, and the rate of this change is only roughly estimated.

What we have already done to the atmosphere will cause sea level rise to continue for a very long time, possibly many centuries, possibly thousands of years. We have increased the amount of CO2 in the atmosphere from the mid 200s parts per million (ppm) to 400ppm, and we expect that increase to continue for decades. Evidence from the past, through the science of paleoclimatology, tells us that when the atmosphere holds between 400ppm and 500ppm of CO2, the global sea level is many meters above the present level.

Understanding sea level change is therefore critically important to understanding the impacts of climate change. We can measure current sea level rise and assume that steady increase over time (even if it is a bit variable) is mostly caused by global warming, heating the ocean and melting glacial ice. But there are problems with these measurements and associated estimates. Recent research has shown that Antarctica, which holds most of the world’s ice, is or could or will contribute a very large amount of water to the sea. But other recent studies show that some of the expected reduction in glacial size might not be happening at the rate previously estimated. At the moment, sea level is rising at a certain rate, and some research explains a good amount of that increase from melting ice, but other research takes that melting ice out of the equation and leaves that portion of the sea level rise unexplained, for now.

Past sea level change (up or down), prior to the industrial revolution when we started releasing all this greenhouse gas pollution, should give us a baseline against which to assess modern day measurements, and is an essential part of the process of understanding this critically important system. But it is difficult to measure sea level, at present or in the past. We can measure the current position of the sea at a given part of the continental margin by just going there and measuring it. Sea level over recent decades, going back in some places a few centuries, can be estimated using tide gauge records. We can sink cores (or trenches) in relatively protected areas (such as behind barrier islands) and find organic material that would have been formed just below the surface of the sea, then measure its elevation and date it, to give an estimate of sea level in the past. We can put the tide gauge data and the coring data together and get a rough estimate of sea level change.

But that estimate is not just rough, but almost useless, without a lot of careful further study. As the organic material representing older sea levels is buried by later organic material or other sediment, it tends to be compressed and to sink in elevation. The study of this process is many decades old, and it can be adjusted for, but it is complicated. The actual sea level at a given point along the coast depends partly on how big the ocean is at the moment (obviously) but also on the position and strength of major currents. At present, as at many times in the past, the North Atlantic Ocean is bunched up way out at sea because of the movement of currents. This lowers the sea level along the coast in many areas. But if these currents move or change in strength, this effect changes, and the coastal sea level goes up or down independently of the global sea level.

Wherever there were large glaciers, the land has been pushed down by the weight of the ice. After the glaciers melt away, the land rebounds. Where this happens along the coast, estimating global sea level from local sea level becomes quite complicated. Meanwhile, at the outer edge of the glacial mass, the land is actually pushed up to compensate for the depression caused by the massive glaciers. This is called “forebulge.” Forebulge makes the sea level look lower than it should, until the forebulge reduces and flattens out. Indeed, the rebound effects of the enormous glaciers in Canada are still happening, changing the position of the shoreline of Hudson Bay fast enough that cabins built on the shore a century ago are now a long walk from the sea.

This is all manageable, and people have been working on collecting these data, and figuring out how to use them, since the 1960s. But now, this week, what may be the first research project to put most of these data together and provide a pretty good estimate of sea level variation over the last 3,000 years has been published.

The key result from this paper is the graph at the top of this post.

Robert E. Kopp, Andrew C. Kemp, Klaus Bittermann, Benjamin P. Horton, Jeffrey P. Donnelly, W. Roland Gehrels, Carling C. Hay, Jerry X. Mitrovica, Eric D. Morrow, and Stefan Rahmstorf’s paper, “Temperature-driven global sea-level variability in the Common Era” (PNAS) does this:

We present the first, to our knowledge, estimate of global sea-level (GSL) change over the last ~3,000 years that is based upon statistical synthesis of a global database of regional sea-level reconstructions. GSL varied by ~±8 cm over the pre-Industrial Common Era, with a notable decline over 1000–1400 CE coinciding with ~0.2 °C of global cooling. The 20th century rise was extremely likely faster than during any of the 27 previous centuries. Semiempirical modeling indicates that, without global warming, GSL in the 20th century very likely would have risen by between −3 cm and +7 cm, rather than the ~14 cm observed. Semiempirical 21st century projections largely reconcile differences between Intergovernmental Panel on Climate Change projections and semiempirical models.

So now we have a much better idea of the nature of global sea level rise for a couple thousand years prior to human greenhouse gas pollution, and we have a firm demonstration of the effects of this pollution on sea level over the last century or so.

We are fortunate that one of the authors of this paper, Stefan Rahmstorf, is a blogger at Real Climate, where he wrote this post summarizing the original paper (though the original paper, linked to above, is pretty readable!).

Climate Central produced this graphic based on the paper:


Of this, Rahmstorf says, “The fact that the rise in the 20th century is so large is a logical physical consequence of man-made global warming. This is melting continental ice and thus adds extra water to the oceans. In addition, as the sea water warms up it expands.”

How much will sea level rise by the end of the century?

In his post, Rahmstorf brings in a second study on sea level rise, also just published (see the RC post for more details). That research attempts to estimate the amount of sea level rise expected by 2100. There are four separate studies, each using three different (RCP) assumptions about future human caused climate change, and each combination of study and model provides a range. In centimeters, the lowest numbers are around 25 (close to the amount that has already happened over the last century) and the highest numbers are around 130–150 (so, up to about five feet).
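As a quick sanity check on the unit conversion in that last parenthetical, here is a minimal sketch using the round numbers quoted above (25 cm and 150 cm):

```python
# Convert the quoted 2100 sea level rise projections from centimeters to feet.
CM_PER_FOOT = 30.48  # 12 inches * 2.54 cm per inch

def cm_to_feet(cm):
    """Convert a length in centimeters to feet."""
    return cm / CM_PER_FOOT

for label, cm in [("low end", 25), ("high end", 150)]:
    print(f"{label}: {cm} cm is about {cm_to_feet(cm):.1f} feet")
```

The high end works out to about 4.9 feet, hence “up to about five feet.”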

Rahmstorf appears to agree with my thinking on this, which is that these estimates don’t account for catastrophic deterioration of ice sheets and subsequent increase in melting, if such a thing results from what appears to be increasing instability of some of those glacial features. For example, huge parts of the Antarctic ice sheet are in the form of vast glacial rivers pinned in place by a precarious “grounding” of ice on rock near the mouth of those rivers.

If that grounding falls apart, the entire river can start to march to the sea very quickly, establishing a new grounding line upstream. It is possible that such a new grounding line would be way upstream. As all that ice falls into the sea, it would likely expose high vertical cliffs that would then produce icebergs at a very high rate for many years. There may be other features currently deep under the ice that would be exposed, such as pre-melted water near warm spots. In other words, the drainage of meltwater would not be made less efficient by such a collapse, but rather more efficient, regionally and for a certain period of time. The point is, the impact of such events on the rate of glacial melt is pretty much unknown and very difficult to estimate.

Rahmstorf notes, “The projections on the basis of very different data and models thus yield very similar results, which speaks for their robustness. With one important caveat, however: the possibility of ice sheet instability, which for many years has been hanging like a shadow over all sea-level projections. While we have a pretty good handle on melting at the surface of the ice, the physics of the sliding of ice into the ocean is not fully understood and may still bring surprises. I consider it possible that in this way the two big ice sheets may contribute more sea-level rise by 2100 than suggested by the upper end of the ranges estimated by Mengel et al. for the solid ice discharge, which is 15 cm from Greenland and 19 cm from Antarctica. (The biggest contributions to their 131 cm upper end are 52 cm from Greenland surface melt and 45 cm from thermal expansion of ocean water.)”
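The component numbers Rahmstorf quotes can be checked against the 131 cm upper end. Assuming those four named contributions make up essentially the whole budget (the quote only calls out the biggest ones, so the completeness of this list is my inference):

```python
# Sum the quoted upper-end contributions from Mengel et al. and compare
# with the 131 cm total mentioned in the quote.
components_cm = {
    "Greenland surface melt": 52,
    "thermal expansion of ocean water": 45,
    "Greenland solid ice discharge": 15,
    "Antarctica solid ice discharge": 19,
}
total_cm = sum(components_cm.values())
print(total_cm)  # 131
```

The four figures do in fact sum to 131 cm.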

He backs this up by reference to other recent studies showing that ice sheets have in the past broken up at surprisingly high rates.

One and a half meters, or five feet, of sea level rise within the lifetime of those born today is possible. Half of that is extremely likely. Double that may even be a possibility. This is expected to continue for centuries, even millennia, or until all the ice melts, whichever comes first.

How many things in your life originate from something that happened in the past? The invention of agriculture (which happened many times, from about 10,000 to 4,000 years ago), the invention of writing (again, multiple times, thousands of years ago), the modern western system of government and law (depending on where you live, the Magna Carta or the US Constitution), hundreds of years ago. If you are religious, it is likely that your religion’s roots are thousands of years old. The establishment of property rights, water rights, all of that.

If human civilization exists, with some continuity with the present, 1,000 years from now, such a list will include the release of fossil carbon in the form of greenhouse gasses by the people of the 19th, 20th, and 21st centuries. That was the event that caused the sea to rise and engulf so much of the fertile land, causing a major (if possibly slow moving) exodus of most of the settled people of the world. In a thousand years, after we’ve either stopped using fossil fuel, or didn’t but just used it all up, people will still be measuring the rise of the sea that we are causing right now.

I don’t think they will be thanking us.

Putting the “Ex” in “Exxon”: AGU asked to dump big oil sponsorship

It is all about the honest conversation. And the dishonest conversation.

Corporate Funding of the Research Endeavor: Good

Corporations have an interest in research. They use this research for profit or to minimize liability. Some corporations have their own researchers, some provide grants to scientists to conduct research, and some fund activities that might not be thought of as research, but really are. For example, the publication fees for peer reviewed journals, funds to pay for scientists to attend conferences, and funds to support a scientific conference all pay for an important part of the research endeavor.

It is not always the case that a conflict of interest arises when a corporation pays for research. In a former life, I was an administrator for a moderately sized research funding entity. We had “member” companies that paid annual dues that were rather high. In return for those dues, we provided experts who would show up and give talks. This was a total rip-off to the companies, because they also had to pay for the travel costs of the experts, but that is not why they contributed. These were Japanese companies, and the experts were all economists. The point was to distribute the money to young scholars — graduate students, post docs, and junior faculty — for whatever research projects they needed money for. The projects had to be real research, but they did not have to be on anything in particular. The results were generally put into a free and open access publication series (along with other research) and we would ship off copies of the publication to all the member companies. Nobody was paying anybody to produce any particular result, but the research was sometimes (though often not) valuable to those companies. For example, some Japanese companies, including at least one that paid us dues, had developed a great new way to manage warehousing of parts. It saved money and reduced waste. One of the research projects we funded looked at that system, compared it to other systems, and recommended how it might be applied elsewhere. In another project, one of the first studies to ever look at putting some kind of price on carbon was carried out. None of the companies that funded this research had any interest, for or against, in this concept.

In the old days, AT&T funded Bell Labs. It still exists today, and I have no idea how it works now. I’m told by people who worked there back in the mid 20th century that it was a place where funding came in from the mother company to allow scientists to do more or less what they wanted to. Numerous important inventions that we use today came out of Bell Labs, and the people who worked there even won a bunch of Nobel Prizes. That was probably another example of industry funding research for the purpose of finding out new stuff, and little or no nefarious intent was attached.

Conferences are typically funded by a combination of grants from institutions (like the National Science Foundation, etc.), conference fees (which can be rather hefty) charged to participants, and grants from interested commercial parties. For example, a company that makes microscopes might kick in some money for a biology conference. They may also be represented in the part of the conference where private companies (or institutions with a product) can set up booths (that they pay for), like a trade conference.

Those private companies may well have an interest in the outcome of the research being performed by the various scientists who attend the conference. Maybe they want to sell the scientists a gadget to use in their lab. Maybe they want to use the research to advance their corporate mission, such as better ways to produce or deliver a product. Most of the time they probably just want people to like them, or to recognize their names.

So far, there is not much wrong with that, either.

Corporate Funding of the Research Endeavor: Bad

But sometimes private corporations have a different kind of interest. They don’t just want to get more information and knowledge about the areas where science overlaps with their corporate mission. They don’t just want to be seriously considered as a source for some matériel or equipment that scientists use. What some corporations want to do, sometimes, is to influence the outcome of scientific research, for their own interests, in ways that require that the science itself be adulterated in some substantial way. They want to see the dissemination of results that may be bogus but that serve their financial interests, or they may want to repress results that would lead policy makers, legislatures, the public, or the scientific community to criticize, eschew, or even stop one or more of their profitable activities.

This is a sufficiently important problem that one of the largest (possibly the largest, depending on how one defines things) scientific organizations related to the study of Planet Earth, the American Geophysical Union (AGU), has a policy about this. As part of their “organizational support policy,” the AGU says,

AGU will not accept funding from organizational partners that promote and/or disseminate misinformation of science, or that fund organizations that publicly promote misinformation of science.

Organizational partners are defined as those that make an annual financial commitment to AGU
of $5,000 or more.

Why not accept the money? Doesn’t it make sense to take the money and then have lots of money and stuff, and ignore the wishes of potentially nefarious actors in this game?

I knew a guy once, only barely (a friend of the father of a friend). He was a major research scientist at a major institution, and he invented a technology for seeing things that are very small, which had applications in a wide range of research and praxis, including materials science and medicine. But his methodology involved the development of technology that one might use to make a terrible but effective weapon. He received a lot of his funding from those who might fund such things, and this allowed him to do his work without having to spend much money on grant proposals. But, he claimed (in his retirement), he never intended his work to be used to make a terrible weapon. Furthermore, he knew, privately, from his own research that it never could be. What he was doing would simply not work in that context. But he never mentioned that to his funders. He just took the money, and used it to save lives.

Well, one of the reasons one might not want to take money from sources with nefarious intent (and here we assume developing a terrible weapon is nefarious, though one could argue differently, I suppose) without ever advancing said nefarious goal, is that it is actually unethical. But one could counterargue that the saving of lives and advancement of civilization and such outweighs the ethics, or more exactly, that it is appropriate to develop situational ethics.

That is an extreme example, but in some ways, parallel to what a major organization like the AGU would be doing if they knowingly accepted money from major corporations who intended to encourage, develop, disseminate, or otherwise use for their own interests any kind of fake science or anti-science. Why not take the money and run? Partly, one assumes, because it isn’t exactly kosher.

Another reason is that if one takes anti-science money, one may end up advancing anti-science agendas even if one does not want to. The very fact that an anti-science entity (a corporation or foundation funded by a corporation) funds a major legit conference is a way of saying that the corporation itself is legit. It is a way that a scientific organization can advance anti-science even if it doesn’t want to.

Scientists Tell AGU To Drop Exxon Sponsorship

You all know about the Exxon maneno. Exxon, aka ExxonMobil, has recently been exposed as having repressed scientific information indicating that we, our species, would ultimately need to change our energy systems and keep fossil fuels in the ground, or else face dire consequences. Decades ago, when the science already indicated that this was a problem, Exxon independently verified that we needed to keep the fossil fuels in the ground, then shut up about it, because it was, and is, in their corporate interest to take the fossil fuel out of the ground.

I wrote about the Exxon kerfuffle back when it first broke, here. In that post, I provided a thumb-suck analysis comparing what Exxon knew about climate change then, and what the IPCC and NASA know about it now. They are pretty much the same, with respect to global surface warming caused by the human release of greenhouse gas pollution from burning fossil fuels such as those extracted and sold by Exxon.

Over a month ago, scientists Ploy Achakulwisut, Ben Scandella, and Britta Voss asked the question, “Why is the largest Earth science conference still sponsored by Exxon?” They noted,

The impacts of Exxon’s tactics have been devastating. Thanks in part to Exxon, the American public remains confused and polarized about climate change. Thanks in part to Exxon, climate science-denying Republicans in Congress and lobby groups operating at the state level remain a major obstacle to U.S. efforts to mitigate climate change.

And thanks in no small part to Exxon, climate action has been delayed at the global level; as the international community began to consider curbing greenhouse gas emissions with the Kyoto Protocol in 1997, Exxon orchestrated and funded anti-Kyoto campaigns, including participation in the Global Climate Coalition. The latter was so successful at shifting debate that the George W. Bush administration credited it with playing a key role in its rejection of the Kyoto Protocol.

So, now there is a letter signed by many top scientists asking the American Geophysical Union to make ExxonMobil an Ex-contributor to the conference. According to the Natural History Museum,

more than 100 geoscientists sent the following letter to the President of the American Geophysical Union (AGU) – the world’s largest association of Earth scientists – urging the association to end its sponsorship deal with ExxonMobil. The oil giant is currently under investigation by the New York and California Attorneys General for its long history of climate denial campaigns.

Many notable scientists have signed on, including the former director of NASA Goddard Institute for Space Studies James E. Hansen, the former President of the American Association for the Advancement of Science and Harvard Professor James J. McCarthy, Harvard Professor and author of Merchants of Doubt Naomi Oreskes, and Michael Mann, Director of the Earth System Science Center at Pennsylvania State University.

The letter is the most recent example of a growing trend of scientists stepping out of their traditional roles to urge science institutions to cut ties to fossil fuel companies.

As part of the press release announcing this letter, Michael Mann (author of The Hockey Stick and the Climate Wars: Dispatches from the Front Lines, and Dire Predictions, 2nd Edition: Understanding Climate Change) noted, “While I recognize that it is a contentious matter within the diverse AGU community, I just don’t see how we can, in good conscience, continue to accept contributions from a company that has spent millions of dollars over several decades funding bad faith attacks on scientists within our community whose scientific findings happen to be inconvenient for fossil fuel interests.”

InsideClimateNews has a timeline of what happened with Exxon, here.

AGU’s president, Margaret Leinen, wrote on the AGU’s blog, that “The AGU Board of Directors will take up the questions raised in this letter at their upcoming meeting in April, and prior to that will carefully review the information that has been provided, and any additional information that becomes available in the meantime. We will consult with our various member constituencies as well other stakeholders prior to the Board meeting. In addition, the Board will look more deeply into the question of what constitutes verifiable information about current activities.”

InsideClimateNews notes that this campaign “…is part of a growing trend of scientists’ protesting efforts by fossil fuel companies to undermine climate science. Last year, for instance, dozens of researchers urged Smithsonian’s National Museum of Natural History and the American Museum of Natural History in New York to cut ties with David Koch of Koch Industries.” See this post at InsideClimateNews for more information about the Exxon-AGU problem, and the broader movement.

As I noted at the beginning, this is all about the honest conversation. I’ve talked about this before. So often, the conversation, usually public and policy-related, is not about the science at all, but about other things, and the science itself gets thrown under the bus. My understanding (limited, I know) of the criminal justice system is that if a prosecutor knows about exculpatory evidence, they are required to provide it to the court or defense, thus possibly negatively affecting their own chance of success, but at the same time, doing the right thing. One would think that in science, institutions or individuals who know about evidence important to understanding some scientific problem are ethically obligated to make that information available with reasonable alacrity. If all those involved in the large scale and complex conversations about climate change and energy had as a central ethical theme a commitment to accuracy, openness, and the process of mutual aid in advancing our understanding of the topics at hand, it wouldn’t matter who gave money to whom, because that money would not be linked to efforts to repress knowledge or to produce and disseminate misinformation.

And, certainly, such corporations should not be attacking the science or the scientists, or funding other organizations that do. Contributing to a valid scientific organization like the AGU does not make up for such behavior.

Had that been the way things worked fifty years ago, by now ExxonMobil and other fossil fuel companies would have shifted their corporate activities away from fossil fuels. They would be phasing out coal, oil, and natural gas, and developing clean energy solutions. They would not have stuck themselves with the vast stranded assets that they now have a corporate responsibility, no matter how immoral or antiscientific, to develop. There is an idea that corporations are primarily responsible to their stockholders, and this widely accepted but highly questionable “ethic” has been applied to justify, it seems, a significant departure from the pursuit of knowledge and the application of that knowledge to managing human problems and protecting our precious planet. This is a fundamental flaw in how we do things, and it is the reason AGU has to put the “ex” in Exxon as a sponsor.

Scientists’ Letter to the American Geophysical Union

Here is the letter:

Dear Dr. Margaret Leinen,

We, the undersigned members of AGU (and other concerned geoscientists), write to ask you to please reconsider ExxonMobil’s sponsorship of the AGU Fall Meetings.

As Earth scientists, we are deeply troubled by the well-documented complicity of ExxonMobil in climate denial and misinformation. For example, recent investigative journalism has shed light on the fact that Exxon, informed by their in-house scientists, has known about the devastating global warming effects of fossil fuel burning since the late 1970s, but spent the next decades funding misinformation campaigns to confuse the public, slander scientists, and sabotage science – the very science conducted by thousands of AGU members. Even today, Exxon continues to fund the American Legislative Exchange Council, a lobbying group that routinely misrepresents climate science to US state legislators and attempts to block pro-renewable energy policies. Just last year, Exxon CEO Rex Tillerson downplayed the validity of climate models and the value of renewable energy policies.

The impacts of Exxon’s tactics have been devastating. Thanks in part to Exxon, the American public remains confused and polarized about climate change. And thanks in part to Exxon, climate science-denying members of Congress and lobby groups operating at the state level remain a major obstacle to US efforts to mitigate climate change.

The research disciplines of Earth sciences conducted by AGU members are diverse, but they are united by their shared value of truthfulness. AGU states that its mission and core values are to “promote discovery in Earth science for the benefit of humanity” and for “a sustainable future.” Indeed, AGU has established a long history of scientific excellence with its peer-reviewed publications and conferences, as well as a strong position statement on the urgency of climate action, and we’re proud to be included among its members.

But by allowing Exxon to appropriate AGU’s institutional social license to help legitimize the company’s climate misinformation, AGU is undermining its stated values as well as the work of many of its own members. The Union’s own Organizational Support Policy specifically states that “AGU will not accept funding from organizational partners that promote and/or disseminate misinformation of science, or that fund organizations that publicly promote misinformation of science.” We believe that in fully and transparently assessing sponsors on a case-by-case basis, AGU will determine that some, including ExxonMobil, do not meet the standards of this policy. We therefore call on you as the President of AGU to protect the integrity of climate science by rejecting the sponsorship of future AGU conferences by corporations complicit in climate misinformation, starting with ExxonMobil.

While we recognize that some of AGU’s scientific disciplines are deeply tied to the fossil fuel industry, we are also increasingly aware of the tension within our community regarding how we should respond to the urgency of climate change as individual scientists and as institutions. It is time to bring this tension into the light and determine how an organization such as AGU should approach the major challenges of today to ensure that we truly are working for the benefit of humanity. In particular, as the world’s largest organization of Earth scientists, if we do not take an active stand against climate misinformation now, when will we?

Yours respectfully,

AGU members:

Robert R. Bidigare, PhD, AGU Fellow, University of Hawaii

Cecilia Bitz, Professor, University of Washington

David Burdige, Professor and Eminent Scholar, Old Dominion University

Kerry Emanuel, Professor, MIT

Peter Frumhoff, PhD, Director of Science and Policy, Union of Concerned Scientists

Richard H. Gammon, Professor Emeritus, University of Washington

Catherine Gautier, Professor Emerita, University of California Santa Barbara

Charles Greene, Professor, Cornell University

James E. Hansen, Adjunct Professor, Columbia University

Charles Harvey, Professor, MIT

Roger Hooke, Research Professor, University of Maine

Mark Z. Jacobson, Professor, Stanford University

Dan Jaffe, Professor and Chair, University of Washington Bothell

Michael C. MacCracken, Chief Scientist for Climate Change Programs, Climate Institute

Michael E. Mann, Distinguished Professor, Penn State University

James J. McCarthy, Professor, Harvard University

James Murray, Professor, University of Washington

Naomi Oreskes, Professor, Harvard University

Nathan Phillips, Professor, Boston University

Christopher Rapley, CBE, Professor, University College London

Richard Somerville, Distinguished Professor Emeritus, University of California San Diego

Pattanun Achakulwisut, PhD Student, Harvard University

Becky Alexander, Associate Professor, University of Washington

Theodore Barnhart, PhD Student, University of Colorado/INSTAAR

Yanina Barrera, PhD Student, Harvard University

Dino Bellugi, PhD Candidate, University of California Berkeley

Jo Browse, Postdoctoral Research, University of Leeds, UK

Adam Campbell, Postdoctoral Fellow, University of Otago

Chawalit Charoenpong, PhD Student, MIT/WHOI Joint Program

Sarah Crump, PhD Student, University of Colorado Boulder

Daniel Czizco, Associate Professor, MIT

Katherine Dagon, PhD Student, Harvard University

Suzane Simoes de Sá, PhD Student, Harvard University

Michael Diamond, PhD Student, University of Washington

Kyle Delwiche, PhD Student, MIT

Sarah Doherty, Associate Professor, University of Washington

Liz Drenkard, Postdoctoral Researcher, Rutgers University

Emily V. Fischer, Assistant Professor

Priya Ganguli, Postdoctoral Fellow

Gretchen Goldman, PhD, Lead Analyst, Union of Concerned Scientists

Meagan Gonneea, Postdoc

Jordon Hemingway, PhD Student, MIT/WHOI Joint Program

Hannah Horowitz, PhD Student, Harvard University

Irene Hu, PhD student, MIT

Lu Hu, Postdoctoral Researcher, Harvard University

Eric Leibensperger, Assistant Professor, State University of New York at Plattsburgh

Marena Lin, PhD Student, Harvard University

Simon J. Lock, PhD Student, Harvard University

Andrew McDonnell, Assistant Professor, University of Alaska Fairbanks

Bruce Monger, Senior Lecturer, Cornell University

Daniel Ohnemus, Postdoctoral Researcher, Bigelow Laboratory for Ocean Sciences

Morgan O’Neill, Postdoctoral Fellow, Weizmann Institute of Science

Cruz Ortiz Jr., PhD Student, University of California Santa Barbara

Jonathan Petters, Research Fellow, University of California Santa Cruz

Allison Pfeiffer, PhD Student, University of California Santa Cruz

James L. Powell, PhD

Christina M. Richardson, MS Student, University of Hawaii Manoa

Ignatius Rigor, Senior Principal Research Scientist, University of Washington

Paul Richardson, Postdoctoral Fellow, University of Oregon

Erica Rosenblum, PhD Student, Scripps Institution of Oceanography

Ben Scandella, PhD Student, MIT

Neesha Schnepf, PhD Student, University of Colorado at Boulder/CIRES

Amos P. K. Tai, Assistant Professor, The Chinese University of Hong Kong

Robert Tardif, Research Scientist

Katherine Travis, PhD Student, Harvard University

Britta Voss, Postdoctoral Fellow

Andrew Wickert, Assistant Professor, University of Minnesota

Kyle Young, Graduate Student, University of California Santa Cruz

Xu Yue, Postdoctoral Associate, Yale University

Emily Zakem, PhD Student, MIT

Cheryl Zurbrick, Postdoctoral Associate, MIT


Other concerned geoscientists:

Hans Joachim Schellnhuber, CBE, Professor, Potsdam Institute for Climate Impact Research

Helen Amos, Postdoctoral Fellow, Harvard University

Antara Banerjee, Postdoctoral Research Scientist

Emma Bertran, PhD Student, Harvard University

Skylar Bayer, PhD Student

Thomas Breider, Postdoctoral Researcher, Harvard University

Stella R. Brodzik, Software Engineer, University of Washington

BB Cael, PhD Student, MIT/WHOI Joint Program

Sophie Chu, PhD Student, MIT/WHOI Joint Program

Archana Dayalu, PhD Student, Harvard University

Gregory de Wet, PhD Student, University of Massachusetts Amherst

Christopher Fairless, PhD Student, University of Manchester, UK

Mara Freilich, PhD Student, MIT

Wiebke Frey, Research Associate, University of Manchester, UK

Nicolas Grisouard, Assistant Professor, University of Toronto

Sydney Gunnarson, PhD Student, University of Iceland/University of Colorado Boulder

Sam Hardy, PhD Student, University of Manchester, UK

David Harning, PhD Student, University of Colorado Boulder

Sophie Haslett, PhD Student, University of Manchester, UK

Richard Hogen, Aerospace Thermodynamic Engineer, United Launch Alliance

Anjuli Jain, PhD Student, MIT

Harriet Lau, PhD Student, Harvard University

Cara Lauria, Masters Student, University of Colorado Boulder

Franziska Lechleitner, PhD Student, ETH Zürich

Michael S. Long, Research Scientist

John Marsham, Associate Professor, University of Leeds, UK

Catherine Scott, Postdoctoral Research Fellow, University of Leeds, UK

Rohini Shivamoggi, PhD student, MIT

Victoria Smith, PhD, Instrument Scientist, National Center for Atmospheric Science, University of Leeds, UK

Gail Spencer, Environmental Specialist, Washington Department of Ecology

Melissa Sulprizio, Scientific Programmer, Harvard University

Rachel White, Postdoctoral Associate, University of Washington

Leehi Yona, BA, Senior Fellow, Dartmouth College

Yanxu Zhang, Postdoctoral Researcher, Harvard University

This is the greatest idea ever: Water Bar

One of the Great Crises we face in today’s world is the stability and security of the water supply. In America, most people don’t have any problems getting water, to the extent that we tend to waste it, and few people even know where their water comes from. Every now and then a startling and troubling problem with water emerges. A river catches fire, a plume of pollution visible from space spreads across a lake, or an entire city’s worth of children is poisoned by the city’s own water supply.

Works Progress Studio has been engaged for a while in a project called the Water Bar, and they intend to ramp this project up in the near future if they get enough help.

The original water bar is “a collaborative public art project … simply, a bar that serves local tap water.” It consists of a pop-up bar that can be easily deployed, serving a wide range of local vintages, and staffed by scientists or other experts on the water supply. The project started in 2014 and has served over 30,000 people in four states.

This video gives you a flavor (or, should I say, a flavorless…).

I asked Works Progress Studio co-founder Shanai Matteson how this all started. She told me that it “… started out as an experiment – what would happen if we opened a bar that only served regular tap water, and asked our community of environmental science researchers, educators, and advocates to be bartenders – not pushing a message, but just casually engaging in conversation.”

“The second iteration of the project,” she continued, “was an installation at an art museum in Arkansas. We built a Water Bar in the museum’s cafe area, and I think a lot of people didn’t even realize it was an artist project – which is fine with us! We hired college students with backgrounds in research, natural resource management, landscape architecture, business… They kept the bar open every day for 5 months, and all of them said that they learned valuable engagement skills, including new ways of talking to people about complicated science topics.”

Now, Water Bar has a GoFundMe page to help them set up a permanent taproom in Minneapolis. Partnering with several neighborhood and environmental organizations, research scientists, and artists, they plan to create the Water Bar & Public Studio in Northeast Minneapolis, a thriving and growing art-oriented community. The location will be a hub for neighborhood events addressing art and sustainability, educational programs, and so on.

The water will be free.

Donations will fund the “taproom,” a creative community space, and a public art and sustainability incubator.

When I saw the video, my first thought was to avoid doing this in Flint, Michigan. I asked Shanai Matteson about that. She told me, “We’ve actually had a bunch of people suggest we SHOULD do this in Flint, or Detroit. We wouldn’t attempt that unless we were invited there by residents, but even considering the implications really makes the disparity between those communities and others, like Minneapolis, so plain.”

Matteson also pointed out one of the main problems with the culture of water use in the US. “Most of the stories about Flint have focused on the problem – what went wrong, who was responsible – as well as the work of researchers, residents, and activists to finally get people to pay attention. Few of the stories I’ve read mention that almost none of us know where our drinking water comes from. We probably wouldn’t know if our water had high levels of lead, and most of us wouldn’t know who to call or what to do if we suspected a problem. One of our goals with the Water Bar project is to start getting people to see and understand their connection to these life-sustaining systems, and to the political systems involved with maintaining them – or in the case of Flint, gross negligence and a desire to see public infrastructure privatized.”

Matteson is looking forward to developing this project further. “Our dream is for a space that is approachable and welcoming, but also presents really urgent and serious content. We want to work with our community of artists and designers to find creative ways to engage people in water and environment issues, and we want to be a learning laboratory for future science and environment leaders — or for current researchers and advocacy orgs to share their work with new audiences.”

You can learn more about the Water Bar project here, and of course, go here to go fund them.