In 2005, I argued that ice sheets may be more vulnerable than the Intergovernmental Panel on Climate Change (IPCC) estimated, mainly because of the effects of a warming ocean in speeding ice melt. In 2007, I wrote “Scientific Reticence and Sea Level Rise,” describing and documenting a phenomenon that pressures scientists to minimize the danger of imminent sea level rise.
About then I became acquainted with the remarkable studies of geologist Paul Hearty. Hearty found strong evidence that sea level late in the Eemian rose to +6-9 m (20-30 feet) relative to today. The Eemian is the prior interglacial period (~120,000 years ago), which was slightly warmer than the present interglacial period (the Holocene), in which civilization developed. Hearty also found evidence of powerful storms in the North Atlantic near the end of the Eemian.
It seemed that an understanding of the late Eemian climate events might be helpful in assessing the climate effects of human-made global warming, as Earth is now approaching the warmth that existed then. Thus several colleagues and I initiated global climate simulations aimed at trying to understand what happened at the end of the Eemian and its relevance to climate change today.
More than eight years later, we are publishing a paper describing these studies. We are publishing the paper in an open-access “Discussion” journal, which allows the paper to become public while undergoing peer review (a pdf of the paper with figures embedded in the text for easier reading is available here). I will get to the reasons for that in a moment, but first let me mention some curious numerology to get you thinking about scientific reticence.
Did you read any of the recent papers that concluded ice sheets may be disintegrating and might cause large sea level rise in 200-900 years? The time needed for ice sheets to respond to climate change is uncertain, and there are proponents for time scales covering a huge range. However, 200-900 years should cause a scientist to scratch his head. If the time scale is uncertain by an order of magnitude or more, why not 100-1000? Where does the 200-900 precision come from?
Why the peculiar 900 years instead of the logical 1000? Probably because nobody cares about matters 1000 years in the future (they may not care about 900, but 200-900 does not seem like infinity). A scientist knowing that sea level is a problem does not want the reader to dismiss it.
Why 200 years? For one thing, 100 years would require taking on the formidable IPCC, which estimates that even the huge climate forcing for a hypothetical 936 ppm CO2 in 2100 would yield less than one meter of sea level rise. For another, incentives for scientists strongly favor conservative statements and militate against any “alarmist” conclusion; this is the “reticence” phenomenon that infects the sea level rise issue [2]. “Scientific Reticence and Sea Level Rise” will be the subject of a session at the American Geophysical Union meeting this year.
Fig. 1. Stratification and precipitation amplifying feedbacks. Stratification: increased freshwater/iceberg flux increases ocean vertical stratification, reduces AABW (Antarctic Bottom Water) formation, and traps ocean heat that increases ice shelf melting. Precipitation: increased freshwater/iceberg flux cools the ocean mixed layer and increases sea ice area, causing an increase of precipitation that falls before it reaches Antarctica, adding to ocean surface freshening and reducing ice sheet growth. Retrograde beds in West Antarctica and the Wilkes Basin, East Antarctica, make their large ice amounts vulnerable to such melting.
IPCC conclusions about sea level rise rely substantially on models, and ice sheet models are very sluggish in their response to forcings. It is important to recognize a great difference in the status of (atmosphere-ocean) climate models and ice sheet models. Climate models are based on general circulation models that have a long pedigree, and the fundamental equations they solve do a good job of simulating atmosphere and ocean circulations. Uncertainties remain in climate models, such as how well they handle the effect of clouds on climate sensitivity. However, the climate models are extensively tested, and paleoclimate changes confirm their approximate sensitivities.
In contrast, we show in a prior paper and in our new paper that ice sheet models are far too sluggish compared with the magnitude and speed of sea level changes in the paleoclimate record. This is not surprising, given the primitive state of ice sheet modeling. For example, a recent ice sheet model sensitivity study finds that incorporating the physical processes of hydrofracturing of ice and ice cliff failure increases the calculated sea level rise from 2 meters to 17 meters and reduces the potential time for West Antarctic collapse to decadal time scales. Other researchers [7,8] show that part of the East Antarctic ice sheet sits on bedrock well below sea level. Thus, West Antarctica is not the only potential source of rapid change; part of the East Antarctic ice sheet is also susceptible to rapid retreat because of its direct contact with the ocean and because the bed beneath the ice slopes landward (Fig. 1), which makes it less stable.
Our simulations were aimed at testing my suspicion that ice sheet disintegration is a very nonlinear phenomenon and that the IPCC studies largely omitted what may be the most important forcing of the ocean: the effect of cold freshwater from melting ice. Rather than use an ice sheet model to estimate rates of freshwater release, we use observations for the present ice melt rate and specify several alternative rates of increase of ice melt. Our atmosphere-ocean model shows that the freshwater spurs amplifying feedbacks that would accelerate ice shelf and ice sheet mass loss, thus supporting our assumption of a nonlinear ice sheet response.
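To make the nonlinearity concrete, here is a minimal back-of-the-envelope sketch (mine, not a calculation from the paper) of how a specified doubling time for the melt rate translates into cumulative sea level rise. The 1 mm/yr starting rate and the 10-year doubling time are illustrative assumptions, not observed values:

```python
import math

def cumulative_rise_m(r0_mm_per_yr, doubling_time_yr, years):
    """Cumulative sea level rise (meters) from a melt rate that starts at
    r0 (mm/yr) and doubles every `doubling_time_yr` years.

    rate(t) = r0 * 2**(t/T); integrating over time gives
    cumulative = r0 * T/ln(2) * (2**(t/T) - 1).
    """
    T = doubling_time_yr
    mm = r0_mm_per_yr * T / math.log(2) * (2 ** (years / T) - 1)
    return mm / 1000.0  # mm -> meters

# Illustrative: an assumed 1 mm/yr ice-melt contribution today,
# doubling every 10 years.
print(round(cumulative_rise_m(1.0, 10, 50), 2))   # → 0.45 (meters after 50 years)
print(round(cumulative_rise_m(1.0, 10, 100), 2))  # → 14.76 (meters after 100 years)
```

Growth like this stays modest for decades and then runs away, which is why a short record of modest sea level rise says little about the risk later in the century.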
Our analysis, however, is based on much more than the climate simulations, as it relies on a huge body of research by the relevant scientific communities, as indicated by the 300 references. Our analysis is based on about equal parts of information gleaned from paleoclimate studies, climate modeling, and modern observations of ongoing climate changes.
We submitted our paper to an open-access “Discussion” journal (ACPD) in hopes of engaging the scientific and policy-making communities in an important conversation about the urgency of reducing fossil fuel emissions and the adequacy of current and proposed policies. We conclude, for example, that 2°C global warming, rather than being a safe “guardrail,” is highly dangerous.
Atmospheric Chemistry and Physics Discussions is an open-access peer-reviewed journal in which the reviews and our response are published and freely available to the public. We hope this publication procedure will reduce the chance of the paper turning out to be unhelpful, as it might be if criticisms were misinterpreted by the public. I think there is an analogy between this paper and my congressional testimony in 1988-89. Then as now, conclusions are drawn from a combination of information from paleoclimate, modeling, ongoing observations, and theory.
Stakes in climate change are high, so conclusions about climate change are sure to draw fire. That’s as it should be; skepticism is the lifeblood of science, essential to the success of an analysis. So criticisms of my testimony, as described well by Richard Kerr, were inevitable and useful.
Kerr’s article is instructive about scientific reticence, which can deprive policymakers of the gut feeling of experts. This is all-important for sea level rise because of lags in the system (policies → emissions → climate change → sea level rise); information is needed as soon as possible.
The most perceptive comments in Kerr’s interviews may have been, as was often the case, from our good old friend Steve Schneider: “All that objective stuff rests on assumptions. The future is not based on statistics, it’s based on physics.” By “objective stuff” Steve referred to the arbitrary choices made to define probabilities of an outcome. The media accepts resulting probabilities as meaningful, yet entirely different results would be obtained from alternative initial choices.
Steve’s “objective stuff” describes IPCC’s sea level analysis precisely. IPCC takes certain ‘process-based models’ as its first choice for defining future sea level. This gives a sea level rise in 2100 (relative to the 1986-2005 mean sea level) of 0.74 m, with a likely range of 0.52-0.98 m, for business-as-usual greenhouse gases (the RCP8.5 scenario), where ‘likely’ is defined as >66 percent probability. Ugh.
Fig. 2. Surface air temperature change relative to 1880-1920 in 2055-2060 based on climate simulations assuming ice melt increases with a 10-year doubling time.
A policymaker will take this as meaning that sea level rise is probably going to be less than a meter even if CO2 increases to 936 ppm; in other words, policymakers will take this “objective stuff” as serious, reliable estimates of what to expect. Yikes! What if someone decided to include processes such as hydrofracturing and ice cliff failure in these objective models?
Steve Schneider modestly described his preferred approach as one based on “physical intuition”. In other words, his best judgment based on all of the information at his disposal. “All of the information” surely includes knowledge gained from paleoclimate, modeling, observations of ongoing climate change, understanding of physical processes, etc. Of course, with this approach there is no way to specify an exact number for the sea level rise corresponding to >66 percent chance. Nevertheless, alternatives to the “objective stuff”, at least in this case, are superior, in my opinion, but the result does depend on the scientific ability of the practitioner.
Dick Kerr is one of the best science writers. His article contains information relevant to the scientific method in general and how we reach conclusions, not just scientific reticence. He allows readers to think and read between the lines, and draw their own conclusions.
We can always say that more research is needed. Yet as the evidence accumulates, at some point a scientist must stop waffling and say that the evidence is pretty strong. In my opinion, we have reached that point on the sea level issue.
My conclusion, based on the total information available, is that continued high emissions would result in multi-meter sea level rise this century and lock in continued ice sheet disintegration such that building or rebuilding cities on coastlines would become foolish.
That brings me to the other reason for publishing in an open-access “discussion” journal, in addition to wanting to give the sea level rise issue more prominence prior to the Paris meetings. There is a danger that the public — not too familiar with the scientific method — may misinterpret criticisms, which are natural and healthy for science. I’m hoping that this publication process will make the back-and-forth of peer review clearer and thus also make the reality of the climate situation clearer.
A startling conclusion of our paper is that effects of freshwater release into the Southern Ocean and North Atlantic are already underway, occurring 1-2 decades sooner in the real world than in the model (Fig. 2). Observed effects include sea surface cooling and sea ice increase in the Southern Ocean around Antarctica and cooling in the North Atlantic. We suggest that the sluggishness (delayed response) of the climate models may be a result of excessive small-scale mixing, common in many ocean models including ours, as discussed previously. One of our objectives is to draw attention to this issue — I also hope to get support for our group to do climate modeling to investigate it, because we recognize several ways that we could improve the model.
Here, I expand on our conclusion that the science indicates 2°C is not a safe target. Indeed, 2°C is not only the wrong target; temperature itself is a flawed metric, because meltwater affects temperature. Sea level, a critical metric for humanity, is at least on the same plane. Earth’s energy imbalance is a critical metric, because energy balance must be restored to stabilize climate, which in turn informs us about the required limit on greenhouse gases (GHGs). The Framework Convention on Climate Change, agreed upon at Rio in 1992, defines GHGs as the critical metric, saying that GHGs must be stabilized at a level that avoids “dangerous anthropogenic interference” with climate. Why have policymakers turned away from GHG amount to temperature as the metric, with a value (2°C) seemingly pulled from a hat? Could it be because 2°C allows politicians to set emission targets to be achieved in the future, when they will be out of office? If we stick to the Framework Convention’s GHG metric, we find that the CO2 stabilization level is not 450 ppm or 400 ppm; it is 350 ppm, and possibly lower, with immediate implications for policy.
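As a rough illustration of why GHG amount is a workable control metric (my sketch, not a calculation from the paper): the standard simplified logarithmic fit for CO2 radiative forcing (Myhre et al., 1998) maps each stabilization level directly to a forcing that must ultimately be balanced. The 280 ppm preindustrial baseline and the 5.35 W/m² coefficient are the conventional values in that fit:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing (W/m^2) relative to a
    preindustrial baseline c0, using the Myhre et al. (1998) fit:
    F = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

for ppm in (350, 400, 450):
    print(ppm, round(co2_forcing(ppm), 2))
# → 350 1.19
#   400 1.91
#   450 2.54
```

A CO2 target thus pins down the energy imbalance to be removed, whereas a temperature target, filtered through climate sensitivity and ocean lags, leaves the required emissions path ambiguous.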
The bottom line message scientists should deliver to policymakers is that we have a global crisis, an emergency that calls for global cooperation to reduce emissions as rapidly as practical. We conclude elsewhere and reaffirm in our present paper that the crisis calls for an across-the-board rising carbon fee and international technical cooperation in carbon-free technologies.
Despite the increased threat of sea level rise, I believe that it is still possible to keep impacts of human-made climate change moderate. However, that optimism is based on the assumption that we are close to the point when it is widely recognized that a policy with an across-the-board rising carbon fee that rapidly phases down carbon emissions also makes good economic sense.