By Arnold Kling
Amidst the usual riff-raff of comments on my previous post, I received at least one good suggestion, which was to look at the web site realclimate.org, and in particular their FAQ page. One question asks, "Just What is the Consensus?" The answer is
1. The earth is getting warmer…0.17 °C/decade over the last 30 years
2. People are causing this…
3. If GHG emissions continue, the warming will continue and indeed accelerate…
4. (This will be a problem and we ought to do something about it)
The last point is in parentheses, because “whilst many would agree, many others (who agree with 1-3) would not, at least without qualification. It’s probably not a part of the core consensus in the way 1-3 are.”
My impression of the Al Gore movie, based on the trailer, is that it is designed to beat up on anybody who disagrees with any of the consensus, including (4). I think that the movie takes the climate issue away from people like those at realclimate.org, who strike me as rational and capable of admitting to imprecision and alternative courses of action, and makes it a religious issue, in which imprecision and alternative courses of action are punishable offenses.
Another FAQ post deals with the issue of whether or not climate modelling is truly a science. Read the whole thing. I will comment on some selected excerpts.
Climate is complex. Since climatologists don’t have access to hundreds of Earths to observe and experiment with, they need virtual laboratories that allow ideas to be tested in a controlled manner. The huge range of physical processes that are involved are encapsulated in what are called General Circulation Models (or GCMs). These models … are based on physical theories and empirical observations made around the world. However, some processes occur at scales too small to be captured at the grid-size available in these (necessarily global) models. These so-called ‘sub-gridscale’ processes therefore need to be ‘parameterised’
…This means that validating these models is quite difficult. (NB. I use the term validating not in the sense of ‘proving true’ (an impossibility), but in the sense of ‘being good enough to be useful’). In essence, the validation must be done for the whole system if we are to have any confidence in the predictions about the whole system in the future. This validation is what most climate modellers spend almost all their time doing.
This does in fact sound like the way that major macroeconometric models were constructed. There were many problems with these models. However, in my opinion, the biggest issue is that there are many more parameters than data points. In macro, there really are only a few interesting episodes, such as the Great Depression, the 1970s stagflation, and the disinflation of the 1980s. There are many causal variables that could be implicated in these episodes.
When there are many more variables than true data points, it is impossible to rule out competing hypotheses. This problem can be glossed over as long as there is a consensus among the model-builders, which there was in the 1960s. If everyone agrees on an approach to building a model, the models will tend to provide broadly similar answers.
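The identification problem can be seen in a toy sketch with entirely made-up numbers: when a model has as many free parameters as there are observed "episodes," two models built from completely different theories can both fit the data exactly, and then give sharply different out-of-sample predictions. Nothing in the data distinguishes them.

```python
import numpy as np

# Made-up data: pretend we observe only 3 macro "episodes"
episodes = np.array([1.0, -2.0, 0.5])
x = np.array([0.0, 1.0, 2.0])

# Model A: quadratic trend (3 parameters, 3 data points -> exact fit)
coef_a = np.polyfit(x, episodes, 2)
fit_a = np.polyval(coef_a, x)

# Model B: a completely different basis -- sinusoids plus a constant
# (also 3 parameters, also an exact fit)
design_b = np.column_stack([np.sin(x), np.sin(2 * x), np.ones_like(x)])
coef_b, *_ = np.linalg.lstsq(design_b, episodes, rcond=None)
fit_b = design_b @ coef_b

print(np.allclose(fit_a, episodes), np.allclose(fit_b, episodes))  # True True

# Out of sample, the two "theories" disagree sharply:
x_new = 3.0
pred_a = np.polyval(coef_a, x_new)
pred_b = np.array([np.sin(x_new), np.sin(2 * x_new), 1.0]) @ coef_b
print(pred_a, pred_b)  # very different predictions from identical fits
```

With only three data points, both "theories" are observationally equivalent; the choice between them rests on the model-builders' consensus, not on evidence.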
But as it turned out in macro, the consensus proved fragile, and it fell apart in the late 1970s. By that point, we had people publishing papers pointing out that the data could not distinguish between an economy obeying a 200-equation macro model and one obeying a univariate random walk. There was a paper saying that every business cycle was an oil shock (that was James ("Econbrowser") Hamilton's Ph.D. dissertation). There was even a famous paper, alluded to here, which argued that sunspot activity was as useful an explanation of the business cycle as any major macro-econometric model.
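The random-walk point can be illustrated with a simulation (all numbers hypothetical). A toy two-equation "structural" model with near-unit persistence stands in for a 200-equation model; in a macro-sized sample, its output and a pure random walk both look like "growth is unpredictable noise," as measured by the autocorrelation of first differences.

```python
import numpy as np

rng = np.random.default_rng(42)
T = 120  # roughly 30 years of quarterly data -- all macro ever gives you

# Series 1: a pure univariate random walk
rw = np.cumsum(rng.normal(0, 1, T))

# Series 2: a toy "structural" model (hypothetical stand-in for a
# 200-equation model): shocks propagate with near-unit persistence,
# which is what estimated macro models typically imply
y = np.zeros(T)
shock = rng.normal(0, 1, T)
for t in range(1, T):
    y[t] = 0.98 * y[t - 1] + shock[t]

def diff_autocorr(series):
    """Lag-1 autocorrelation of first differences."""
    d = np.diff(series)
    d = d - d.mean()
    return (d[1:] @ d[:-1]) / (d @ d)

# Both are statistically indistinguishable from unpredictable noise
# in a sample this short:
print(diff_autocorr(rw), diff_autocorr(y))
```

Both autocorrelations come out close to zero, well inside sampling error, so a test on 30 years of data cannot tell the elaborate model from the random walk.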
My take on macro is that the neo-Keynesian approach, which underlies the big econometric models and which is taught in intro econ courses, may still be correct. But so could a number of other models, some of which have very different implications for how macroeconomic policy affects unemployment and inflation. Anyone who insists on one particular model is a religious zealot, not a scientist.
My guess is that I would look at climate modelling the same way. The complexity of the process far exceeds the availability of data needed to verify the model. Even a broad consensus may prove fragile.
I have no alternative to the climate models. I think we ought to take their results seriously. But my instinct is that we should be prepared for the possibility that they are way off base.
I believe that the climate models will tell you that a lot of the global warming predicted for the next few decades is “baked in,” if you will pardon the expression. That is, even if we held human CO2 emissions to current levels, or reduced them by 10 percent, the trend of temperatures would be up rather than down.
In fact, the question of how sensitive future global warming is to our CO2-producing activities is one that I wish were addressed more directly in the FAQ. My impression is that the models show relatively small effects, which means that it takes a relatively large reduction in CO2 emissions to get a small reduction in the path of global temperatures. In turn, this suggests that policies to fight global warming now have a large buck and a small bang. We may be in a better position to address global warming in ten years, with better technology (and perhaps with better climate models).
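A back-of-envelope calculation suggests why the bang might be small. The numbers below are illustrative assumptions, not model output: CO2 forcing is roughly logarithmic in concentration, so the key quantity is warming per doubling. Assuming roughly 3 °C per doubling, a concentration of about 380 ppm, and growth of about 2 ppm per year (mid-2000s ballpark figures), a 10 percent emissions cut sustained for a decade changes the implied equilibrium warming by only a few hundredths of a degree.

```python
import math

# All figures are illustrative assumptions, not model output
sens_per_doubling = 3.0  # assumed equilibrium sensitivity, deg C per doubling
c0 = 380.0               # assumed current CO2 concentration, ppm
growth = 2.0             # assumed ppm added per year under business as usual
years = 10

def eq_warming(c_future):
    """Equilibrium warming implied by logarithmic CO2 forcing."""
    return sens_per_doubling * math.log2(c_future / c0)

business_as_usual = eq_warming(c0 + growth * years)
with_10pct_cut = eq_warming(c0 + 0.9 * growth * years)

print(round(business_as_usual, 3))              # ~0.22 deg C
print(round(business_as_usual - with_10pct_cut, 3))  # ~0.02 deg C avoided
```

On these assumptions, a decade of 10 percent emissions cuts avoids roughly 0.02 °C of eventual warming, which is the sense in which near-term policy has a large buck and a small bang.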
The choice that we face on global warming is not “either-or.” It is not, “either we believe in global warming and rescue the planet or we all die.” It is largely a choice between how much we do pro-actively now and how much we do in response to climate change in forthcoming decades.
I think that there is time to have a reasonable discussion and to make rational decisions. If that’s the way people want to approach the issue, that is.