While visiting Climate Audit I clicked a link to a web page that reminded me that Chapter 9 of the WG1 contribution to the IPCC AR4 used the 1901-1950 baseline to illustrate how well models hindcast the 20th century:
The AR4 was published in 2007, so the data end with the annual average for 2005. Note the red is the multi-model mean; the yellow curves are the individual runs. Black is HadCrut. Before proceeding, also note that all projected temperature anomalies above 1C are cut off on that graph. Those glancing quickly might infer the 1998 El Nino was near the maximum temperature anomaly projected by the collection of runs.
My first thought when I saw this was “Hmmm… I wonder how that graph would look if we added data from 2006 onward?” So, naturally, I did that. Voila:

I made a few changes to Figure 9.5, including:
- Cut out data before 1950 to permit better focus on the comparison outside the baseline period. (The baseline method forces agreement of the average temperature anomaly during the baseline period.)
- Used different colors for each run and included ±95% bounds for all temperature anomalies in all runs, computed assuming the anomalies are normally distributed.
- Adjusted the vertical axis to display the full range of model runs, including anomalies greater than 1C. Notice that the 1998 El Nino no longer seems particularly high compared to the model runs? Interestingly, if you click to see the larger graph you’ll see that GISStemp barely pierced the multi-model mean projection during that “super El Nino”.
- Added 12 month averages through the month ending March 2009. I think this shifts my graph 11 months relative to the graph in the AR4, because I think the AR4 ‘2005’ corresponds to Jan 1, 2005-Dec 2005. I’ve indicated the 12 months from Jan 2005-Dec 2005 with a vertical blue line.
- Included a few extra runs available at The Climate Explorer and excluded the runs with no projections based on the A1B scenario. I did not screen for “excess model drift”; the caption for figure 9.5 indicates the authors of the AR4 screened for this.
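For anyone who wants to try this at home, the recipe in the bullets above boils down to something like this. This is a rough Python sketch of my own; the normality assumption behind the ±95% band is mine, and the function names are just for illustration:

```python
import numpy as np

def rebaseline(monthly, years, base=(1901, 1950)):
    """Convert a monthly series to anomalies relative to its mean
    over the baseline years (endpoints inclusive)."""
    mask = (years >= base[0]) & (years <= base[1])
    return monthly - monthly[mask].mean()

def rolling12(monthly):
    """Trailing 12-month average computed every month (not just at
    calendar-year boundaries), which is why my curve is smoother."""
    kernel = np.ones(12) / 12.0
    return np.convolve(monthly, kernel, mode="valid")

def mean_and_bounds(runs):
    """Multi-model mean and a +/-95% band, assuming the spread of
    runs about the mean is roughly normal at each time step.
    runs: (n_runs, n_months) array of anomalies."""
    mm = runs.mean(axis=0)
    sd = runs.std(axis=0, ddof=1)
    return mm, mm - 1.96 * sd, mm + 1.96 * sd
```

The baseline step is why any comparison graph is forced to “agree” during the baseline window: subtracting each series’ own 1901-1950 mean pins them all together there.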
I’m not sure anyone can conclude much from this graph, but I find it interesting to update the figures periodically to see how things are going. Right now: using this baseline, the model projections lie above the observations.
This reminds me of the soldier who wanted out of combat so bad that he tried to shoot himself in the food and missed 40 times
Hi Lucia,
your graph doesn’t match the IPCC one very closely – why? Just looking around the 1998 El Nino peak, there are the following clear differences:
* IPCC peak of observed data (black curve) is very close to 0.9; your peak (black curve) is maybe 0.84.
* In the IPCC curve, the first model data point over 0.9 is in 1995. In your graph, the first model point over 0.9 is 1986 or 87 and you’re already hitting 1.2 not long after 1995.
* After the 1998 peak, the IPCC curve also shows observed temperatures above the model mean from 2001 through 2005 (looks like). Your curve shows observed temperatures below the model mean after 1998.
Why the differences? You picked a different collection of models to compare against? Why?
MikeC: Food full of bullets isn’t very tasty! That would put you in the hospital.
Lucia-does your graph exclude models without volcanoes? Because the agreement of your model mean with the eruption of Pinatubo is better than the AR4 graph, which I take to mean that noise from volcano-free models is damping the volcano response.
Andrew_FL: It’s OK, he missed his food, so he won’t have to suffer hospital food.
Arthur & Andrew–
I noticed those differences and tried rereading the caption for the AR4 over and over again. I don’t know why it’s not a perfect match. The main feature you can see is that I show the models slightly above the data at the end of 2005, while the IPCC shows them below. The other issue is that my graph captures the max/min for volcanoes better.
So, first for the max/min issue:
1) I compute the 12 month average each month, so my graph is smoother. You can see the spikiness in the IPCC graph, which I think arises from connecting data at year points only. Computing the average every month catches the max and min after eruptions better while still keeping the curve smooth. But this is not the cause of the issues you, Arthur, are discussing. (It affects the answer to Andrew’s question.)
Now for the perplexing shift. For that, we look to the caption in the AR4:
The supplementary material says this:
Figure 9.5
So, now going to FAQ 9.2, Figure 1:
Returning to Appendix 9.C…
Step 4 seems to involve computing the temperature in cells with HadCrut measurements only. So, possibly leaving out the poles and the empty area in Africa. Step 6 deseasonalizes with 1960-1990 calendar months. (I used 1901-1950 calendar months.) Step 8 computes annual averages for grid cells with at least 6 months of data.
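The deseasonalizing step (Step 6) is the one most likely to trip people up, so here is a minimal sketch of the idea. This is my own illustration, not the AR4 code, and it assumes the series starts in January; the 1960-1990 window matches the caption:

```python
import numpy as np

def deseasonalize(monthly, years, base=(1960, 1990)):
    """Subtract each calendar month's mean over the base period, so
    January anomalies are relative to baseline Januaries, February
    to baseline Februaries, and so on.
    Assumes the series starts in January."""
    monthly = np.asarray(monthly, dtype=float)
    years = np.asarray(years)
    months = np.arange(len(monthly)) % 12      # 0=Jan ... 11=Dec
    out = np.empty_like(monthly)
    for m in range(12):
        in_base = (months == m) & (years >= base[0]) & (years <= base[1])
        clim = monthly[in_base].mean()         # climatology for month m
        out[months == m] = monthly[months == m] - clim
    return out
```

The choice of calendar-month window (1960-1990 vs my 1901-1950) changes every anomaly by a constant per month, which is one candidate for a small vertical shift between two otherwise similar graphs.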
I admit to not doing all that. I just compare HadCrut and GISSTemp to the model runs over the entire planet. So, we can get differences for a number of reasons, but I suspect the three main contenders are:
* I don’t have ready access to anything to test “excess drift”, so I didn’t throw out models with excess drift. This could affect the baseline periods in particular, and shift things a bit.
* I use 22 models, not the 14 mentioned in the caption for the IPCC figure. The IPCC projections use all models, but figure 9.5 does not. The smaller number used for the IPCC figure 9.5 may be due to its being created before all the modeling groups reported results, or ‘climate model drift’, or both. But dropping 1/3rd of the models could affect the averages as much as shown.
* I did not download all the PCMDI data, and do complicated masking to create my own monthly model runs. (Chad has been doing that at another blog.) I used monthly data from The Climate Explorer. So, this could make a difference if the models are predicting more warming in those regions.
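The masking Chad does is conceptually simple even if tedious in practice. Roughly, for a single month it looks like this (my own sketch with hypothetical array names; real gridded data would need more care with missing values):

```python
import numpy as np

def masked_global_mean(field, lats, obs_has_data):
    """Area-weighted mean of a model field, using only the grid
    cells where the observational dataset reports data.
    field, obs_has_data: (nlat, nlon) arrays; lats: (nlat,) degrees.
    Weights are proportional to cos(latitude), i.e. cell area."""
    w = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(field)
    w = np.where(obs_has_data, w, 0.0)         # zero out unobserved cells
    return (field * w).sum() / w.sum()
```

If models predict extra warming precisely where HadCrut has no coverage (the Arctic, say), the full-planet mean and the masked mean will diverge, which is the point of Marcus’s comment below.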
That figure (your Figure 1) has always meant to me that the models don’t do a good job in the 1900-1950 period. The red line simply misses the fact that there was a good deal of heating from ~1910 to ~1945.
Well, today I was looking for some historical data on methane and N2O and came across this site.
http://www.esr.org/outreach/climate_change/mans_impact/man1.html
Here, they show graphs from the 2001 IPCC report (scroll to the bottom). Am I reading these right? Do they show almost a perfect match for the entire 20th century? If so, why did the models appear to get “worse” from 2001 to 2007, where the 2007 “anthropogenic + natural” seems (at least to me) to be a much poorer match in the early 20th century than the 2001 graph?
JohnM–
Those graphs are TAR model data. I don’t have access to that model data, so I can’t confirm or deny whether it’s right.
Generally, some of the disagreement around the 40s-50s might be due to the “buckets issue”. Gavin discussed that here. Some of it could be “weather noise” (but to the extent it is, then it’s not the buckets. 🙂 )
The models seem to not have caught the run up in the 20s-30s. Why? Beats me.
I think you can get some simulations from TAR on climate explorer. Go to the bottom of the “monthly scenario runs” page.
“So, this could make a difference if the models are predicting more warming in those regions.”
In fact, the Arctic in particular is known to have more predicted warming than any other region on the planet.
Marcus-
Yes. Chad computed the monthly data for one model over grid points used by HadCrut here. He didn’t subtract to show the difference between the model projections based on the entire planet vs. the HadCrut grid. But that could be interesting. We could ask him.
Until a few months ago I was convinced of the truth behind global warming. I read what I could about it, and the more I read the less sense it made.
What did make a little sense was the Vostok ice core data which of course shows the 100,000 year glaciation 10,000 year interglaciation period. If history repeats we should be coming out of the interglaciation and heading into the glaciation period.
The work of Landscheidt, Fairbridge and Livingston-Penn shows we are headed into a deep solar minimum. In previous times this heralded a severe cooling trend in at least the northern hemisphere. Landscheidt and Fairbridge both died before their predictions could be tested, but those predictions seem to be coming true in a big way. If they are correct, the status of the USA and Canada as the breadbasket of the world is in jeopardy at a time when the USA grainstocks are the lowest since the 1940’s. This is very serious business.
I did a quick calculation and found the trend (2001-present) for HadCM3 with complete coverage and with coverage matching HadCRUT were virtually identical. Here are the results:
Full coverage: 0.0457 ± 0.0258 °C/year (5-95%)
HadCRUT coverage: 0.0427 ± 0.0160 °C/year (5-95%)
It does decrease the slope, but not by a huge amount. This is just for one model of course, so the net effect when averaging across all the models might be large.
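For those wondering where numbers like “0.0457 ± 0.0258 °C/year (5-95%)” come from, an OLS trend with a 5-95% interval on the slope looks roughly like this. This is illustrative only; Chad’s actual calculation may treat autocorrelation differently, and I use a large-sample normal critical value rather than Student’s t:

```python
import numpy as np

def trend_with_ci(t, y, zcrit=1.645):
    """OLS slope plus the half-width of a 5-95% interval, using a
    normal approximation (zcrit=1.645). For short or autocorrelated
    series you would want Student's t and an effective sample size."""
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(t)
    slope, intercept = np.polyfit(t, y, 1)
    resid = y - (slope * t + intercept)
    s2 = (resid ** 2).sum() / (n - 2)                 # residual variance
    se = np.sqrt(s2 / ((t - t.mean()) ** 2).sum())    # std error of slope
    return slope, zcrit * se
```

Run it on the full-coverage and masked-coverage series from the same model and you can compare slopes and intervals directly, which is essentially the comparison above.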
Please remind us it is not proper to mine for a model that fits the current decade 🙂
nanodrv7,
I’m sorry to say that you may be a bit confused on ice age mechanisms. Ice ages are primarily caused by changes in the Earth’s orbit and axial tilt known as Milankovitch Cycles. These are easy to predict given our current knowledge of orbital mechanics, and it is highly unlikely that the current interglacial would naturally end in the next 23,000, if not 50,000, years.
.
See:
http://www.sciencemag.org/cgi/content/summary/297/5585/1287
and
http://www.sciencemag.org/cgi/content/abstract/207/4434/943
on this.
.
There is also the interesting fact that the magnitude of current anthropogenic perturbations exceeds that of Milankovitch forcings, which suggests that the feedbacks necessary to trigger an ice age would be unlikely given current atmospheric GHG concentrations. That said, by the time Milankovitch Cycles lead back towards ice age conditions, there is no way to know what GHG concentrations will be because the timescales involved are geological and span millennia.
Zeke: “There is also the interesting fact that the magnitude of current anthropogenic perturbations exceeds that of Milankovitch forcings”.
Zeke, you should add “according to an as yet unproven theory, the evidence for which is missing in observational data.”
Peter,
.
The spectral properties of various gases in the atmosphere are pretty well proven through basic laboratory experiments. Their direct additional forcing (independent of any feedbacks) from additional greenhouse gas concentrations will be comparable to the variation in the annual magnitude of Milankovitch forcings assuming emissions continue over the next century. Since most of the same feedbacks apply to any external climate forcing (be it GHGs or Milankovitch forcings), it’s pretty hard to argue that the uncertainties in climate sensitivity due to feedbacks have any bearing on my statement vis-a-vis the magnitude of GHG and Milankovitch forcings.
.
Now, if you want to argue that higher concentrations of GHGs have no direct radiative forcing, well, I’d suggest grabbing a textbook on radiative physics.
“pretty well proven”
Zeke,
How well, sir?
Can you quantify “pretty” for me? Is that better than ‘kinda’ proven?
Thanks so much. 😉
Andrew
Lucia, I think that I can explain the difference between 22 and 14. Some of the 20CEN models include volcanic forcing (e.g. GISS EH, GISS ER) while others do not (GISS AOM). If you parse the explanation in AR4, Fig 9.5 is from the volcanic models, which have notches for the major volcanoes. You can remove some of the smearing by restricting the population. I’ve added this information into my Info file on 20CEN models, which I’ll post up.
I’d done an emulation of Fig 9.5 on this basis and got pretty close – it must have been in the air as I did this independently of reading your post, but haven’t posted on it yet. 🙂
Andrew,
While every now and then an Einstein comes along to overturn (or at least strongly nuance) Newton’s laws, there is strong evidence from both laboratory experiments and satellite measurements that greenhouse gases absorb certain bands of longwave radiation, and more greenhouse gases would absorb more longwave radiation. Of all the controversial issues in climate science, the basic radiative physics of greenhouse gases is perhaps the least uncertain. See http://www.wag.caltech.edu/home/jang/genchem/infrared.htm and http://tinyurl.com/r86bn5 for a basic textbook treatment. The HITRAN database is also useful: http://cfa-www.harvard.edu/hitran/Download/HITRAN04paper.pdf
Let’s see if I can comment …. (I’ve been having trouble. GRRR)
Steve– I’m not sure it’s entirely the volcano issue. The graph does look like the 14 models must overlap the volcano models. BUT, according to the AR4, CGCM3.1 includes volcanoes, and adding that would make Figure 9.5 not look like the one in the AR4. So…. I don’t know.
Mind you, CGCM3.1 model runs don’t look like they include volcanoes; but the AR4 says they do.
There is also the issue that HadCrut’s 1998 does not end up at 0.9C using the 1901-1950 baseline.
Zeke-“the magnitude of current anthropogenic perturbations exceeds that of Milankovitch forcings”
Well, yeah, almost certainly-net radiation changes from orbital forcings is miniscule. Makes you wonder how the heck they actually manage to cause the glaciation cycles! Of course, I could speculate but I prefer to let everyone figure out for themselves what the answer might be.
That being said, that BIG feedbacks are necessary is not necessarily the case:
http://www.worldclimatereport.com/index.php/2008/02/21/global-warming-not-so-fast/
Andrew_FL,
I’ll agree that the real debate lies on the climate sensitivity side, given the manifest uncertainties associated with various feedbacks and forcings (e.g. aerosols). I should also correct my earlier statement a bit to say that “the magnitude of anthropogenic perturbations by the end of the century will likely exceed that of annual average Milankovitch forcings”, since seasonal effects of Milankovitch forcings can be quite strong even if the net annual radiation budget only changes slightly.
Somehow …. spell check allowed the soldier to shoot himself in the food instead of the foot… but since he missed all 40 times… i guess it dont be a matterin’
The spatial ~distribution~ changes also. Which has important, but not well understood, effects…
Zeke [13502]
You write: “the radiative physics of greenhouse gases is perhaps the least uncertain”. I don’t think so. It is in fact one of the most controversial elements in the AGW/ACC argument because one of the inconvenient shortcomings in the GHG story line is that the physics hold that the process operates on a log scale.
The way I’ve explained this to some “ordinary folks”, is that when you paint your room a new colour, coats 1, 2 and 3 will make all the difference: by the time you apply coats 4 and 5 you’re wasting your money in terms of a meaningful, observable difference. Same goes for GHGs.
Zeke,
What interests me, in this case, is your use of language. I’ve been down the Road of Expanding References already, thank you. To the point, you didn’t answer my question. Can you post some numbers that support your conclusion of ‘pretty much’ proof? A nutshell-length summary response with the numbers that count will suffice. In your latest reply, you went to the Well of Subjective Terms again with ‘strong evidence’ and ‘least uncertain’, BTW. Can you do better than that?
Andrew
tetris:
How is the fact that radiative forcing from GHGs tends to follow a log scale problematic to the explanation? The standard equations involved are:
CO2: ΔF = 5.35ln(C/C0)
CH4: ΔF = 0.036(√M – √M0) – (f(M,N0) – f(M0,N0))
N2O: ΔF = 0.12(√N – √N0) – (f(M0,N) – f(M0,N0))
Where:
f(M,N) = 0.47 ln[1 + 2.01×10^−5 (MN)^0.75 + 5.31×10^−15 M(MN)^1.52]
Or, to put it another way, for every doubling of CO2 we get a ~3.7 W m^−2 increase in forcing. This simple relationship doesn’t necessarily imply that saturation occurs early. Just play around with MODTRAN and you can see this easily: http://www.flickr.com/photos/24797698@N03/3330404893/sizes/o/
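Plugging numbers into the CO2 expression makes the point concrete. A quick sketch (the 278 ppm preindustrial reference is my assumed value for illustration):

```python
import math

def co2_forcing(c, c0=278.0):
    """Simplified CO2 radiative forcing in W/m^2, using the standard
    expression dF = 5.35 * ln(C/C0) quoted above."""
    return 5.35 * math.log(c / c0)

# Logarithmic does not mean saturated: each doubling adds the same
# ~3.7 W/m^2 no matter where you start, e.g.
#   co2_forcing(556) and co2_forcing(1112) - co2_forcing(556)
# both come out to 5.35*ln(2), about 3.71 W/m^2.
```

So the “coats of paint” picture below misleads a bit: the increments per doubling don’t shrink toward zero; they stay constant on a log scale.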
Er tetris-What? GHE is uncertain because it is logarithmic? Run that by me again…
Andrew_KY,
The problem is that radiative transfer physics, while well established, are not exactly simple. The Schwarzschild equation forms the basis of our understanding of radiative transfer. Interestingly enough, two outspoken climate skeptics (David Bellamy and Jack Barrett) provide one of the more lucid (though still mathy) explanations of the application to modeling GHG forcings: http://www.barrettbellamyclimate.com/page45.htm
.
Similarly, Realclimate has one of the better basic explanations. See steps one through four of http://www.realclimate.org/index.php/archives/2007/08/the-co2-problem-in-6-easy-steps/
.
http://www.agu.org/pubs/crossref/1995/95JD01386.shtml might also be useful as a primary source based on satellite measurements.
Zeke (13502), you managed to get both Newton and Einstein in the one comment. That’s outstanding.
For me it is not a disagreement with the simple lab physics, these physics make sense all else being equal. If the world was a closed box in a lab then sure the models would be spot on. But the models (and hence the theory) are looking shaky suggesting there are unknowns (both known and maybe unknown) — other forcings and feedbacks.
Andrew Kennett
Zeke,
I appreciate your patience. Do I need to be a physicist to truly understand why I’m supposed to believe in AGW? Should a person who would never touch science with a zillion-foot pole go ahead and believe in something without understanding it? What do you think?
Andrew
Andrew Kennett,
Indeed, the devil is in the feedbacks (well, and aerosols and ocean heat transfer rates). Lucia, Spencer, Lindzen, Gavin, Santer, and Hansen can all (mostly) agree that, all things being equal, more GHGs would lead to higher temperatures. The real question is, given all the uncertainties in how the initial GHG forcing interacts with the climate system, will we end up with 1 degree warming or 4 degrees per doubling of CO2? There is a whole interesting literature out there on estimating climate sensitivity. While there are a number of proposed probability density functions for temperature responses (e.g. http://i81.photobucket.com/albums/j237/hausfath/Picture53.png), there is quite a wide range of estimates.
.
Andrew_KY,
Sometimes you have to touch things to understand them. For a system as complicated as the Earth, you can provide simple heuristics and metaphors (the atmosphere as a greenhouse, for example), but none of these will ever be complete and immune to simple criticisms that the metaphor is problematic.
Well I for one think that since just about everyone agrees on the basic physics issues, we can be pretty confident in it. Lubos Motl and Roy Spencer both agree on it-this is one thing that AGW advocates have more or less correct:
http://www.drroyspencer.com/2009/04/in-defense-of-the-greenhouse-effect/
http://motls.blogspot.com/2009/04/myth-believed-by-some-agw-skeptics.html
Zeke: “There is also the interesting fact that the magnitude of current anthropogenic perturbations exceeds that of Milankovitch forcings, which suggests that the feedbacks necessary to trigger an ice age would be unlikely given current atmospheric GHG concentrations.”
Sound definitive to me.
Zeke: ” The spectral properties of various gases in the atmosphere are pretty well proven through basic laboratory experiments. Their direct additional forcing (independent of any feedbacks) from additional greenhouse gas concentrations will be comparable to the variation in the annual magnitude of Milankovitch forcings assuming emissions continue over the next century.”
This post has enough wiggle words to make a politician blush…….which is it Zeke?
Zeke, my understanding of radiative transfer comes from my sister, who is a published peer-reviewed physical chemist; trust me when I say she knows whereof she speaks. As an aside, I now know why warmists love to appeal to authority: it’s so much easier than advancing a well reasoned argument. Enough of that.
Zeke, here’s the thing. I have no issue with the radiative transfer properties of CO2, but it is obvious that the climate system is simply not willing to behave according to the theory, and over any relevant long timescales never has. The rate of increase of GHG concentration results in a linear forcing. Over the last few years, the atmosphere hasn’t warmed. This only leaves the oceans to store the increased energy of the climate system, and they’re not warming, according to the Argo data. Every time I hear somebody say there is warming in the pipeline I want to beat them over the head with a calorimeter.
Andrew_FL [13519] and Zeke [13518] and Peter [13544]
Andrew, thx for the links. That said, you should take the time to read what Spencer actually writes in the paper you include. Not only that, you should really see his essay dated May 11th on his website. And Zeke, you should too, before you start invoking Spencer, Lindzen, etc. as somehow supporting your position. Spencer certainly doesn’t. And Spencer and Lindzen are not on the same page as Hansen.
I couldn’t agree more with Peter’s comments. As we all know, the scientific method teaches that when data does not support the hypothesis, you modify the hypothesis or scrap it altogether. Not the data, as some -GISS and certain members of the Team come to mind- have demonstrably done.
For close to a decade now, ALL meaningful temperature data show no warming, and as Peter points out there is no evidence either of calorie transfer from one medium to another [i.e. the warming is not “hiding” in the oceans]. ALL GCMs are off, as a rule by a significant margin. Since they can be tweaked to suit the desired result, they’re the ultimate form of confirmation bias, which leads to the “there’s more warming in the pipeline” trailer: “don’t miss next week’s episode, same time, same channel”.
Peter and Tetris:
.
The radiative properties of CO2 and other GHGs have relatively little bearing on recent temperatures since the magnitude of the climate response is dominated by climate feedbacks (where considerable uncertainty exists) and (in the short-term) natural variability. I may be putting words in the mouths of Spencer and Lindzen, but I have yet to see anything by them that suggests that the radiative forcing of doubling CO2 (in isolation from any other factors) would not increase radiative forcing by the ~3.7 watts per square meter suggested by the radiative properties of the gas.
.
In other words, you can be as skeptical as you want about climate sensitivity and the magnitude of feedbacks, aerosol forcings, ocean heat transfer rates, and natural forcings in GCMs while still accepting the physics on the radiative properties of GHGs. It’s always useful for debates like these to identify areas where we mostly agree, and areas where serious uncertainties remain.
Zeke,
Knowledge of the radiative properties of gases is of little practical use until you know the exact vertical composition, distribution and temperature profile. You also need to know the surface temperature before you can start to talk of upward or downward irradiance at various altitudes. All these factors vary in space and time.
Radiation calculations do not determine temperatures. It is quite the other way round as it is the temperatures that determine radiation. This leads to the conclusion that the calculated shortfall in upward irradiance depends entirely on estimates for temperature, mass and composition profiles at different times and places.
It is presumably because these things can vary that the current situation of zero shortfall has occurred.
If I were a policymaker there is another question I would like the climate experts to answer: what level of CO2 in ppm is needed to preserve the current temperatures? Clearly it must be less than the existing value, as we are assured there is more temperature rise in the pipeline, and must be greater than the preindustrial level, which would make us too cold.
I would then ask what amount of human produced CO2 in Gt per year would be compatible with maintaining that value – after I operate my magic sequestrator. The range of answers to those questions would be very interesting in determining just how much the science is settled.
Zeke, that is fair enough. The radiative properties are not often in dispute in places I read. As far as feedbacks, considerable uncertainty actually translates to not a freaking clue. Then we get to my favourite, natural variability. I think this translates to chaotic. Tom Vonk says that if a system is chaotic, then that is all she wrote; trying to find parts of it that are deterministic is just not on. When he speaks, I give him the E. F. Hutton treatment.
Peter [13569]
I have to assume that by chaotic you mean non-linear. The earth’s climate systems are “chaotic” in that sense only, because, based on evidence and best understanding, they are inherently self-regulating and stable. Which brings us back to natural variability.
As far as the role of feed backs are concerned, Hansen’s recent comment that the IPCC’s aerosol data “was pretty much pulled out of a hat”, says it all.
tetris (Comment#13572) May 13th, 2009 at 12:29 pm ,
Stable? I guess that depends on your definition of stable. Natural variability? How is that different from chaotic behavior? A bounded system, which is a little closer to the actual behavior of weather on whatever time scale, can still be chaotic. Planetary orbits are chaotic on time scales exceeding 5 million years. That doesn’t mean Velikovsky was correct about the origin of Venus. What it does mean is that it is not possible to predict the position of the Earth relative to the sun and the other planets with accuracy 10,000,000 years from now even if you had a computer with an infinite bit CPU to avoid round off errors, used infinitesimally small time steps in the numerical calculations and could specify the current positions and velocities with precisely zero error.
Tetris, what DeWitt said. I meant chaotic as in chaotic. Chaotic as in can’t be modelled, can’t be described by equations and then solved, numerically or otherwise, chaotic. Not non-linear.
DeWitt Payne [13574]
A bounded system, like weather/climate on any given time scale, can certainly be chaotic. Based on the geological record, the natural variability of the global climate has temperature variations and boundaries that in the absolute do not lie very far apart, which would indicate that the system as a whole does not tend to chaotic extremes in terms of that variable. Interestingly enough, during these variations in temperature, another variable, CO2, fluctuated widely to much higher levels than witnessed today. Unfortunately, on a geological scale the global climate system varies around a temperature mean that is not conducive to the well being of plants, mammals and other animal life, for as David Deming recently pointed out in The American Thinker, 90% of the past 1 million years have been an ice age.
Peter [13577]
Based on that definition, where does that leave Poincaré, Lorenz, et al. and chaos theory?
Maybe that’s what you mean, but not what I mean. You can write the differential equations for the solar system but there is no analytic solution. In fact, there is no analytic solution for a gravitationally bound system with more than two bodies. That doesn’t mean you can’t calculate orbits to high precision in the short term using numerical methods. I’m paraphrasing Tom Vonk from memory here so I may get it wrong, but IIRC you can’t mathematically prove a system is chaotic unless you can write the differential equations and determine the Lyapunov coefficients. If any are positive, the system is chaotic on some time scale. It is also completely deterministic. There is nothing random about it. However, even an analytic solution can only be projected so far into the future because the initial values are imprecise. That, strictly speaking, is not chaos.
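For the curious, the Lyapunov test I’m describing can be demonstrated on the logistic map, the classic chaotic toy system. A sketch of my own (not anything of Tom Vonk’s):

```python
import math

def lyapunov_logistic(r, x0=0.3, n=5000, burn=500):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) by averaging
    log|f'(x)| = log|r*(1-2x)| along the orbit.
    Positive exponent => sensitive dependence, i.e. chaos."""
    x = x0
    for _ in range(burn):              # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n

# r = 4.0 is fully chaotic (exponent near ln 2), while r = 2.5
# settles onto a fixed point and gives a negative exponent.
```

Note the system is completely deterministic throughout; the positive exponent is what makes long-range prediction hopeless, exactly as with the planetary orbits example.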
DeWitt Payne [13586]
Goes to my question to Peter [ref: 13579] about [more in particular] Lorenz’ work in applying chaos theory to weather/atmospherics.
A bit late to ask, I’m afraid, but would it be possible to extend your figure back to 1900 or even earlier? The IPCC figure 9.5a and the plots of modelled and observed temperature at http://www.climatedata.info all suggest the models did not do well in the 20th century until CO2 is assumed to kick in around 1975.
Julius,
No, the models did not do well in the first part of the 20th century.
lucia (Comment#13718)
If the models did not do well when CO2 had less of an effect, then wouldn’t that lead to the conclusion that the present agreement is just as likely to be coincidence as not?