GISSTemp: Declined Since June 2009

GISSTemp dropped from 0.64C in June to 0.60C in July 2009:

Figure 1: GISS Temp, trends since 1990 and 2001. (Filled symbols represent data after 2001; open symbols are prior to 2001.)

The servers at Dreamhost were so surprised by the slight drop in GISSTemp that they went down too. Here’s the skinny:

  1. This is the second-highest July anomaly on record; July 1998 hit 0.70C. (For those who want to compare to 1997, July 1997 was 0.25C. All July values are circled.)
  2. Despite the recent increase in temperature thought to be associated with the onset of El Nino, the trend since 2001 remains negative. A few more months of anomalies over 0.50C should finally kick this trend into positive territory. (I suspect that will happen before Christmas, and it could happen next month.)
  3. I decided to show trends since 1990, since that graph shows the 1998 El Nino. The trend since Jan 1990 is 0.18C/decade; the multi-model mean trend for the models used in the AR4 is 0.29C/decade. So, for that period, the projected least squares trend based on the multi-model mean was higher than the “about 2C/century”; this results from the Pinatubo dip in the early portion of the 90s.
  4. The 95% confidence intervals on the least squares trend since 1990, corrected for AR(1) correlation, are shown. You get to notice whether the multi-model mean is inside or outside those uncertainty intervals. (Note: Caveats apply; it’s complicated.) A sketch of this sort of calculation appears below this list.
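
For anyone who wants to reproduce this sort of calculation, here is a minimal R sketch (not the script used for the figure): the anomaly series in it is a random placeholder rather than the actual GISSTemp data, and the AR(1) correction uses the common approximation that inflates the standard error by sqrt((1+r1)/(1-r1)):

    # Least squares trend with an AR(1)-corrected confidence interval.
    # NOTE: `anom` is a made-up placeholder series, not real GISSTemp data.
    set.seed(1)
    t <- seq(1990, 2009 + 6/12, by = 1/12)     # decimal years, Jan 1990 - Jul 2009
    anom <- 0.4 + rnorm(length(t), sd = 0.1)   # placeholder anomalies (C)

    fit <- lm(anom ~ t)                        # ordinary least squares fit
    b   <- coef(fit)[["t"]]                    # slope, C/year
    se  <- summary(fit)$coefficients["t", "Std. Error"]

    r1     <- acf(resid(fit), lag.max = 1, plot = FALSE)$acf[2]  # lag-1 autocorrelation
    se.adj <- se * sqrt((1 + r1) / (1 - r1))   # widen SE for AR(1) residuals

    ci <- b + c(-1, 1) * qt(0.975, df.residual(fit)) * se.adj
    cat("trend:", round(10 * b, 3), "C/decade; 95% CI:",
        round(10 * ci, 3), "C/decade\n")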

17 thoughts on “GISSTemp: Declined Since June 2009”

  1. Is it my display or are there a large number of shades of red (pink?) on the graphic? If it’s not my display, I would find it easier on the eyes with a bit more color contrast for the various trend lines.

    bob

  2. Schnoerkelman,
    The model means are red (volcano) and pink (all).
    The GISSTemp data are blue.

    When I add Hadley and NOAA, they are other colors.
    So, this version is mostly pink/red.

    Would it be easier if I made the background lighter? (Also, clicking to make larger might help a little. The full image is twice as wide and tall.)

  3. Lucia,
    Is it possible that the effects of this El Nino went barn burner at the outset, only to moderate quite quickly? You have noted that lower troposphere temps have come down. Is it possible that the effects of this El Nino have “plateaued” [I’ll use that since the “peaked” I used yesterday caused confusion]? If so, why would you expect anomalies to continue to exceed 0.50C? Wouldn’t that only be the case if this El Nino were on par with the 1998 one?

  4. Lucia,

    I think you mean 0.18C/decade and not per year. Otherwise it would be awfully hot around here at the end of the century (an 18C increase…).

  5. GISS would need an average of 0.74 for the next 5 months to match their 2005 annual record of 0.62; a quick check of that arithmetic is sketched at the end of this comment. That is very unlikely, even for GISS. The highest monthly anomaly they ever hit was 0.71, in October 2005.

    I remember that year well. Here in Greater Toronto, it was a beautiful autumn, and the Markham Fair http://markhamfair.ca/ around the first weekend of October was staged in daytime temperatures in the low to mid-20’s. I worked up a good tan the day I went there, and walked around all day in shirt sleeves… Ah yesh sonny boy, I remember the good ole days of climate warming. It’s a gol durn shame that you won’t be able to experience walking around in shirt sleeves at the fair.
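
    Checking the arithmetic on that 0.74 figure (a sketch: if the January-July 2009 anomalies average about 0.53C, the value Nylo quotes in comment 7, then a 0.74 average for August-December gives roughly the 0.62 annual record):

    jan_jul <- 0.5285                  # assumed Jan-Jul 2009 average anomaly (C)
    aug_dec <- 0.74                    # proposed Aug-Dec average (C)
    (7 * jan_jul + 5 * aug_dec) / 12   # about 0.617, i.e. roughly the 0.62 record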

  6. Lucia, yes I was on the larger image. I may be the only one, but it was difficult for me to follow the lines: on my screen the colors are too similar to be easily distinguished. I can detect the difference on the legend if I look carefully but I lose it on the chart. I am not color blind but I can imagine those with even slightly limited color perception would find them all the same color. Perhaps a “piggy pink” for one set would be helpful?

    No real criticism here, just a friendly suggestion.
    Speaking of which, did you notice the suggestion from Steve M. on the BS09 thread that you emulate his turnkey R scripts so that the data is auto-magically fetched?
    “I wish Lucia would do this; it would be very consistent with her approach. On many occasions, she writes good posts where I’d like to handle the data right then, but don’t want to spend an hour locating the data, remembering how to extract it and doing the analysis.”
    I think this is a good idea as well, not that it matters what I think 🙂

    Keep up the good work!
    bob

  7. Hi Lucia,

    With this new GISTEMP data release, lots of changes have occurred. There is a generalised warming of the past (until about 1940) and a generalised cooling of the last decade. In total, 91 past monthly records have warmed (most from long ago), and 39 past monthly records have cooled (most within the last decade).

    2007 is still the second hottest year on record, but after cooling 0.01C it is now tied with 1998 in that position. 2002 and 2003 have also officially cooled 0.01C.

    The overall 1880-2008 warming trend has cooled from 0.56234C/100y to 0.56072C/100y. The 2001-2008 trend has also cooled from -0.1141 to -0.1409. If 2009 stays at its currently high 0.5285C average temperature anomaly, the 2001-2009 trend will still be negative by the end of the year.

  8. bob–
    I keep planning to learn R, but I haven’t. So, I fetch manually.

    Nylo– I didn’t look at the past. Now I’ll have to!

  9. Tufte recommends using blue/yellow for contrasts. Red/green is invisible to color-blind people, nearly all of whom can see blue/yellow.

  10. Lucia, one of the most useful attributes of R is that it can instantly read a data table from a URL.

    If you do something like

    loc = "http://www.site.org/lucia/filename.dat"   # the location of the file

    and then do

    X = read.table(loc)
    # or, for fixed-width files:
    X = read.fwf(loc, widths = c(5, 6, 7, 4, 3), fill = TRUE)

    it will come back as a data.frame, and you have the data in a nice handy place for analysis. In a given situation you may need to tailor it a bit, maybe skip some lines, but it’s WAY easier than manual downloads.
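
    For instance (a sketch; this URL is a placeholder, not a real dataset):

    loc = "http://www.example.org/data/monthly_anomalies.txt"   # placeholder URL
    X = read.table(loc, skip = 1, fill = TRUE)   # skip a title line if there is one
    head(X)   # X is a data.frame, ready for lm(), plot(), etc.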

  11. Steve–
    Thanks.

    Anyway, I know R is great. Still, you need to realize that knowing the command to download gets me about 1% of the way to where you would like me to be. In any case, creating the turnkey scripts you wish I would write will certainly take time, and I admit my interests have been in answering specific questions that interest me. For the most part, I can either do that using tools I am familiar with or learn R. So… I do what interests me.

    Manual downloads take 30 seconds. Clicking the R application and typing in that command takes me about the same time. I could write a php file to download the data and store it where I want. That’s just not where the time sink is.

  12. Lucia, I think Steve’s interest in having turnkey R code from you is so that he (or anyone else) can “play” with what you’ve done without the hassle of manually finding and then downloading the data. With the turnkey version it “just works”.

    You do lots of interesting things that others here would like to play around with, just to see what happens.

    Note that this isn’t intended as criticism or badgering, just an attempt to explain. Please forgive me if that was already clear to you; I shall now return to lurking where I belong.

    bob

  13. Schnoerkelman–
    I understand Steve’s interest. I even think that’s a good idea. But… I don’t know R at all.

  14. R is certainly a good tool that I have never used, but will try some day. So far I am using Excel to do the same thing. It works very well for this task: I only need to open the file in my web browser, select the lines with data, press CTRL+C, select the appropriate cell in my Excel sheet, and paste as “text only”. I get immediate calculation of trend changes, and I can visually identify the changed records and see the magnitude of the changes compared to the previous version. But that is, of course, after properly setting up the Excel sheet with some rather complex formulas. The good thing is that I don’t need to modify any formula from one month to the next, so the complex part only had to be done once…
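
    For comparison, the same version-to-version check can be sketched in a few lines of R (the file names and the column name `anom` here are hypothetical):

    prev = read.table("gisstemp_prev.txt", header = TRUE)   # last month's saved copy
    curr = read.table("gisstemp_curr.txt", header = TRUE)   # this month's release
    changed = which(curr$anom != prev$anom)                 # revised monthly records
    data.frame(row = changed, delta = curr$anom[changed] - prev$anom[changed])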
