Issues and Resolutions:
My recent falsification of the IPCC AR4 projection of 2C/century was criticized as flawed because:
- It did not properly account for ENSO.
- Cochrane-Orcutt and its uncertainty bands apply only when the noise is AR(1).
I have addressed both issues (details below), and find the falsification still holds. It is important to note that this falsification does not overturn the more general result of warming. It only addresses these two questions: Did the recent IPCC AR4 predictions/projections correctly estimate the magnitude of warming? Did the IPCC correctly communicate the uncertainty in their estimate of the central tendency based on their hierarchy of models?
I think the answer to both questions is no. However, the goal of my current post is to relate the results of incorporating ENSO into the Cochrane-Orcutt fit, and to discuss the tests on the residuals. In the analysis using the average of 5 data sets and accounting for ENSO, I find the residuals appear to be free of AR(2) noise.
This means: The IPCC AR4 near term prediction of 2C/century remains falsified.
The discussion of the results of accounting for ENSO and the analysis of the serial autocorrelations is below; you can consider these a sort of “appendices”. 🙂
Accounting for the effect of ENSO
I followed Roger W Cohen’s suggestion and added the MEI index to the regression for Global Mean Surface Temperature as a function of time; this data set is available from NOAA MEI data. As in previous analyses, my baseline analysis uses the average temperature anomaly from 5 measuring agencies: GISS Land Ocean, UAH MSU, NOAA, RSS, and HadCrut.
My results were similar to those reported by Roger Cohen, who applied Ordinary Least Squares (OLS). Roger noted the maximum correlation between the MEI index and the temperature occurs when temperature lags MEI by two months. I confirmed this, and used the 2 month lag in my analysis. Because the residuals from OLS showed serial autocorrelation, I corrected for this using Cochrane-Orcutt.
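For anyone who wants to replicate the setup, here is a minimal sketch of how one might build that regression. The data frame, the column names (“anomaly”, “mei”) and the use of Python/pandas are purely illustrative; they are not the actual spreadsheet I used.

```python
import numpy as np
import pandas as pd

# "df" is assumed to be a monthly DataFrame with columns "anomaly" (the average
# anomaly from the five agencies) and "mei" (the NOAA MEI index). These names
# are illustrative only.
def build_design(df, mei_lag=2):
    """Design matrix for: anomaly ~ intercept + time + MEI lagged `mei_lag` months."""
    d = df.copy()
    d["mei_lagged"] = d["mei"].shift(mei_lag)   # temperature lags MEI by two months
    d = d.dropna()
    t = np.arange(len(d), dtype=float)          # time in months since the start
    X = np.column_stack([np.ones(len(d)), t, d["mei_lagged"].to_numpy()])
    y = d["anomaly"].to_numpy()
    return X, y
```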
Results:
Accounting for the effect of ENSO by introducing the MEI index into the regression for temperature, the best estimate for the trend is m = -0.5 C/century; the 95% confidence intervals are -2.0 C/century < m < 1.1 C/century. The IPCC AR4 projection of +2.0 C/century is rejected. In contrast, not accounting for the MEI index had resulted in m = -1.0 C/century with 95% confidence intervals of -3.0 C/century < m < 1.0 C/century. So, using the current data, the negative temperature trend with time remains, though the magnitude is smaller.
The result is illustrated graphically below:

Click for larger
Note the IPCC near term prediction of 2C/century, illustrated in gold, falls well outside the 95% confidence intervals for trends that might be consistent with the weather data. It is true that the time period is short. For that reason, the confidence intervals on the true trend are very wide. Nevertheless, the predicted trend falls well outside those bands.
Recall, the IPCC projection also falls outside the intervals for the trend when ENSO is not considered explicitly in the regression. So, even though the specific magnitude of the trend varies depending on the precise analytical method, this falsification is robust to including or not including ENSO in the hypothesis test.
Testing the residuals for serial correlation
In comments, Tamino recently said that Cochrane-Orcutt does give correct uncertainty intervals when the noise is AR(1). However, he has been under the impression the noise is not AR(1). As we shall see, with this most recent analysis and corrected data from HadCrut and GISS, the residuals appear free of “other than AR(1) noise,” and entirely consistent with AR(1) noise only.
As some readers are aware, Ordinary Least Squares (OLS) is suitable when the residuals of the fit are uncorrelated. In contrast, Cochrane-Orcutt is a suitable technique when the residuals of the fit are autocorrelated, and the form of the autocorrelation is “red”. This is also called “AR(1)” noise. (When the residuals are uncorrelated, Cochrane-Orcutt becomes OLS!)
Because the residuals after fitting using OLS exhibited serial autocorrelation, I used CO to account for this issue. When doing a CO fit, it is useful to check the residuals after fitting to determine whether they are “AR(1)”.
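For the curious, here is a bare-bones sketch of the Cochrane-Orcutt iteration itself (numpy only; any real statistical package will also return the standard errors needed for the confidence intervals):

```python
import numpy as np

def cochrane_orcutt(X, y, tol=1e-8, max_iter=50):
    """Alternate between estimating the lag-1 autocorrelation of the residuals
    and refitting OLS on the quasi-differenced data y_t - rho*y_{t-1}."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rho = 0.0
    for _ in range(max_iter):
        resid = y - X @ beta                    # residuals of the untransformed model
        rho_new = (resid[1:] @ resid[:-1]) / (resid[:-1] @ resid[:-1])
        X_star = X[1:] - rho_new * X[:-1]       # quasi-difference regressors and response
        y_star = y[1:] - rho_new * y[:-1]
        beta, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
        converged = abs(rho_new - rho) < tol
        rho = rho_new
        if converged:
            break
    # With rho near zero the quasi-differencing does essentially nothing,
    # so the fit reduces to OLS (apart from dropping the first observation).
    return beta, rho
```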
The hypothesis of AR(1) noise is supported by three tests: eyeballing the lag plots, plotting the correlogram, and performing the Ljung-Box test using the methodology described at NIST.
Lag Plots
Lag plots are a visual method for testing for autocorrelation, or “interesting features”, in the residuals of data. To create the plot from a time series, you plot the temperature at time (ti) on the horizontal axis and the temperature at time (ti+k) on the vertical axis. I plotted these for lags of “k=1” in red and “k=2” in yellow. (So, when k=1, the temperature for Jan 2001 is paired with that for Feb 2001, and so on. For k=2, Jan 2001 is paired with March 2001, and so on.)
To the left, you will see the red and yellow dots look like someone took a shotgun aimed at the center and blasted away (click image for larger version).
Believe it or not, the horrible mess means the data are uncorrelated and the Cochrane-Orcutt method “passed”. 🙂 You can read more about lag plots and see the sorts of plots that would “fail” at NIST’s pages on lag plots (1) and (2).
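If you want to make your own, a short matplotlib sketch (where the residuals array is whatever comes out of your CO fit) looks like this:

```python
import matplotlib.pyplot as plt

def lag_plot(resid, lags=(1, 2), colors=("red", "gold")):
    """Scatter the residual at month t against the residual at month t+k."""
    for k, c in zip(lags, colors):
        plt.scatter(resid[:-k], resid[k:], s=12, color=c, label=f"lag k={k}")
    plt.xlabel("residual at t")
    plt.ylabel("residual at t+k")
    plt.legend()
    plt.show()
```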
Correlation Plot
There is a second visual way to identify autocorrelation in lagged residuals. This involves calculating the autocorrelation as a function of lag, plotting it, and comparing the magnitude to the 95% confidence intervals for autocorrelations; the method is explained at NIST. Autocorrelations may pierce the 95% confidence intervals at a rate of about 1 in every 20 lags (as that is the meaning of the 95% confidence intervals.)
The correlations as a function of lag are illustrated to the left (click image for larger version.) For the number of data in this sample, the 95% confidence interval is ±0.21. Of course, the correlation at zero lag is 1 (as required by the definition.) However, at all other lags, the correlation is much lower, and as you can see, 2 out of 50 autocorrelations did pierce the 95% confidence intervals, which is consistent with the expected rate of about 1 in 20.
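The plot is simple to generate. Here is a sketch of the calculation (numpy/matplotlib, with the white-noise band computed as 1.96/√N as in the NIST write-up):

```python
import numpy as np
import matplotlib.pyplot as plt

def correlogram(resid, max_lag=50):
    """Sample autocorrelations by lag, with the approximate 95% band for white noise."""
    n = len(resid)
    r = resid - resid.mean()
    acf = np.array([1.0] + [(r[k:] @ r[:-k]) / (r @ r) for k in range(1, max_lag + 1)])
    band = 1.96 / np.sqrt(n)   # about 0.21 for the ~86 months from Jan 2001 to Feb 2008
    plt.bar(np.arange(max_lag + 1), acf, width=0.4)
    plt.axhline(band, linestyle="--")
    plt.axhline(-band, linestyle="--")
    plt.xlabel("lag (months)")
    plt.ylabel("autocorrelation")
    plt.show()
```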
Since the data passed the first two tests for serial autocorrelation, I further quantified the serial autocorrelation by applying the Ljung-Box test at specific lags up to 50 months, and found the residuals uncorrelated.
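For completeness, here is a bare-bones version of that test (scipy assumed). Strictly speaking, when the test is applied to regression residuals the degrees of freedom should be reduced to account for the fitted parameters; I ignore that refinement in this sketch.

```python
import numpy as np
from scipy.stats import chi2

def ljung_box(resid, max_lag=50):
    """Ljung-Box Q statistic and p-value over lags 1..max_lag (NIST formula)."""
    n = len(resid)
    r = resid - resid.mean()
    acf = np.array([(r[k:] @ r[:-k]) / (r @ r) for k in range(1, max_lag + 1)])
    q = n * (n + 2) * np.sum(acf ** 2 / (n - np.arange(1, max_lag + 1)))
    p = chi2.sf(q, df=max_lag)   # a small p-value would signal remaining autocorrelation
    return q, p
```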
So, why did the “other than AR(1) noise” go away?
Some readers are aware that the residuals from the CO fit in my first analysis did exhibit just enough non-AR(1) noise to break the 95% confidence limit threshold, so there could be difficulties applying CO, which requires the non-AR(1) noise to vanish. So, one might wonder, why is it gone now?
The reason this AR(2) noise vanished in this new analysis is threefold. Most would assume the main reason for the difference is the addition of the MEI index, which accounts for the correlations in temperature due to persistent weather events like El Nino. Accounting for ENSO did change the character of the residuals, but oddly enough, two other factors were also important:
- Some of the temperature data values in the file I obtained at Anthony Watts’ blog differ slightly from current values at GISS, Hadley and UAH MSU, though those agency pages do not indicate recent revisions. I found the data from July 2007-Jan 2008 differed between the file at Watts and HadCrut, the Jan 2008 value differed at UAH MSU, and the values from Jan 2007-April 2007 differed at GISS Land/Ocean. (I had checked Anthony’s values briefly in February, and I believe his values matched the agencies at the time. I am now including time stamps on my data, and highlighting modified cells in my Excel file.) When I modified the files to include the updates, the “other than AR(1)” noise was reduced.
- Measurement noise in data is likely white, not AR(1). (AR(1) noise is called “red”.) The reason measurement noise is likely to be white is that mis-measurement at one time is generally unrelated to mis-measurement at another time. Different instruments go bad at different times, in different places, and the net result is that the actual error in measurements in January is unlikely to be strongly related to errors in February.
In contrast, there are some physical reasons to expect the weather noise to contain a sizeable AR(1) component: the atmosphere and ocean have thermal mass. Weather patterns persist. This means that one would expect the weather data to contain both AR(1) and white noise. (Weather may also contain other types of noise.)
When data containing both AR(1) and white noise are analyzed using Cochrane-Orcutt, it’s possible to show, mathematically, that the “white” measurement noise will result in non-white residuals for the Cochrane-Orcutt fit.
Recall that in a previous blog post, I mentioned that averaging over all reliable data sets will tend to reduce the measurement noise. This tends to drive the “white” measurement noise to zero, while leaving weather noise unaffected. If the weather noise is, indeed, AR(1), or has small non-AR(1) components, averaging over a set of different measurement groups’ data will result in a data set that can be analyzed using Cochrane-Orcutt.
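A toy simulation illustrates the point. The numbers below (86 months, five agencies, rho = 0.6, equal weather and measurement noise amplitudes) are made up for illustration only; the qualitative behavior is what matters.

```python
import numpy as np

rng = np.random.default_rng(0)
n_months, n_agencies, rho = 86, 5, 0.6

# One shared AR(1) "weather" series...
weather = np.zeros(n_months)
shocks = rng.normal(0.0, 0.1, n_months)
for t in range(1, n_months):
    weather[t] = rho * weather[t - 1] + shocks[t]

# ...plus independent white measurement noise for each agency.
readings = weather + rng.normal(0.0, 0.1, (n_agencies, n_months))
average = readings.mean(axis=0)

def lag1_corr(x):
    x = x - x.mean()
    return (x[1:] @ x[:-1]) / (x @ x)

# A single agency's series has its AR(1) correlation diluted by the white noise;
# the five-agency average recovers something much closer to the pure weather value.
print(lag1_corr(readings[0]), lag1_corr(average), lag1_corr(weather))
```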
So, once again, we see that using the ensemble average over all available measurements results in better results, and smaller uncertainty intervals than using only one set at a time.
Of course, if you prefer to do a crude analysis, you can refuse to average, retain the maximum amount of noise and do the crudest possible analysis you can think of. This will always result in the largest error bars anyone can dream of, and you will “fail to falsify” untrue results. If you go so far as to claim this confirms the result, well… what can I say?
Accounting for uncertainty in the correlation.
There is still an open question: In my first post discussing the falsification of the IPCC’s 2C/century prediction, I mentioned that uncertainty in the autocorrelation could result in larger uncertainty intervals.
So, how does uncertainty in the correlation coefficient affect the results? I do not yet know the formal way of estimating this. So, instead I did the following:
- I estimated the 95% uncertainty interval in the lag-1 serial correlation coefficient for the C-O fit. This is ±0.21. (I learned how to estimate this when I made the correlation diagram. 🙂 )
- I reran the Cochrane-Orcutt fit using the upper and lower 95% values of the lag-1 serial correlation and checked to see if the 2C/century result was still falsified.
The result was: the IPCC predictions falsify when I use the highest and lowest possible values of the lag-1 serial correlation. 🙂
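The sensitivity check is easy to script. A sketch (rho_hat stands for the fitted lag-1 correlation from the Cochrane-Orcutt iteration, and ±0.21 is the half-width estimated above):

```python
import numpy as np

def co_fixed_rho(X, y, rho):
    """One-shot quasi-differenced fit with the lag-1 correlation held fixed."""
    X_star = X[1:] - rho * X[:-1]
    y_star = y[1:] - rho * y[:-1]
    beta, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
    return beta

# Refit at both extremes and check whether the trend coefficient's confidence
# interval still excludes +2 C/century.
# for rho in (rho_hat - 0.21, rho_hat + 0.21):
#     print(co_fixed_rho(X, y, rho))
```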
So, it would appear that the IPCC AR4 predictions just can’t be redeemed. That is, unless we can attribute the cold spell to a switch in some oscillation with a much longer time scale than ENSO and suggest that oscillation is unpredictable and has a substantial effect on global mean temperatures.
Can the PDO be that powerful? I don’t know.
Data sources
1. Data from GISS Land Ocean, UAH MSU, NOAA, RSS, and HadCrut
2. Data for first analysis: Blog post at Watts’ and text file.
Nicely done. I’ve given you a link and showed your graph on the main page and bumped your post into some of the more mainstream traffic generators.
Is it fair to say that “weather noise which is not white” is also called climate change? Or perhaps it can also be low frequency cycles like PDO, 66 year solar cycles, or something unknown?
Erik–
The “mean trend”, if upward, is considered the true warming trend due to AGW. This analysis is consistent with a positive mean, upward trend.
But, everything that’s not the “mean trend” is weather noise. Given my background — multiphase flow — I’d probably call it a “fluctuation”, rather than “noise”. But the climate modelers seem to like to call it “noise”.
Given the somewhat ambiguous definition of climate (which is actually ambiguous in terms of probability theory), it’s not entirely clear which things are “noise” and which are “climate”.
The way I’d think about it with a background in multiphase flow is that 66 year solar cycles are “climate”, and the PDO is “noise”. I’d do this not because of time scale, but because I tend to think of variable responses to steady forcings as “fluctuations”. For the most part, I think of long time scale variations in the sun as “forcings”, and so more like “climate”.
But, that’s arbitrary, and a climatologist could easily tell you otherwise. If they explained, I might even know how, precisely, they define the difference. (If it’s just “a 30 year average”, then there are difficulties in terms of discussing the meaning of ensemble averaging.)
Also: Thank you for this work.
Given the trillions of dollars and millions of lives at stake if we get this wrong (either way) it is high time that we have these discussions. I don’t have any intuition for the Cochrane Orcutt method, and when to apply it, so I can’t personally be sure. But experts should be able to discuss this in civil terms and come to an agreement on your thesis.
Thanks for bringing this up and keeping the issue visible.
erik and Anthony– thanks for the comments.
Erik, Cochrane-Orcutt is sufficiently common that it’s included in most statistical packages. Like all statistical techniques, there are certain assumptions associated with the method. It just happens that they seem to apply in this circumstance.
I hadn’t learned it as an undergrad (or even a grad student) because in my specialty, we usually learn enough statistics to avoid doing experiments that result in messy data of this sort. So, in some sense, if I did a lab experiment and needed to do Cochrane-Orcutt instead of ordinary least squares, it would mean I designed my experiment badly! In contrast, people in climatology, econometrics, or other areas have to use the data available. It’s often messy in different ways. This technique takes care of a specific type of messiness.
(I’m learning other techniques because later analyses will almost certainly involve data that are messy in different ways!)
Lucia
This is good to see. From your numbers, it appears that the t-statistic when you include the MEI is actually slightly larger than without it, as I thought it would be. The decrease in the range of uncertainty evidently more than compensates for the upward shift in the estimate of the underlying climate trend. By my reckoning, the likelihood of the underlying climate trend being as large as 2.0 deg C/century is reduced by about a factor of 2 when you control for ENSO with the MEI.
I find it incredible that grown-up scientists, including meteorologists, could not even conceive that global temperatures could drop, and drop dramatically for a long time. Apart from this blog, I would say this is a start: http://news.bbc.co.uk/2/hi/7329799.stm. I am sorry if this upsets anyone, but in my estimation 2008 will be the year that global warming ended, so put me on record as saying it.
Re previous: this link should work http://news.bbc.co.uk/2/hi/science/nature/7329799.stm
Link posted over at the globalwarminclearinghouse.blogspot.com
Nice work! Been following your comments at Anthony’s house.
My first post here :).
In your analysis posted above, do you include the January and February 2008 data? If so, to play the devil’s advocate: if you remove these data points (which might be considered outliers), would the IPCC AR4 fall into the 95% confidence limits?
Thanks
gdfernan
Climate Change Cycles, Galactic Vacuum Density Waves, and the Orbital Periods of the Outer Planets
Dr. Gerhard Löbert, Otterweg 48, 85598 Baldham, Germany. April 4, 2008.
Physicist. Recipient of The Needle of Honor of German Aeronautics.
Conveyor of a super-Einsteinian theory of gravitation that explains, among many other post-Einstein-effects, the Sun-Earth-Connection and the true cause of the global climate changes.
Abstract: In a previous Note it was shown that climate change is driven by solar activity which in turn is caused by the action of galactic vacuum density waves on the core of the Sun. Irrefutable proof of the existence of these super-Einsteinian waves is given by the extremely close correlation between the changes in the mean global surface temperature and the small changes in the rotational velocity of the Earth – two physically unrelated geophysical quantities – in the past 150 years (see Fig. 2.2 of http://www.fao.org/DOCREP/005/Y2787E/y2787e03.htm). In the present Note it is shown that the orbital periods of the large planets Jupiter, Saturn, and Uranus provide further evidence.
In an excellent paper by the late Dr. Theodor Landscheidt (see http://www.schulphysik.de/klima/landscheidt/iceage.htm) it was shown that the Sun’s Gleissberg activity cycles are closely correlated with the oscillations of the Sun around the center of mass of the solar system. The first and second space derivatives of the gravitational potential of the planets in the vicinity of the Sun are, however, so minute that it cannot be envisaged how the extremely slow motion of the Sun about the center of mass of the solar system could physically influence the processes within the Sun. It is much more likely that a common external agent is driving both the Gleissberg cycle and the related oscillatory barocentric motion of the Sun.
The small motion of the Sun is, of course, determined, almost entirely, by the motion of the large planets Jupiter, Saturn, Uranus, and Neptune that revolve around the Sun with periods of 11.87, 29.63, 84.67, and 165.49 years respectively. Note that the sunspot cycle has a mean period of 11.07 years (see T. Niroma in http://www.personal.inet.fi/tiede/tilmari/sunspot4.html) and in my previous Note “A Compilation of the Arguments that Irrefutably Prove that Climate Change is driven by Solar Activity and not by CO2 Emission” of March 6, 2008 I pointed out that the mean surface temperature of the Earth is changing in a quasi-periodic manner with a mean period of 70 years, approximately. If we stipulate for the moment that there exists – in addition to the 70-years wave – a galactic vacuum density wave of 11.07 years period that is driving the sunspot cycle, then the addition of both waves leads to a periodic amplitude modulation with a period of 2/(1/11.07 – 1/70) = 26.3 years.
If two galactic gravitational wave trains of 11.07 and 70 years period were to pass through the solar system, the gravitational action of these waves on the revolving planets would slowly relocate these celestial bodies until the orbital periods were close to 11.07, 26.3, and 70 years, the periods given by the combined wave train. The orbital periods of Jupiter, Saturn, and Uranus are 7%, 13%, and 20% higher than these values. A close lock-in cannot be expected because of the gravitational actions of the neighboring planets and because of the large variability of the periods of the vacuum density wave trains (see the large variability of the sunspot and surface temperature cycles).
If one considers all of the documented sunspot cycles, the mean Gleissberg cycle length increases to about 78.5 years (see T. Niroma) which is 7% smaller than the orbital period of Uranus. Note also that the orbital period of Neptune is 5% larger than 2 times the mean Gleissberg period and that of Pluto is 7% larger than 3 times 78.5 years. Finally, it is of interest that the orbital period of Mars is only 2% less than one-sixth of the mean sunspot cycle length.
Thus the barycentric oscillations of the Sun that Landscheidt envisaged to be the cause of the Gleissberg activity cycle is the mirror image of the trains of galactic vacuum density waves acting on the solar system.
In my opinion, the orbital periods of the large planets Jupiter, Saturn and Uranus provide — in addition to the extremely close temperature-rotation-correlation — further evidence for the existence of galactic vacuum density waves with mean periods of 11 and 70 – 80 years.
———————————————————
A Compilation of the Arguments that Irrefutably Prove that Climate Change is driven by Solar Activity and not by CO2 Emission
Dr. Gerhard Löbert, Otterweg 48, 85598 Baldham, Germany. March 6, 2008.
Physicist. Recipient of The Needle of Honor of German Aeronautics.
Program Manager “CCV, F 104G” (see Internet).
Program Manager “Lampyridae, MRMF” (see Internet)
Conveyor of a super-Einsteinian theory of gravitation that explains, among many other post-Einstein-effects, the Sun-Earth-Connection and the true cause of the global climate changes.
I. Climatological facts
As the glaciological and tree ring evidence shows, climate change is a natural phenomenon that has occurred many times in the past, both with the magnitude as well as with the time rate of temperature change that have occurred in the recent decades. The following facts prove that the recent global warming is not man-made but is a natural phenomenon.
1. In the temperature trace of the past 10 000 years based on glaciological evidence, the recent decades have not displayed any anomalous behaviour. In two-thirds of these 10 000 years, the mean temperature was even higher than today. Shortly before the last ice age the temperature in Greenland even increased by 15 degrees C in only 20 years. All of this without any man-made CO2 emission!
2. There is no direct connection between CO2 emission and climate warming. This is shown by the fact that these two physical quantities have displayed an entirely different temporal behaviour in the past 150 years. Whereas the mean global temperature varied in a quasi-periodic manner, with a mean period of 70 years, the CO2 concentration has been increasing exponentially since the 1950’s. The sea level has been rising and the glaciers have been shortening practically linearly from 1850 onwards. Neither time trace showed any reaction to the sudden increase of hydrocarbon burning from the 1950’s onwards.
3. The hypothesis that the global warming of the past decades is man-made is based on the results of calculations with climate models in which the main influence on climate is not included. The most important climate driver (besides solar luminosity) comes from the interplay of solar activity, interplanetary magnetic field strength, cosmic radiation intensity, and cloud cover of the Earth atmosphere. As is shown in Section II, this phenomenon is generated by the action of galactic vacuum density waves on the core of the Sun.
4. The extremely close correlation between the changes in the mean global temperature and the small changes in the rotational velocity of the Earth in the past 150 years (see Fig. 2.2 of http://www.fao.org/DOCREP/005/Y2787E/y2787e03.htm), which has been ignored by the mainstream climatologists, leaves little room for a human influence on climate. This close correlation results from the action of galactic vacuum density waves on the Sun and on the Earth (see Section II). Note that temperature lags rotation by 6 years.
5. From the steady decrease of the rotational velocity of the Earth that set in in Dec. 2003, it can reliably be concluded that the mean Earth temperature will decrease again in 2010 for the duration of three decades as it did from 1872 to 1913 and from 1942 to 1972.
6. The RSS AMSU satellite measurements show that the global temperature has not increased since 2001 despite the enormous worldwide CO2 emissions. Since 2006 it has been decreasing again.
II. Physical explanation for the strong correlation between fluctuations of the rotational velocity and changes of the mean surface temperature of the Earth
Despite its great successes, the gravitational theory of the great physicist Albert Einstein, General Relativity, (which is of a purely geometric nature and is totally incompatible with the highly successful quantum theory) must be discarded because this theory is completely irreconcilable with the extremely large energy density of the vacuum that has been accurately measured in the Casimir experiment.
Seaon Theory, a new theory of gravitation based on quantum mechanics that was developed eight decades after General Relativity, not only covers the well-known Einstein-effects but also shows up half a dozen post-Einstein effects that occur in nature. From a humanitarian standpoint, the most important super-Einsteinian physical phenomenon is the generation of small-amplitude longitudinal gravitational waves by the motion of the supermassive bodies located at the center of our galaxy, their transmission throughout the Galaxy, and the action of these waves on the Sun, the Earth and the other celestial bodies through which they pass. These vacuum density waves, which carry with them small changes in the electromagnetic properties of the vacuum, occur in an extremely large period range from minutes to millennia.
On the Sun, these vacuum waves modulate the intensity of the thermonuclear energy conversion process within the core, and this has its effect on all physical quantities of the Sun (this is called solar activity). This in turn has its influences on the Earth and the other planets. In particular, the solar wind and the solar magnetic field strength are modulated which results in large changes in the intensity of the cosmic radiation reaching the Earth. Cosmic rays produce condensation nuclei so that the cloud cover of the atmosphere and the Earth albedo also change.
On the Earth, the steady stream of vacuum density waves produces parts-per-billion changes in a large number of geophysical quantities. The most important quantities are the radius, circumference, rotational velocity, gravitational acceleration, VLBI baseline lengths, and axis orientation angles of the Earth, as well as the orbital elements of all low-earth-orbit satellites. All of these fluctuations have been measured.
Irrefutable evidence for the existence of this new, super-Einsteinian wave type is provided by the extremely close correlation between changes of the mean temperature and fluctuations of the mean rotational velocity of the Earth. (see the figure referred to in Section I.4). Einsteinian theory cannot explain this amazing correlation between two physical quantities that seem to be completely unrelated.
While the rotational velocity of the Earth and the thermonuclear energy conversion process on the Sun react simultaneously to the passage of a vacuum density wave, a time span of 6 years is needed for the energy to be transported from the core of the Sun to the Earth’s atmosphere and for the latter’s reaction time.
As can be seen, super-Einsteinian gravitation reveals the true cause of climate change.
gdfernan,
Yours is a commonly asked question. However, my main answer is: when you use smaller amounts of data, the main statistical error is called β error. That is, you fail to falsify hypotheses that are false. One of my commenters, Martin Ringo, did the calculations for the power of this test, and sent me graphs. I posted his excellent comments at the bottom of this post:
http://rankexploits.com/musings/2008/comparing-ipcc-projections-to-individual-measurement-systems/comment-page-2/#comment-1383
I need to write a more complete post on this, and explain why, if the IPCC projections are too high but AGW is real, what we are going to see over time will likely go like this:
A period of “fail to falsify”…. A short period of “falsify”…. A short period of “fail to falsify”…. A period of “falsify”…. A shorter period of “fail to falsify”…. Finally, a long period of “falsify”.
So, the “falsify” findings are significant.
I am going to be writing up more on β error so people understand this better. The difficulty is that some other bloggers are confusing the issue about what can and can’t be shown in short times.
A question similar to whether the IPCC AR4 projection can be falsified is, “Has the earth stopped warming over the past ten years?” Recently there has been frequent mention in various media and blogs of the apparent flat line since 1998. Critics point out that the choice of 1998 as a starting point is a cherry pick because of the huge El Nino that year. So I applied the same analysis that Lucia used on the falsification issue to the time frame from around 1998 to the present. The finding is that the critics have a point — kind of.
Using the average of RSS and HadCRUT data, I chose a start time that would capture the full 1998 El Nino and simultaneously give an OLS slope of zero. This defined a base case starting point for subsequent analysis. A start time of July 1997 filled the bill. The computed 95% confidence range for the OLS fit was from -0.8 C/century to +0.8 C/century. The AR(1) correlations were very high at 0.78. After controlling for ENSO via the MEI as Lucia did and after applying Cochrane-Orcutt, the best estimate for the underlying climate trend over this 10 year, 8 month period is m = +0.3 C/century, with a 95% confidence range from -0.8 C/century to +1.4 C/century. AR(1) correlations were 0.05 and longer period correlations were also small.
Thus, taking ENSO effects into account and dealing with correlations retains the lower end of the range obtained with the OLS fit but extends the upper end of the range to higher values. The small upward trend compensates for a small net cooling effect of ENSO over this period. So the critics are right in that ENSO effects have biased the raw temperature trend. However, the effect is small, and obviously a hypothesis of null underlying global temperature change cannot be rejected. The probability that there was any warming at all during this period is 0.69. One can certainly reject an underlying trend of 2.0 C/century once again, as in the case of Lucia’s 2001+ analysis.
So has the earth’s temperature flat lined over the last 10 years? It depends on what you mean by ‘flat.’ I’ll go with ‘statistically flat.’
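(As a rough consistency check on the 0.69 figure, treating the trend estimate as approximately normal with a symmetric 95% interval:)

```python
from scipy.stats import norm

m, upper = 0.3, 1.4        # best estimate and upper 95% limit, C/century
se = (upper - m) / 1.96    # implied standard error of the trend estimate
print(norm.cdf(m / se))    # about 0.70, in line with the 0.69 quoted above
```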
Lucia:
Thanks for your offer on Anthony’s blog of an Excel macro to reformat the MEI data. I only have Open Office and minimal skill with it. At one time I had access to and great skill with SAS. My access disappeared 10+ years ago with the mainframe. My brute force cut and paste was adequate for a one time project.
Yours was the only response to my suggestion of further study of the MEI behavior. Suggesting assignments to busy people often gets that kind of reception. I was fascinated by the technique demonstrated by Basil and Anthony of using the H-P Filter and studying the residuals. The cumsum plot really highlighted what I interpret as a negative PDO switching to positive PDO.
It is unfortunately true that suggesting different analyses rarely works, for a number of reasons. The people actually doing their own analyses are generally in the middle of one they think promising. Alternatively, they have just done something and wish to document it. So, to take up a suggestion quickly, they need to drop what they are doing. Dropping unfinished analyses is a good way to get nothing done!
But, I do read what people suggest, and if I can do it quickly, I often do.
I also find some of the things Basil and Anthony are doing fascinating. However, unfortunately, I don’t yet know how to do some of those things. So, I won’t be trying anytime soon!
First, good work in the treatment of an omitted variable your readers noted to you. Addressing criticism, friendly or otherwise, is not just a human virtue, it is a statistical methodology virtue. Further, your analysis of the reduction in the second order serial correlation is also good (although I am going to raise some questions about it and the Multivariate ENSO Index variable later).
Second, I no longer have technical questions about the straightforward (out of the regression summary statistics) hypothesis testing conclusions. (Note: I disagree with Lucia that one can reject the null hypothesis of a 0.02 degree C per year linear trend for the sample 2001:01 – 2008:02, but this is not because she is not approaching the question correctly. It is only that I have a somewhat more powerful estimating routine and have made some Monte Carlo studies of the standard errors of the estimators for temperature-like series. So I believe it was still necessary to adjust the standard errors. Not in the manner of Lee and Lund or Nychka, but based on better estimating procedures and experimental results. But even if applied here, there would be significance.)
Third, is not the MEI a bimonthly variable, and partly composed of temperature indices? As such, part of the MEI is a two month average of temperature itself, which has obvious regression problems in an equation which already has a lagged variable. Further, if you take a look at the correlogram of the series composed of temp(t)+temp(t-1), you will see it closely resembles that of the MEI, i.e. a roughly 95% first order autocorrelation with something like a negative 30% second order autocorrelation (while temperature itself has a positive second order autocorrelation). I suspect this is where you got rid of the second order serial correlation effects. (Note: those effects were not so great that they would cause one to reject, at least on first blush, the Cochrane-Orcutt estimation. Rather, they appear in the estimation of the AR2 term in an AR(2) estimation procedure.)
That is enough for now, but let me warn all that the fundamental problem that Lucia posed early on – that of testing the consistency of a linear trend with the post 2000 experience – is something that might be better addressed in a multi-equation model. The effect of the MEI variable should give a hint as to why I believe so.
Mike, I believe there are plugins for R in Open Office. So if you survived SAS, then R should be no problemo.
I wish you good luck with Beta. Maybe a simple example of how many coin flips it would take to find out that a fair coin was in fact biased by 10%. Something simple.
I liked beta. Beta makes experimental budgets big.
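(For what it’s worth, a quick simulation of the coin example, assuming “biased by 10%” means the true P(heads) is 0.55:)

```python
import numpy as np

rng = np.random.default_rng(1)

def power(n_flips, p_true=0.55, trials=20_000):
    """Fraction of simulated experiments that reject 'fair coin' at the 5% level
    (two-sided normal approximation) when the true P(heads) is p_true."""
    heads = rng.binomial(n_flips, p_true, trials)
    z = (heads - 0.5 * n_flips) / np.sqrt(0.25 * n_flips)
    return np.mean(np.abs(z) > 1.96)

for n in (100, 400, 800, 1600):
    print(n, power(n))   # power rises from roughly 0.17 at n=100 to about 0.98 at n=1600
```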
Marty
The MEI uses a temperature, but not the global mean surface temperature. The NOAA page describing it is here:
http://www.cdc.noaa.gov/people/klaus.wolter/MEI/
There is an issue of some overlap. I think, in particular, the temperature they call (A) would be the problem. The air temperature over the sea surface of the tropical Pacific does contribute to the global mean surface temperature. However, its contribution to the MEI would appear to be 1/6th, and its contribution to the GMST is also small. (But it is there!)
The first period of climate is one month, because that is the first anomaly period derived. From there, the quarterly and yearly are derived, then we go to decades and then multi-decades. Some are of the impression that 30 years is a minimum, and yes, that is the customary base period for determining the anomaly over months, quarters and years. However, other periods can be of use, depending upon the purpose. The longer a period under consideration, the smoother the trends become, and therefore more statistical and less physical.
THE CENTRAL SUBJECT OF CLIMATOLOGY
In my opinion the researchers in climatology should put aside their work for a moment and focus their attention on the central and decisive subject of climatology. This is the extremely close correlation between the changes in the mean surface temperature and the small changes in the rotational velocity of the Earth in the past 150 years (see Fig. 2.2 of http://www.fao.org/DOCREP/005/Y2787E/y2787e03.htm), which has been ignored by the mainstream climatologists.
Since temperature cannot influence rotation to the observed degree and vice versa, a third agent must be driving the two. The solution is given in http://www.icecap.us/images/uploads/Lobert_on_CO2.pdf .