By way of Watts Up with that I learn that the Met office is going to re-do HadCRUT. Good.
Like the Met office, I do expect the new record to show roughly the same amount of warming over the 20th century as the old record. The bullet points say:
The new effort, the proposal says, would provide:
– “verifiable datasets starting from a common databank of unrestricted data”
– “methods that are fully documented in the peer reviewed literature and open to scrutiny;”
– “a set of independent assessments of surface temperature produced by independent groups using independent methods;”
– “comprehensive audit trails to deliver confidence in the results;”
– “robust assessment of uncertainties associated with observational error, temporal and geographical inhomogeneities.”
All good.
When I read the proposal itself I wondered:
- Where is the money coming from to pay people’s salaries? The UK? France? The US DOE? PIIGS? Everyone?
- The Met office proposal says there will be a full audit trail. Do they mean code and scripts will be made available to the honest-to-goodness public? Or are they borrowing the word ‘audit’ but mean something else?
- I like the idea of independent groups creating independent products. Have the groups already been selected? Are they going to be GISS, NOAA and The Met Office? Or will there be a call for proposals? (I nominate Chad.)
- I like the idea that they are going to try to get somewhat realistic uncertainty estimates and verify methods by adding realistic error to climate model data and then seeing if they can pull out trends. Can they also run a challenge where hostile bloggers create test cases of their own devising, to see whether the algorithms pick out the trend?
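A challenge like the one above could be prototyped in a few lines: bury a known trend in noisy synthetic data and check whether a simple estimator pulls it back out. Everything here (function names, the white-noise error model, the parameter values) is invented for illustration, not how the Met office would actually do it.

```python
# Sketch of a trend-recovery test: hide a known trend in synthetic
# "station" data, then see whether an estimator recovers it.
import random

def make_series(n_years, trend_per_year, noise_sd, seed):
    """Synthetic annual anomalies: linear trend plus white noise."""
    rng = random.Random(seed)
    return [trend_per_year * t + rng.gauss(0.0, noise_sd) for t in range(n_years)]

def ols_trend(series):
    """Ordinary least-squares slope per time step."""
    n = len(series)
    t_mean = (n - 1) / 2.0
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

true_trend = 0.02  # degrees C per year, known only to the test-case author
series = make_series(100, true_trend, noise_sd=0.15, seed=42)
estimate = ols_trend(series)  # should land close to 0.02 for a sane method
```

A hostile test-case author would of course use nastier error models than white noise (autocorrelation, step discontinuities, missing months), which is exactly the point of the challenge.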
All in all, I think this is a positive turn of events. I expect we’ll learn more details later on. I’ll be watching for news but I bet Anthony will be the first to find every story. He often is.
I’ll suggest a couple more safeguards.
1. Institutions should supply raw data and metadata.
2. Where the data submitted is a subset of all the data under their
control, they have to document the methodology for selecting
the data they submit. No cherry picking. This decision needs
to be audited.
3. Invite McIntyre and Watts to the workshop.
Any other suggestions?
I think it depends on whether or not this is an honest effort to create an open and transparent process or just an attempt to rehabilitate their image. Time will tell.
BTW lucia, I ran a QA check on the Aqua satellite AMSU January data used by UAH. In a nutshell, _none_ of the 7000+ data files passed NASA’s QA and _none_ of the readings for channel 5 are valid. Details are here: http://magicjava.blogspot.com/2010/02/uah-january-raw-data-spot-check.html
These are preliminary results but it’s amazing what you find when you scratch the surface.
This should be interesting. Hopefully they will also get funding to update GHCN beyond the GSN stations, since it’s a tad overdue.
It will be more fun than the florida recount.
Tm moshpit
Steve said:
“It will be more fun than the florida recount.”
With Al Gore a loser again?
amortiser..
good to see you.
Al Gore the loser? I try to keep my expectations in check.
Don’t expect huge changes. just work to make the process open and transparent.
Other suggestions:
1. archive of original records (digital images of the forms, photographs/diagrams of station sites both current and historical)
2. meta-data on station siting and changes
3. full description of raw data audit rules (corrections, missing data, etc.)
4. processing algorithms and code with design specifications
5. equipment specifications and effects of instruments on temperatures (CRS paint type, sensor drift, etc.)
New HadCRU? Who’s funding it?
This fits the standard M.O. of our civil servants when they screw up. It is a pro-active approach to divert criticism from themselves while at the same time funding their activities for a few more years. They hope that by that time all of the negative publicity will have blown over.
Are they talking about starting with a set of raw data without adjustments of any kind and then creating subsequent data sets incorporating well documented adjustments? Or are they proposing establishing a set of adjusted data to use as a base?
From what I have read it could be either. If the latter, then what use is it, as a signal could be built into the “peer reviewed” methods of adjustment?
And a question about the current data sets. When HadCRUT or GHCN get their data, are they receiving raw data, or data which has already been adjusted by the local weather authority, on which they then do further adjustments?
Thanks
Re: magicjava (Feb 23 15:17),
That’s very interesting!
I have checked some of the few stations from Switzerland. GHCN seems to consider data from the Swiss Met office as raw, although the data corresponds to the homogenized data from Switzerland http://www.meteoswiss.admin.ch/web/en/climate/climate_today/homogeneous_data.html
Raw Swiss data is available for a fee only.
Re: John Knapp (Feb 23 16:57),
I don’t think we can tell from the written proposal. Perhaps those who attended the meeting know. Heck, it’s even possible the people who wrote the proposal know. The nature of some of these things is for them to “evolve”.
“Like the Met office, I do expect the new record to show roughly the same amount of warming over the 20th century as the old record.”
Lucia,
Sincerely asking…why do you expect this?
Andrew
It’s already done?
National Climatic Data Center
DATA DOCUMENTATION FOR
DATA SET 9950 (DSI-9950)
DATSAV2 SURFACE
January 6, 2003
1. Abstract: DATSAV2 is the official climatological database for surface observations. The database is composed of worldwide surface weather observations from about 10,000 currently active stations, collected and stored from sources such as the US Air Force’s Automated Weather Network (AWN) and the WMO’s Global Telecommunications System (GTS). Most collected observations are decoded at the Air Force Weather Agency (AFWA) formerly known as the Air Force Global Weather Central (AFGWC) at Offutt AFB, Nebraska, and then sent electronically to the USAF Combat Climatology Center (AFCCC), collocated with NCDC in the Federal Climate Complex in Asheville, NC. AFCCC builds the final database through decode, validation, and quality control software. All data are stored in a single ASCII format. The database is used in climatological applications by numerous DoD and civilian customers.
LOL. Do you really think this is gonna happen in your lifetime?
If this really is to happen, it will take 50 years or more. Please don’t tell me that the public is so stupid….
The Data should be collected and held by The UK Statistics Authority.
The only thing in the proposal that would prevent the Greenpeace-led MetOff from doing a GISS is to make the raw data available online. I can’t see it.
I’ve already said it elsewhere, but I’ll say it here too. We need to start a 100% open source effort, with no preconceptions built-in, to write code to process and make use of the new data set.
Perhaps even if this open source effort becomes widely adopted, it can even influence the type of data to be collected (more data and more open surely is in everybody’s interest?).
This effort needs to begin immediately and proceed on an urgent basis.
Start the project now and continually release it under an open source license.
The code should not incorporate any particular set of adjustments and assumptions, but instead could include the ability to see the results based on loading different sets of adjustments and assumptions.
This would then clearly allow us to see the effects of different modelling techniques, and see which results held under which assumptions.
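The adjustable-assumptions design described above could look something like this minimal sketch, where the raw values stay untouched and each adjustment set is just a function you opt into. All the adjustment functions here are toys with made-up numbers, not real homogenisation methods.

```python
# Sketch of pluggable adjustments: raw data is never modified in place;
# each "assumption set" is an ordinary function applied on top.
def tobs_shift(series):
    """Toy time-of-observation adjustment: a constant offset."""
    return [v + 0.1 for v in series]

def uhi_correction(series):
    """Toy urban-heat-island correction: remove a small linear drift."""
    return [v - 0.001 * t for t, v in enumerate(series)]

def apply_adjustments(raw, adjustments):
    """Apply a chosen list of adjustment functions, in order, to raw data."""
    series = list(raw)  # copy, so the raw record is never altered
    for adjust in adjustments:
        series = adjust(series)
    return series

raw = [14.0, 14.2, 14.1, 14.3]
# Compare results under different assumption sets side by side:
no_adjust = apply_adjustments(raw, [])
full = apply_adjustments(raw, [tobs_shift, uhi_correction])
```

Because every adjustment is explicit and swappable, anyone could rerun the whole chain with their preferred set of assumptions and see which conclusions survive.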
Surely nobody with a genuine interest in science, whether currently holding warmist, luke-warmist or skeptical views, would oppose such a project?
Like the Met office, I do expect the new record to show roughly the same amount of warming over the 20th century as the old record.
Wrong. First thing they have to do is fire each and every one of these useless MoFo’s that currently inhabit the Met office.
Start from scratch with personnel who haven’t made a career out of defrauding the public.
I’d put Harry in charge, the computer programmer who authored the Harry file.
I mean after you get the code right, what does it matter who reads the thermometer, as long as they’re honest…
“3. Invite McIntyre and Watts to the workshop.”
Why would you invite biased people? Should they invite Joe Romm and tamino too?
cogito (34558)
Here one can see a graphic comparison between raw and homogenized data in Switzerland. Please notice Sion:
http://www.meteoschweiz.admin.ch/web/de/klima/klima_heute/homogene_reihen.Par.0054.DownloadFile.tmp/vergleichoriginalhomogen.pdf
(Cogito has more info)
The collection of (raw) data should be done by somebody who has no interest in what the numbers mean, like accountants or a statistical agency.
Lucia, I’m a long time lurker here and this is my first post on your excellent blog!
I’m also pleased to see that all the NOAA/GISS/CRU instrumented temperature data auditing efforts of Steve Mc, Anthony Watts, ChiefIO, Jeff ID, hpx83, Verity Jones, NicL, Lucia, Tamino, Chad, Nick Stokes, Giorgio Gilestri to name only a few have finally paid off.
In particular I hope that once the global raw instrumented temperature dataset has been re-established, that full acknowledgement and appreciation is given to the significant warming and cooling trends of the 19th and 20th century.
I also hope that the current obsession with ‘anomalising’ and ‘gridding’ the data is dropped, as IMO it is not necessary to anomalise and grid the raw data to see the clear cyclic warming and cooling trends of the past two centuries.
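For what it’s worth, ‘anomalising’ as usually practised just means subtracting each station’s mean over a baseline period, so stations at very different absolute temperatures can be compared and averaged. A toy illustration (all numbers invented):

```python
# "Anomalising" a station record: subtract the station's own mean
# over a chosen baseline period from every value.
def anomalies(monthly, base_start, base_end):
    """Subtract the mean over the baseline index range from every value."""
    baseline = sum(monthly[base_start:base_end]) / (base_end - base_start)
    return [v - baseline for v in monthly]

station = [10.0, 10.5, 11.0, 11.5, 12.0]
print(anomalies(station, 0, 2))  # baseline mean is 10.25
# → [-0.25, 0.25, 0.75, 1.25, 1.75]
```

The trend in the series is unchanged by this; the argument is really about whether the extra processing steps obscure the audit trail, not about the arithmetic itself.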
I also hope that something is done about the post-1990 ‘station drop out’ issue, i.e. that the data is brought up to date, and that the ‘missing months’ issue is also dealt with in a much more appropriate manner. I also hope that all available temperature data for the rest of the world is found and added to the dataset, not just data post 1950 as at present in GHCN. I don’t believe for one minute that the sudden increase in reporting stations post 1950 was due to the construction of airports worldwide driven by expansion in the aviation industry.
It’s only after the latter three issues highlighted above are dealt with properly that IMO we will then be able to properly assess whether or not the late (1970 to 2000) 20th century warming trend was any more significant than the 1910 to 1940 warming trend.
In doing so (i.e. ensuring that we can compare the two warming periods correctly) due allowance must be made for the effects of station moves, instrument changes, land use changes and urbanisation (UHI effect). It is not IMO sufficient to apply an ‘algorithm’ (as done at present by NOAA/GISS/CRU) to attempt to allow for these changes over time (i.e. to ‘homogenise’ the data); rather it is vital that sufficient meta-data be collected for each and every individual station (as done in the surfacestations.org project) which can then be used to make due allowances for the above issues on an INDIVIDUAL station basis.
As a UK taxpayer I’ll be happy to fund and even participate in this process free of charge, provided it adheres to the conditions I’ve expressed above. Regardless of the costs of this project, it will be small beer in comparison to what we are currently spending on funding organisations like the Tyndall Centre to look at measures to mitigate and adapt to supposed man-caused climate change, when in fact at this point we are far from certain as to whether man has actually caused the problem, and for that matter whether it is even a problem at all.
Boris (Comment#34592)
‘…Why would you invite biased people? Should they invite Joe Romm and tamino too?
Boris, Why would you think Grant and Joe are biased?
‘ Boris (Comment#34592) February 24th, 2010 at 6:10 am
“3. Invite McIntyre and Watts to the workshop.”
Why would you invite biased people? Should they invite Joe Romm and tamino too?’
Yes. Remember that when the WSJ and the NYT recounted Florida 2000, everybody could believe their result (except Krugman, it did not fit his prejudice).
But Watts and McIntyre are much too important for the progress of science to do such accountants work.
On 2nd Feb I faxed the DoE asking if they were still funding Jones/CRU/UKMO.
No reply.
I faxed my letter to;
U.S. Department of Energy,
Office of Biological & Environmental Research,
Climate and Environmental Sciences Division,
1000 Independence Ave., SW
Washington, DC 20585-1290
Fax: (301) 903-8519
2nd February 2010
Attention: Dr. Wanda Ferrell, Acting Director
DoE funding of climate research
The full text is over at my site.
Maybe some US people would like to rattle the cage at the DoE – it’s their taxes being used.
Re: WSH (Feb 24 15:21),
Before everyone swoops down on poor Wanda (who my husband will testify is a very nice lady) contact a reference librarian to find the list of funded research programs.
I’m pretty sure the DOE has a searchable database, but I can’t remember where to find it because I never use it.
They should do it all openly, from the beginning. Set up their raw data system in the open, then give the public access to all of their document and software version control systems. We can all look over their shoulders… and comments from outsiders might save them some work.