Today, just in time for the Wed. update, the 7-day average JAXA Ice Extent had an uptick which is so small it’s not visible in the graph below:

Last night, Neven indicated he thinks we’ve hit the minimum. I’m not so sure.
There is an uptick in the 7-day average, so I think we might have a minimum; I’ve noted that in the right-hand corner of the graph. Now that we’ve achieved this milestone, it’s time to start watching for confirmation.
Here are some details:
- The Daily JAXA NH Ice extent stands at 4589844 square km. This is a rise relative to last night’s preliminary report of 4565781 square km, and it exceeds the current minimum of 4526875 sq. km set on September 9 by 62969 sq. km. This is not a “safe” amount. A big storm, blow or other event could, hypothetically, result in some renewed ice loss. Neven looks at weather charts; I don’t. So possibly he has good reason not to expect these; I’m agnostic on that point.
- The first uptick in the 7-day smooth is often a sign that we have achieved the minimum 7-day NH extent; it has been so throughout the relatively brief JAXA record. But before ‘calling’ a winner, I would at least like to see that today’s 7-day average is higher than last week’s 7-day average. It’s not. The loss rate based on the current average relative to the average 7 days ago is illustrated by the black trace below:

By this measure, the loss rate is still positive. (Note: grey illustrates the loss rate based on changes in daily values reported by JAXA. It’s noisier, but it also shows the most recent loss rate.)
- The predictometer suggests some loss is more likely than not. Specifically, the median anticipated minimum 7-day average JAXA extent is 4.484 million sq. km, which is lower than the current value of 4.547 million sq. km. So the median estimate says to expect a further loss of 0.063 million sq. km. That’s not much, but it’s above zero.
The range of possibilities based on the best regression for extent vs. any of the predictive values I consider is shown below:
Using a new fancy-schmancy version of the predictometer, which models the residuals of candidate models as skew-normal, the ±95% confidence intervals for the ultimate 7-day average JAXA NH ice extent are 4.367 to 4.563 million sq. km. Obviously, 4.563 is not possible; that’s partly an artifact of using the continuous skew-normal density function to estimate the probability density of extent and partly an artifact of predicting with regressions. (Using a normal distribution would be worse.) However, the fact that 4.563 million sq. km is higher than the current value indicates the predictometer is telling us the current value may be the minimum. But since the median falls below the current value, this is not certain.
Knowing how the predictometer works, I think it’s going to be very conservative about decreeing there is a better than 50% chance we’ve achieved a minimum in the 7-day average extent.
By the way: The predictometer claims the probability the 7-day average JAXA NH ice extent will fall below the minimum set in 2007 is 0.1%.
We are so close to the end of the horserace that I’m going to omit the graph showing the distribution of bets. But I will reveal: using the mean based on the Gaussian model, RobB is in the lead! My bet is above the minimum, but it’s close. Since one wins by being closest, it’s OK to be above. I’d have to look at the file to figure out how many people are closer to the current minimum than I am. I suspect it’s quite a few, and I’ll be losing all my quatloos!
I agree with you, Lucia: it’s a definite maybe that we’ve reached the 2011 minimum. Given that the fall equinox is in a week and the risk of ice compaction, I think it’s about 75/25 odds that we’re at a minimum. As a side note, Resolute, Nunavut, Canada (74° N) is under a snowfall warning and is predicted to have a mean temperature of about -6 °C for the remainder of the week.
Fred N.
The ‘predictometer’ says the probability we’ve hit the minimum is 13% if I fit the residuals for the candidate models to a Gaussian, and 9% if I use a skew-normal. Neither can be taken too seriously, because these continuous functions just can’t deal with the hard stop in the probability distribution function at zero losses.
So, I have a ‘formal’ way to report a probability, but the caveat is so important it’s not worth focusing on the numerical estimate of the probability. Your 75/25 is probably as good as the predictometer’s estimates.
The temptation to imagine how I might spend all those quatloos is too strong to resist! C’mon, declare the winner!!
RobB–
The time for decreeing the winner was part of the rules. I had to pick a safe end date so we could be sure the minimum didn’t happen after I decreed the winner!
Re: lucia (Sep 14 15:15),
See for example last week’s USC/Utah football game where the final score changed 2 hours after the game ended. Some sports books in Las Vegas ended up paying to bettors on both sides of the point spread.
DeWitt– Wow! That was costly for the bookies.
The weather really isn’t cooperating. Expect a few more small upticks for the coming 3-5 days. If the weather switches completely (which always takes time) there might be a big damper on the growth, but I really don’t think IJIS SIE will go down lower than it already has.
.
The thin ice and warm water have done all they can and extended the melting season by 10 days or so. Apparently we haven’t yet arrived at the point where melting continues* past September 10th whatever the weather conditions, although we had a bit of a foretaste of that this year. We might arrive there next year, if the ice doesn’t thicken considerably over the coming winter.
.
When a big spike on the DMI 80N temperature graph starts, you can definitely forget about a lower minimum. That will be the final nail in the minimum coffin.
.
.
* That’s to say, there still is bottom melt, but not enough to compensate for big time divergence and freezing temperatures.
Neven– You may be right. I just don’t think it’s certain.
More graph-only data. Clearly, I need that for the “predictometer”.
(I’m sure I can find arctic temps somewhere. )
As lucia said, there is still a possibility of a lower minimum yet.
Looking at past years, over the equivalent of the next five days the extent has declined by as much as 195,000 km² (the loss in 6 different years was large enough to offset the increase which has already occurred), and it has also increased by as much as 320,000 km² (with 12 years showing over 100,000 km² of increase).
If one goes out farther than five days, there are only one or two years which declined enough to offset the increase which has already been recorded. So the odds favor the minimum having already been reached, but there are still about 6-in-38 odds of a lower minimum occurring yet.
Bill, I would put the odds lower than 6 of 38; all things aren’t equal… as Neven says, the weather isn’t “cooperating”.
Lucia:
What do you need?
I have a digitized version of DMI going back to 1958.
(It’s not perfect because it was generated from their raw png images. I did it as an exercise in “autotracing” images of plots.)
Neven– You may be right. I just don’t think it’s certain.
.
That’s why my motto is: “Nothing in the Arctic is a dead certainty”. 🙂
Carrick–
That will do! I could archive that. Then, all I would do is read in your file.
But then in future days, I would need to update the file so I get whatever data are more recent (say, next summer, each week). Obviously, I can’t turn you into a drone who refreshes that file every day the way DMI does, so I’m going to need to figure out how to digitize these automatically myself.
I found a digitizing routine in R, but it requires manual intervention, which won’t do for a blog “predictometer”. So I’ve been hunting for an alternative. If someone knows of a digitizing routine in R that doesn’t require manual intervention, I’d love it!
Re: Neven (Comment #81537)
Cute. Very cute.
Lucia, I can set it up in cron to update daily, since it’s fully automatic. I’ll let you know when I’ve done that.
I’ll also expand on how I did the digitization later today. This is definitely something that should be code-able in the R language.
There are a few tricks here, because you don’t have to solve the general problem of digitization (hint: the figures are similar to each other and the color for the DMI temperature is the same in all figures. If you can figure out how to read the images as 2-d arrays, 95% of the hard work has been done for you.)
There have been dips well after this date. Waiting another 15 days would be prudent.
JAXA took a big jump up today, +65,156 km². MASIE, OTOH, set a new low of 4,302,978 km², or -95,342 km² from the previous day. JAXA extent is now more than 100,000 km² above the minimum, but the MASIE result makes me less confident that we’ve actually seen the minimum.
NSIDC also calls the minimum.
.
And there’s something that could be the start of a first spike on the DMI 80N temperature chart (meaning the top layer of the ocean is giving up its heat to the atmosphere so that ice can form).
.
Here’s my post.
It’s interesting, and I think it’s too early to announce a minimum for sure. Some visual inspection of MODIS suggests to me that quite a lot of fresh ice has now formed to fill in the holes in the tongue towards Siberia. At the same time, the holes near Greenland seem to be much clearer, and I believe this is where the coldest air temps are. The first sign of new ice I saw was in the fjords on the NE side of Greenland and Ellesmere a few days ago.
But I think that late in the season the melting ends last on the Atlantic side, because of late-season warmth in the ocean from further south. Over the next few days, weather patterns suggest a burst of warm air and wind from the Atlantic side. 2005 was one year with a very late dip, and that dip did seem to occur on the Atlantic side.
At the same time, as most of the holes seem mostly filled in on the Siberian side, growth there may slow, although there is a lot of thin ice smoke around that area that may thicken just enough to trigger detection thresholds and result in ‘flash freezing’. My guess is that this weather will push extent down over the next few days again, and it won’t quite reach a new minimum.
Lucia:
So I’ve modified the script so it now updates dmi.series.txt daily. Here’s a description of how I generate this file.
If you are on a PC, you’ll need to download Cygwin and install netpbm, wget, and the compiler option. On a Mac, you need Xcode installed, and on a Linux system, you just need netpbm.
So here are the steps:
1) Snag the images
They are in a format like:
http://ocean.dmi.dk/arctic/plots/meanTarchive/meanT_year.png
so just write a shell script to download all of these.
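For concreteness, here is a minimal sketch of such a script. It assumes the meanT_&lt;year&gt;.png naming pattern shown above holds for every year in the archive; adjust the year range and fetch command to taste.

```shell
# Build the list of archive URLs; assumes the meanT_<year>.png naming
# pattern holds for 1958 through 2011.
for year in $(seq 1958 2011); do
  echo "http://ocean.dmi.dk/arctic/plots/meanTarchive/meanT_${year}.png"
done > urls.txt

# Then fetch them all in one go (left commented out here):
# wget -i urls.txt
```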
2) Convert png to ascii. The files are all 600×400, so you just need a list of red, green, blue values.
I use pngtopnm from the netpbm package to convert to the “P6” format. Unfortunately, I have never figured out a way to force it to emit ascii (“P3”) files, so I just wrote my own program (it was literally faster than going through all of the docs trying to find something that allowed you to do this).
You can find pnmascii here. You’ll need to compile it for your system, of course.
For example, once you’ve downloaded 2011, you just run:
pngtopnm meanT_2011.png | pnmascii > meanT_2011.txt
(NOTE: For non-UNIX users, the “|” is the “pipe” symbol; it connects the output of pngtopnm to the input of pnmascii.)
You can now find out what colors are present by typing:
sort -u meanT_2011.txt
which produces the output
0 0 0
0 128 255
0 192 0
255 0 0
255 255 255
Comparing this to meanT_2011.png
we see that
black = 0 0 0 — this is the border text.
green = 0 192 0 — this is the “climate” curve.
blue = 0 128 255 — freezing point of sea water
red = 255 0 0 — this is the actual temperature series
3) [Optional] Write an awk script to pull out the different color layers. I call it maskg.awk (it just generates a gray-map PPM file for a given color).
For example to generate the black layer, you would type:
maskg.awk r=0 g=0 b=0 meanT_2011.txt | pnmtopng > meanT_2011.black.png
Here is the output for black:
black
(I’d show the others, but I’m afraid I’ll get hit by by the spam filter for too many files.)
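For readers who want a feel for what maskg.awk does, here is my guess at an equivalent (not Carrick’s actual script): it only has to emit a gray map that is white wherever a pixel matches the target color. It assumes one “r g b” triple per line and a 600×400 image, as in step 2.

```shell
# Hypothetical stand-in for maskg.awk: write a P2 (ascii gray map) that is
# white (255) where the pixel matches the target color and black (0) elsewhere.
maskg() {
  awk -v r="$1" -v g="$2" -v b="$3" '
  BEGIN { print "P2"; print 600, 400; print 255 }   # PGM header for a 600x400 image
  { print ($1 == r && $2 == g && $3 == b) ? 255 : 0 }'
}
# Usage (hypothetical): maskg 0 0 0 < meanT_2011.txt | pnmtopng > meanT_2011.black.png
```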
4) Write a program to convert the text into year, temperature pairs.
To use this you need to note that prior to 2008, the file format is slightly different (they added a text string at the bottom left hand corner.)
For 1958–2007, the conversions are:
width = 600;
height = 400;
y0 = 360; bottom of graph
y1 = 38; top of graph
x0 = 61; left side of graph
x1 = 579; right side of graph;
Y0 = 235; minimum temperature
Y1 = 280; maximum temperature
X0 = 0; minimum day
X1 = 365; maximum day
clipL = 533; clipping box to remove the legend (Left, Right, Bottom, Top)
clipR = 565;
clipB = 48;
clipT = 48;
And here’s my AWK program called genXY.awk which does this.
For 2008-2011, the only changes are
y0 = 347;
y1 = 13;
clipB=36;
clipT=36;
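Since the actual genXY.awk is only linked, here is a hedged sketch of the coordinate mapping it has to perform, using the 1958–2007 calibration constants above. The function name and the exact clipping logic are mine, not Carrick’s; input is one “r g b” triple per line in raster order, output is “day temperature” pairs.

```shell
# Hypothetical sketch of the pixel-to-data mapping: keep pixels matching the
# target color inside the graph box and map them linearly into (day, kelvin).
genxy() {
  awk -v r="$1" -v g="$2" -v b="$3" '
  BEGIN {
    width = 600               # image width in pixels
    x0 = 61;  x1 = 579        # graph box, pixel coordinates
    y0 = 360; y1 = 38
    X0 = 0;   X1 = 365        # graph box, data coordinates (day, kelvin)
    Y0 = 235; Y1 = 280
  }
  $1 == r && $2 == g && $3 == b {
    x = (NR - 1) % width          # pixel column
    y = int((NR - 1) / width)     # pixel row
    if (x >= x0 && x <= x1 && y >= y1 && y <= y0) {
      day  = X0 + (x - x0) * (X1 - X0) / (x1 - x0)
      temp = Y0 + (y - y0) * (Y1 - Y0) / (y1 - y0)
      print day, temp
    }
  }'
}
# Usage (hypothetical): pngtopnm meanT_1958.png | pnmascii | genxy 255 0 0
```

Sanity check on the mapping: a pixel at the bottom-left of the graph box comes out as day 0 at 235 K, and one at the top-right as day 365 at 280 K.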
So if I want to generate the digitized curve for 1958, I would type:
anytopnm meanT_1958.png | pnmascii | genXY.awk > meanT_1958.temp.txt
(genXY.awk has the “mask” color in it.)
To generate the climate curve for the same year, type:
anytopnm meanT_1958.png | pnmascii | genXY.awk clipB=61 clipT=61 r=0 g=192 b=0 > meanT_1958.climate.txt
This will produce a “scatterplot” version, which isn’t particularly helpful, but you can at least plot it for comparison. (If you connect the lines, it will give you something really nasty.)
To get a usable x,y series of points, here is the command I run:
pngtopnm meanT_1958.png | pnmascii | genXY.awk | sort -n | averageBuffers.awk > meanT_1958.txt
“sort -n” sorts the file in numerical order (by day of year). “averageBuffers.awk” is a standard program of mine that averages all “y” values for the same “x” value.
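averageBuffers.awk isn’t shown, but as a guess at an equivalent (assuming sorted “x y” input, which “sort -n” guarantees), the averaging step can be as simple as:

```shell
# Hypothetical stand-in for averageBuffers.awk: average all y values that
# share the same x value. Input must already be sorted by x.
avgbuf() {
  awk '
  { if (NR > 1 && $1 != px) { print px, sum / n; sum = 0; n = 0 }
    px = $1; sum += $2; n++ }
  END { if (n > 0) print px, sum / n }'
}
# Usage (hypothetical): ... | sort -n | avgbuf > meanT_1958.txt
```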
This is what the output file should look like:
For 2011, you use the command:
pngtopnm meanT_2011.png | pnmascii | genXY.awk clipB=23 clipT=23 y0=347 y1=13 | sort -n | averageBuffers.awk > meanT_2011.txt
Note you just add the new values for the top, bottom and clip box for the years 2008…2011.
After having done this, the next thing I do is cubic spline these files into 0…365 (starting at 0.5 and ending with 364.5), then I simply concatenate them to generate the final output.
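As a rough stand-in for that resampling step — using simple linear interpolation rather than Carrick’s cubic spline, so treat it as illustrative only — the half-day grid can be produced like this:

```shell
# Resample a sorted "x y" series onto the grid 0.5, 1.5, ..., 364.5 using
# linear interpolation (a simpler stand-in for the cubic spline described above).
resample() {
  awk '
  { x[NR] = $1; y[NR] = $2 }
  END {
    n = NR
    for (k = 0; k < 365; k++) {
      t = k + 0.5
      i = 1
      while (i < n && x[i+1] < t) i++   # find the bracketing input points
      if (t <= x[1])      v = y[1]      # clamp below the first point
      else if (t >= x[n]) v = y[n]      # clamp above the last point
      else v = y[i] + (y[i+1] - y[i]) * (t - x[i]) / (x[i+1] - x[i])
      print t, v
    }
  }'
}
# Usage (hypothetical): resample < meanT_1958.txt
```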
All of this is certainly doable within the framework of the R language. Maybe somebody who actually likes programming in R can do this. The nice thing about R is that it is all self-contained; the problem with R is that it isn’t as amenable to being incorporated into scripting languages, or if it is, I haven’t run across how to do it yet.
Hopefully this is of use! For people like Lucia who prefer to “roll their own code” it should be in enough detail to replicate what I’ve done. For anybody else, the script should update the textual dmi file until my server dies, I die or some combination of the two, or DMI makes the ascii versions publicly available…
As I said above, I was originally interested in how hard it would be to automagically pull the data from graphical images, given that the ascii versions weren’t being publicly provided as a link.
One note added to comment #81579 (if it ever makes it out of purgatory, a.k.a. the SPAM filter): I’ve hardwired my awk programs to /usr/local/bin/awk. If you try my shell scripts, you will have to change this to the path where your awk lives. You may also have to type (e.g.):
chmod +x genXY.awk
to make them executable.
Secondly, depending on how you have your path set, you may need to run them as “./genXY.sh”. E.g., this:
pngtopnm meanT_2011.png | ./pnmascii | ./genXY.awk clipB=23 clipT=23 y0=347 y1=13 | sort -n | ./averageBuffers.awk > meanT_2011.txt
(Or just add “.” to your path, put it at the end and there’s no risk associated with it.)
Michael-
09,14,2011,4655000
09,15,2011,4700000
It’s up again. Looks like Neven was likely right calling the minimum. Since I post on Wed, I’ll wait until then to officially concur. 🙂
I suspect all the scripts can be incorporated into R. They usually can; I’m just not a guru yet.
Thanks!
Lucia:
It’s against my principles to make an early call for a minimum in something that is so very unpredictable, but I think every meteorologist who saw the persisting blocking high pressure dome probably quietly thought “we’re there.”
Since most of the ice extent change comes from mechanical atmospheric-ocean interactions, the presence of this high basically puts a “clamp” on the main mechanism for redistribution/loss of ice.
Still I think you’re wise to wait a few more days to make the call.
lucia:
If I could figure out how to read a png directly into R, I could probably muddle through the rest.
By the way, I mangled the sentence explaining what a pipe is. It should have read:
pngtopnm meanT_2011.png | pnmascii > meanT_2011.txt
(NOTE: For non-UNIX users, the “|” is the “pipe” symbol; it connects the output of pngtopnm to the input of pnmascii.)
Ah, OK. Here’s a package which appears to do this.
Now I just have to figure out how you add compiled packages to R. As I said, I’m basically an R-language noob.
Another new record sea ice low, although I’m probably the only one keeping track. The 365 day running average of Cryosphere Today global sea ice area was 17.644 Mm² today compared to the previous low in 2007 of 17.6452 Mm². That 2007 low was caused mainly by the very late freeze of Arctic ice in 2007. That doesn’t seem to be happening this year. CT Arctic area took a big jump up today and MASIE was up as well.
Lucia, looking at the rapid rate of growth of
Extent
http://www.ijis.iarc.uaf.edu/seaice/extent/AMSRE_Sea_Ice_Extent_L.png
and, especially, Area
http://www.ijis.iarc.uaf.edu/seaice/extent/AMSRE_Sea_Ice_Area.png
could I ask, somewhat tongue in cheek, if this is a rate of increase record for September?
Gras–
For JAXA which only goes back to 2002? Or including GSFC? The answer is probably no either way, but I’d have to check.
The five-day increase was very high compared to the average, but not record-setting.
Further to the issue of fitting an accelerating trend (or a polynomial) to the sea ice minimum, as Neven, Serreze and Stroeve have done, which could imply no summer sea ice in 2030 or 2045 or some year in the near future:
Here is what that looks like if we extend the trends forward “and backward”. This results in all kinds of weird trends that did not happen and/or do not signal a global-warming-driven acceleration.
2nd order poly and then a 3rd order poly.
http://img198.imageshack.us/img198/7831/nhseastrpoly.png

http://img839.imageshack.us/img839/5459/nhseastr3poly.png

Bill–
That’s why extrapolating based on curve fits is dangerous. I only do it to reality check people who are over-confident about betting on the future. The trend based on a curve fit *might* persist for a while. But it might not.
Assuming this isn’t a fluke, it looks like this is the earliest upturn in the JAXA data.
Re: BarryW (Sep 17 19:50),
There’s some reason to believe it’s a glitch in the passive-microwave-based satellite data. MASIE, which primarily uses visual imagery as well as passive microwave, reached a minimum on day 257, five days later than JAXA. It also hasn’t shown the rapid recent uptick in extent that has been apparent in the JAXA extent and CT area data. JAXA extent was also down ~20,000 km² yesterday after being up for six consecutive days, and it looks to possibly be in a cyclic pattern that will produce a few more down days. But who knows. That’s the fun thing about sea ice: only the gross features are anything like predictable.
There is an unusual up, down and back up again cycle in the daily change of sea ice extent over day 257 to day 261 (Sept 17).
While there is some variability in the average changes by day, this up, down, back up cycle shows up clearly enough in the average over 31 years, and is being repeated in 2011, that it is probably a feature of the NASA algorithm.
There is also an unusual bump on day 182 and on day 274, probably also related to the algorithm, which perhaps MASIE doesn’t share.
http://img29.imageshack.us/img29/7451/nhsiedmrsept1711.png
Bill–That’s interesting!
Carrick officially gets mad props from Mosh.
That, my friend, is a hella cool hack.
Good to see you found Simon’s package in R.
Lucia, give Simon’s package a try. Simon haunts the R help list, so if you have any problems he can help.
You should also be able to do it with Paul Murrell’s new addition to graphics (raster). So if you pose a nice question to the R list, Paul may fly in to help; he likes problems like this.
I’m deep in some other stuff, if I get bored or stuck I’ll look at it.
Thanks Steven. As I said, I’m a complete noob at R or I’d just sit down and do the translation myself.
(If it did a better job than it does on the numerical aspects, I’d use it, but there are some issues that I find a bit disturbing. See this link at Jeff’s blog and this one at Jstults. for the backstory on that comment.)
Arctic temps from ERA Interim
http://gfspl.rootnode.net/klimat/arctic/
http://gfspl.rootnode.net/klimat/arctic/arctic_ts.txt
Hey Lucia! I guess I will post this here, but I was wondering when you’ll do the finalized post on ice extent and who won and so forth. I’ve been enjoying your series and calculations, so it’ll be fun to see how it all turned out in the end (as the ice is obviously well on its way to regrowth now).
Ged– Not anticipating an early minimum, I’d set the date in the rules. The official announcement is slated for Oct 3:
http://rankexploits.com/musings/2011/sea-ice-bets-bet-on-aug-sept-7-day-minimum/
That gave sufficient time “just in case” Mars attacked and humongo-normous amounts of ice mysteriously vanished on Sept. 29 and 30, and it also permits time for JAXA to revise, as they occasionally do! (I’ve been avoiding pre-announcing.)
Sweet! I hadn’t noticed that date; looking forward to seeing what comes, barring Martian invasion!