Nov 25 2009
Update: Another software engineer concludes the CRU code shows clear data manipulation: cooling off the 1940’s warm period and warming up the currently cooling period. – end update
I can now understand why Jones and Co. were so resistant to providing code and data – it would not take long for the army of skilled skeptics with backgrounds in science, engineering, math and programming to unravel the truth. And now that the dirty laundry is in the hands of hundreds of sharp minds on the internet, we are discovering the depth and breadth of the AGW con.
I have been off work for a couple of days now, wading through all the details, but in this race there are too many good people doing good work, so I will link to those who are credible in their analysis.
I have seen inklings of how bad the CRU code is and how it produces just garbage. It defies the garbage in, garbage out paradigm and moves to truth in, garbage out. I get the feeling you could slam this software with random numbers and a hockey stick would come out the back end. There are no diurnal corrections for temperature readings; there are all sorts of corrupted, duplicated and stale data; there are filters to keep out data that tells the wrong story; and there are create_fiction subroutines which create raw measurements out of thin air when needed. There are modules which cannot be run for the full temperature record because of special code used to ‘hide the decline’.
For details on some of the smoking guns found in the CRU code, check this out:
I ran across warnings that a particular module “Uses ‘corrected’ MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”
What exactly is meant by “corrected” MXD, you ask? Outstanding question — and the answer appears amorphous from program to program. Indeed, while some employ one or two of the aforementioned “corrections,” others throw everything but the kitchen sink at the raw data prior to output.
For instance, in subfolder “osborn-tree6\mann\oldprog” there’s a program (Calibrate_mxd.pro) that calibrates the MXD data against available local instrumental summer (growing season) temperatures between 1911-1990, then merges that data into a new file. That file is then digested and further modified by another program (Pl_calibmxd1.pro) which creates calibration statistics for the MXD against the stored temperature and “estimates” (infills) figures where such temperature readings were not available. The file created by that program is modified once again by Pl_Decline.pro, which “corrects it” – as described by the author – by identifying and “artificially” removing “the decline.”
But oddly enough – the series doesn’t begin its “decline adjustment” in 1960 – the supposed year of the enigmatic “divergence.” In fact, all data between 1930 and 1994 are subject to “correction.”
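To make the calibrate-and-infill pattern described in that excerpt concrete, here is a minimal Python sketch (the actual CRU programs are in IDL, and this uses synthetic data, not theirs): fit the proxy values to instrumental temperatures over an overlap window, then use the fit to “estimate” temperatures where readings are unavailable.

```python
# Illustrative sketch (NOT the CRU code) of calibrating a proxy
# series against instrumental temperatures, then infilling.
# All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2000)
# Synthetic instrumental temperatures with a slight trend
temps = 14.0 + 0.01 * (years - 1900) + rng.normal(0, 0.2, years.size)
# Synthetic proxy (stand-in for MXD) correlated with temperature
mxd = 0.5 * temps + rng.normal(0, 0.1, years.size)

# Calibration window, mirroring the 1911-1990 window quoted above
mask = (years >= 1911) & (years <= 1990)
slope, intercept = np.polyfit(mxd[mask], temps[mask], 1)

# Pretend post-1990 instrumental readings are unavailable and
# "estimate" (infill) them from the calibrated proxy
missing = years > 1990
estimated = slope * mxd[missing] + intercept
print(estimated.round(2))
```

The point of the sketch is only the structure: once a calibration is in hand, any span without instrumental data can be filled with model output that inherits whatever biases the calibration carries.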
“Correction”? Adds a whole new meaning to the phrase “Politically Correct”. I am now very positive the cover-up will be proven by the code – the very code Phil Jones and others were willing to destroy before making it public.
And one can truly understand those feelings.
After all, who wants to go down in history as the folks who went to jail for the scientific scandal of the century? Sort of reminds me of Galileo’s challenges when he defied the High Priests of his time with real science and showed the Earth was not the center of the Universe. Seems CRU and IPCC are not the center of all intelligence on Earth either.
Update: As I noted in my earlier post, the cover-up to make AGW real meant CRU had to reduce the ‘1940’s blip’ which is seen worldwide in the CRU temp data. This period is as warm or warmer than today. One way to take raw temp data (as shown in the previous post) and squelch the temperatures is to run tree ring proxies over it (Keith Briffa’s apparent job). It seems tree rings alone were not enough, as Marc Sheppard at American Thinker (linked above) discovered:
In 2 other programs, briffa_Sep98_d.pro and briffa_Sep98_e.pro, the “correction” is bolder by far. The programmer (Keith Briffa?) entitled the “adjustment” routine “Apply a VERY ARTIFICAL correction for decline!!” And he/she wasn’t kidding. Now, IDL is not a native language of mine, but its syntax is similar enough to others I’m familiar with, so please bear with me while I get a tad techie on you.
Here’s the “fudge factor” (notice the brash SOB actually called it that in his REM statement):
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75 ; fudge factor
These 2 lines of code establish a 20-element array (yrloc) comprised of the year 1400 (base year, though it’s not clear why it’s needed here) and 19 years between 1904 and 1994 in half-decade increments. Then the corresponding “fudge factor” (from the valadj array) is applied to each interval. As you can see, not only are temperatures biased to the upside later in the century (though certainly prior to 1960) but a few mid-century intervals are being biased slightly lower. That, coupled with the post-1930 restatement we encountered earlier, would imply that in addition to an embarrassing false decline experienced with their MXD after 1960 (or earlier), CRU’s “divergence problem” also includes a minor false incline after 1930.
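To see what those two IDL lines actually produce, here is a Python translation (a sketch of the quoted snippet only, not the surrounding CRU program; the yearly time axis is an assumption): the 20 hard-coded values are interpolated across the years, giving a per-year adjustment curve.

```python
# Python rendering of the quoted IDL "fudge factor" lines.
# np.interp plays the role of IDL's interpol(); the yearly
# axis (1904-1994) is assumed for illustration.
import numpy as np

# Values copied verbatim from the quoted code
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904.0))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

years = np.arange(1904, 1995)
yearlyadj = np.interp(years, yrloc, valadj)

print(yearlyadj[years == 1940][0])  # slightly negative: mid-century pushed down
print(yearlyadj[years == 1994][0])  # 1.95 (= 2.6 * 0.75): late century pushed up
```

Running this shows the shape described above: zero adjustment early on, a small dip around the 1930s-40s, then a steep climb to a flat maximum of 1.95 from the mid-1970s onward.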
So we have the smoking gun. Here we have the hardcoded values showing how you can turn this (the green box highlights the 1940’s blip):
Into this, where there is only a shadow of the 1940’s blip left:
The data is hard coded right there! The early negative numbers push the curve down in the 1930-40 period, and the later positive numbers shift the current-day temps up high, making it LOOK like today is significantly warmer than the 1940s (which the raw CRU data does not show!).
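That before-and-after effect can be demonstrated numerically. The sketch below (synthetic data, not CRU’s; the Gaussian “blip” shape is an assumption for illustration) builds a series whose highest point is a 1940s peak, then adds the interpolated hard-coded adjustment:

```python
# Demonstration (with synthetic data) of the net effect of the
# hard-coded adjustment: a series peaking in the 1940s ends up
# peaking in the late 20th century instead.
import numpy as np

yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904.0))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

years = np.arange(1904, 1995)
blip = np.exp(-((years - 1940) / 10.0) ** 2)  # synthetic 1940s peak, max 1.0
adjusted = blip + np.interp(years, yrloc, valadj)

print("peak before adjustment:", years[np.argmax(blip)])          # 1940
print("1994 exceeds 1940 after adjustment:",
      adjusted[years == 1994][0] > adjusted[years == 1940][0])    # True
```

Before the adjustment the 1940 value is the maximum; after it, the flat +1.95 tail dominates and every year from the mid-1970s on sits well above the damped 1940s peak.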