Jan 25 2011
The two most obvious examples of overreach by the left have been the government takeover of the US health care system and the cries of gloom and doom from Al Gore and the IPCC over AGW. Sadly, the former issue has a lot of process and economic theory behind it, and it won’t be until people start to feel the pinch of government rationing (a.k.a. death panels) that we will see a lot of clear movement in opposition.
But the myth of Anthropogenic Global Warming (a.k.a. human caused global warming) is an easy myth to bust. First off, there has been no warming for over a decade, and we are clearly in a long term cooling pattern. The only ‘warming’ is found in data ‘adjusted’ to smear a few warm spots over massive areas and ignore available satellite data showing NO warming in those areas. These data games are over. Now they just need to be exposed to the general public.
The other nail in the coffin of AGW is that people are finally waking up to the uncertainty, or error, in the data. The CRU hockey team has had to admit publicly there is no way to use tree rings (or other organic proxies) to determine whether the Medieval or Roman warm periods were significantly warmer than, cooler than, or about the same as today’s climate. The fossil and historic record of places like GREENland, however, does clearly indicate there were much warmer periods on Earth than this last decade or two. The ‘decline’ that had to be hidden by the CRU team and Michael Mann was the divergence between tree rings and the current warm period, which introduced enormous error bars onto the proxies. It seems trees truly are poor thermometers.
But we still have the past 130 years of land temperature records to deal with. Besides the fact that the raw measurements show no significant warming (only adjusted data shows warming, data adjusted by proponents of AGW), the question is whether there really is sufficient accuracy in the record to detect the claimed sub-degree (°C) increase over the last century.
The answer is of course clearly “no”.
We have the smoking gun from CRU itself: their own error estimate showing that, for any given year, the temperature values for their global grids can be off by as much as 4°C!
This data comes from a CRU report exposed during Climategate, a report which maps nicely to a recent post at WUWT discussing in fine detail the errors inherent in the land instrument record.
This post concludes analytically what I have been saying for years – the errors in the data and the integration of the data make it impossible to obtain accuracies below 1°C in any year, let alone over a century. There is also a good debunking, by Willis Eschenbach, of the ancillary myth that multiple measurements increase precision:
Fifth, the law of large numbers (as I understand it) refers to either a large number of measurements made of an unchanging variable (say hair width or the throw of dice) at any time, or it refers to a large number of measurements of a changing variable (say vehicle speed) at the same time. However, when you start applying it to a large number of measurements of different variables (local temperatures), at different times, at different locations, you are stretching the limits …
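Eschenbach’s distinction can be illustrated with a short simulation (a hypothetical sketch with invented numbers, not from the WUWT post): averaging many readings of one unchanging quantity shrinks the error of the mean, while one reading each of many different quantities leaves every individual value just as uncertain as a single reading.

```python
import random
import statistics

random.seed(0)
NOISE = 2.0  # assumed per-reading instrument error, std dev in degrees C

# Case 1: 400 readings of ONE unchanging quantity.
# The error of the mean shrinks roughly as 1/sqrt(N).
true_temp = 15.0
readings = [true_temp + random.gauss(0, NOISE) for _ in range(400)]
err_same = abs(statistics.fmean(readings) - true_temp)

# Case 2: ONE reading each of 400 DIFFERENT quantities
# (different stations, different days). The average of the readings
# says nothing more precise about any individual station: each
# station's value still carries the full single-reading noise.
station_temps = [random.uniform(-10, 30) for _ in range(400)]
station_readings = [t + random.gauss(0, NOISE) for t in station_temps]
per_station_errs = [r - t for r, t in zip(station_readings, station_temps)]
rms_per_station = (statistics.fmean(e * e for e in per_station_errs)) ** 0.5

print(f"400 readings of one station: mean off by about {err_same:.2f} C")
print(f"1 reading each of 400 stations: each still off by ~{rms_per_station:.2f} C RMS")
```

The point is that the 1/sqrt(N) improvement only applies to repeated measurements of the same thing; it does not sharpen our knowledge of any one of 400 different things measured once each.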
I commented at WUWT on this using a hard example from satellites. Satellite orbits decay slowly because (a) the force of gravity dominates their motion, since they orbit outside our atmosphere, and (b) the remaining forces are so small it takes time for their actions to perturb the theoretical orbital flight path described by Newtonian and Keplerian physics. What this means is that after about a week you need to remeasure the orbit to ‘know it’ to a certain level of precision (say a few tens of meters).
If we make 20 measurements close in time (less than an hour) from one position, we can begin to correct for all the drift and remove most of the error that has built up from the various dynamic and random forces over the past week. If we make 20 measurements from 2 different sources over a short time we can really gain precision, because the offset geometries cancel out errors along the path of the measurements. This is how more measurements remove uncertainty and gain precision beyond a single measurement.
But this requires a few prerequisites to work. First off, the system being measured must be stable or very slow in its dynamics. This is of course true for satellites over an hour, but not true for a local temperature over a day. Fronts, rain or sunshine, seasonal changes, etc. all make temperature very dynamic and non-linear.
Second, it requires the measurements to be taken close together in time. If we take 20 measurements of a satellite’s position over 20 days instead of an hour, we will never resolve the orbit of that satellite. Never. That is because the measurements are not close enough in time to drive out uncertainty.
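That temporal-closeness requirement can be sketched numerically (the drift and noise magnitudes here are invented for illustration): when the measured quantity drifts between readings, the average converges on the mean of the drift rather than on the current value, so the error stops shrinking no matter how many readings you take.

```python
import random
import statistics

random.seed(1)
NOISE = 2.0  # assumed per-reading noise, std dev

def rms(errors):
    return (statistics.fmean(e * e for e in errors)) ** 0.5

# Case 1: N readings of one UNCHANGING quantity, taken close together.
# The error of the mean falls roughly as 1/sqrt(N).
def mean_error_fixed(n, trials=300):
    errs = []
    for _ in range(trials):
        readings = [20.0 + random.gauss(0, NOISE) for _ in range(n)]
        errs.append(statistics.fmean(readings) - 20.0)
    return rms(errs)

# Case 2: N readings of a quantity that DRIFTS between readings
# (like a satellite position sampled over 20 days, or a local
# temperature sampled across a passing front). The average settles
# on the midpoint of the drift, not the current value, so the
# error against the current value stops shrinking.
def mean_error_drifting(n, drift_total=5.0, trials=300):
    errs = []
    for _ in range(trials):
        readings = [20.0 + drift_total * i / (n - 1) + random.gauss(0, NOISE)
                    for i in range(n)]
        current_true = 20.0 + drift_total  # true value at the last reading
        errs.append(statistics.fmean(readings) - current_true)
    return rms(errs)

for n in (10, 100, 1000):
    print(f"N={n:>5}: fixed target ~{mean_error_fixed(n):.3f}, "
          f"drifting target ~{mean_error_drifting(n):.3f}")
```

In the drifting case the residual error is dominated by the drift itself (here about half of `drift_total`), which no amount of extra averaging over the same window can remove.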
This assumed precision from sparse, scattered measurements is one of the foundations of AGW theory. And it is (as usual) wrong.
I cannot wait to see if the GOP can put experts up in committee hearings and let them explain the numerous errors and unfounded claims of the AGW ‘science’ – because it is barely science at all. There is a reason a good chunk of the ‘skeptics’ are engineers. We are the ones who take scientific concepts and make them real. While a scientist may have a theory about what can be learned by measuring the ultraviolet light from distant galactic bodies, it is the engineer who makes it possible to build the instrument to fly in space and take the samples that feed the theoretical equations. It is the engineer who always brings the scientist back to Earth.
Sadly, pols are so math- and science-challenged they never really grasp what this is all about, and we end up drawing conclusions based on the dumbest person in the room, not the most capable or experienced. Not to mention all the poseurs out there wrapped in their PhDs who barely get the difference between theory (fantasy) and reality.
When everyone is special, no one is. Which means a pol can challenge an expert with a rare combination of ignorance and arrogance, and no one sees a problem.