Nov 25 2009
There is a well-known problem when research scientists have to interact with the real world: they hate to be bothered with rules, regulations, laws, etc. At NASA the lowest quality code you will find is in the science processing chain. It is just not held to the same quality standards as the operational code that runs all the flight and ground HW.
For example: you can’t have PhD-quality code launching massive rockets over this country’s large cities along the eastern coast. You can’t have PhD-level code controlling large antennas, since you don’t want them to be destroyed as you move these massive machines, or to fry someone by turning them on while people are working on them. You don’t want PhD-quality code landing the Space Shuttle. It’s that simple, and no NASA PhD will argue that point.
The truth is the science teams don’t get enough funds to do it right, but that is only half the problem. The other half is the scientists like to write crappy code only they can use – it creates a lot of job security. For much of science this is a livable and reasonable arrangement. Let the PhDs dabble in exploring the unknown, and leave the designing, operating and safety of large complex systems (which can kill lots of people if things go wrong) to lesser people – like engineers.
When the global warming canard migrated from niche research into trillions of dollars of policy changes affecting every human being on the planet, the PhD level of quality control should have been ejected immediately. With the fate of humanity at risk, it is not too much to ask for professional-quality code, analysis, and a true peer review process. Not that silly science journal review process by the good ol’ PhD network, but a real review like we do when we launch people into space or build a rail system or a new airplane.
I was struck by this excellent review of how the global warming alarmists were reacting to the scientific community attempting to check their work, many of whom have to deal in the real world (a slightly updated version is available here). Some of the emails disclose just how unprepared these so-called scientists were for the real world and its demands for quality and safety. It indicates an almost childlike inexperience. This one caught my eye and caused me to write this post:
Thanks Ben for this, hi all and happy new year. I had a similar experience – but not FOIA since we at Climatic Change are a private institution – with Stephen McIntyre demanding that I have the Mann et al cohort publish all their computer codes for papers published in Climatic Change. I put the question to the editorial board who debated it for weeks. The vast majority opinion was that scientists should give enough information on their data sources and methods so others who are scientifically capable can do their own brand of replication work, but that this does not extend to personal computer codes with all their undocumented sub routines etc. It would be odious requirement to have scientists document every line of code so outsiders could then just apply them instantly. Not only is this an intellectual property issue, but it would dramatically reduce our productivity since we are not in the business of producing software products for general consumption and have no resources to do so. The NSF, which funded the studies I published, concurred – so that ended that issue with Climatic Change at the time a few years ago.
Not to mention no ability to do so. To develop high-quality, bug-free and error-free code takes a lot of infrastructure and training. And it does require documenting code so that teams of reviewers can sift through it and make sure there are no bad surprises. You don’t want to hit a SW “feature” in your car’s computer system as you’re racing 70 MPH through a turn. You also hire testers who run the code in all sorts of configurations to try and detect errors. And if humans could be hurt you hire independent reviewers and testers.
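To make the contrast concrete, here is a minimal sketch of what even the most basic professional discipline looks like: a documented function with input checks, plus tests that pin it against known answers and bad inputs. The function and the numbers are hypothetical illustrations, not anyone’s actual climate code.

```python
import unittest

def trend_per_decade(years, temps):
    """Least-squares temperature trend, in degrees per decade.

    Documented inputs, documented output, and a defensive check --
    the baseline that operational code review demands.
    """
    if len(years) != len(temps) or len(years) < 2:
        raise ValueError("need at least two matching (year, temp) pairs")
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps) / n
    slope = (sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
             / sum((y - mean_y) ** 2 for y in years))
    return slope * 10.0  # per-year slope -> per-decade

class TestTrend(unittest.TestCase):
    def test_known_slope(self):
        # Temps rise exactly 0.02 per year, so the trend is 0.2 per decade.
        years = list(range(2000, 2010))
        temps = [0.02 * y for y in years]
        self.assertAlmostEqual(trend_per_decade(years, temps), 0.2)

    def test_rejects_mismatched_input(self):
        # A tester's job: feed it garbage and make sure it refuses loudly.
        with self.assertRaises(ValueError):
            trend_per_decade([2000, 2001], [1.0])

if __name__ == "__main__":
    unittest.main()
```

Nothing here is exotic – it is the floor, not the ceiling, of what flight and ground software teams do before a single line goes operational.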
This is the real world, where a hypothesis has to be implemented. This is where the alarmists can’t go and can’t be bothered. Check out this lame excuse from your classic elitist PhD:
You and I have spent over a decade of our scientific careers on the MSU issue, Tom. During much of that time, we’ve had to do science in “reactive mode”, responding to the latest outrageous claims and inept science by John Christy, David Douglass, or S. Fred Singer. For the remainder of my scientific career, I’d like to dictate my own research agenda. I don’t want that agenda driven by the constant need to respond to Christy, Douglass, and Singer. And I certainly don’t want to spend years of my life interacting with the likes of Steven McIntyre.
I hope LLNL management will provide me with their full support. If they do not, I’m fully prepared to seek employment elsewhere.
This poor, lazy PhD cannot be bothered, so he will quit. He should quit. He doesn’t have what it takes to develop something for real. It takes sacrifice, painful review and testing. It takes being challenged day in and day out so that what comes out is nearly bulletproof. It takes sweating the little stuff and attention to mind-numbing detail. It takes a team, because no one person can do all or see all. It takes climbing down off one’s ego-throne and doing what it takes to make a high-quality, reliable and safe product.
The alarmist PhD may look impressive to those journalism majors who went into the Lame Stream media pretending they could grasp the complexity of human endeavors, but to those of us doing the endeavoring the CRU group looks like a bunch of rank amateurs who got busted breaking the law trying to cover up their shoddy products.
As the author of the piece notes, if the alarmist ‘scientists’ acted like real scientists and just put out their data and methods (and produced even quasi professional code) they wouldn’t be wasting their lives defending their mistakes. Science and engineering and law are all adversarial systems which provide a crucible from which truth and real solutions can emerge. Failure to go through the crucible is not a trick – it is just failure.