Tuesday, November 4, 2008

When expectations are blinded by education, or why professionals make life harder

 Back in the 1970s, when nuclear power was a far more intriguing concept to the general public as a means of energy production, someone was designing the controls for the reactors at Three Mile Island. Given the relatively simple design of the reactor, the team designing the controls felt no special precautions were needed to explain the operation of those controls or the effect they would have on the reactor. They assumed that, since the Babcock and Wilcox reactors were so stable and safe, the operators would never need to touch the controls.

 This meant anyone could run the plant. "Anyone," apparently, included high school graduates with little to no training on the reactors. They were told they didn't have to worry about the controls, and that the reactors could pretty much run themselves without the operators' interference.

 This was obviously wrong. While the specifics of controlled fission are relatively simple, the systems in place to control the fission of the materials, and the cooling systems that result, are fairly complex. For instance, it is sometimes necessary to adjust the depth of the control rods to increase or decrease the volatility of the reaction. Failure to do so could potentially cause the reactor to melt down.

 In one famous case involving the control rod depth controls, the technicians placed two beer cans of different brands on the control levers so they could tell which lever controlled the depth of the rods.

 Let me clarify: this didn't cause the accident, but it is indicative of the mindset of the design engineers.

 They were highly trained professionals who understood every facet of their jobs, but they failed to understand that the people who would actually run the reactor did not necessarily share that understanding. On March 28, 1979, a small event caused a pressure relief valve to open and, through an error in the warning system, to fail to close. The valve allowed coolant to escape from the TMI-2 reactor, and the system overheated. This might have been avoided had it not been for the deluge of information given to the operators, much of which was irrelevant or erroneous. The reactor had a partial meltdown, and the rest is history.

 This story illustrates a simple fact: well-informed and well-educated people can be ignorant of the fact that the people they build, design, or repair systems and devices for are not always as informed of the particulars as they are.

 The explanations for these errors in judgement are many. One is that knowledge is power: giving more information than is required to deal with the problem at hand confuses the unprepared, while the informed can turn that wealth of information to their advantage. Arrogance can cause it as well, in the belief that operators should understand the systems the engineers design, no matter how complex. Or it may simply be a matter of overestimating the experience and training the user will have in dealing with the system.

 At times, the problem such assumptions create has been addressed well, making seemingly complex operations simple for the user. A good example was the advent of MS-DOS. Bill Gates realized that the system was exactly what the computing industry needed at the time: an accessible, capable, and widely accepted single format for widespread use. This was a revolution for an industry that had, until that point, been using several proprietary operating systems to address the problem of access.

 Simply put, information causes problems both when there is too little of it and when there is too much.


FBOMB said...

did you read this somewhere and copy and paste it or did you come up with this on your own?

Scott Drouin said...

No, this article was my own invention.