"Stop, Think, Don't Do Something Stupid" is the advice that Dr. Robert Bea, Professor of Engineering at the University of California, Berkeley, tries to instill in his engineering students. It is the same advice he would have given to BP and Transocean a few months ago. Dr. Bea, who was interviewed on 60 Minutes this past May, has investigated the Columbia Space Shuttle disaster for NASA and the Katrina disaster for the National Science Foundation, as well as numerous other oil rig disasters. He is currently investigating the Deepwater Horizon blowout at the request of the White House.
Bea painted a picture of two managers having an argument on the deck of the Deepwater Horizon--before the disaster occurred--about the procedure for capping the well. The BP manager wanted to remove the heavier drilling "mud" prior to capping the well, while the Transocean manager wanted to leave the "mud" in place. The heavy "mud" would act to reinforce the cement plugs keeping the pressurized oil from gushing to the surface. BP prevailed, and the "mud" was removed, even though they knew that the annular rubber rings in the BOP (Blow Out Preventer) might have been damaged, AND that one of the two control pods was not functioning properly. The rest is history.
Bea believes that BP took this shortcut to save time later, when the well was activated. Eventually the "mud" would have to be removed to put the well into production, and Bea felt that BP wanted to remove it as early as possible to save time--and money--down the road.
Mechanical systems failed--but once again it was the human decision-making process that was the ultimate, and preventable, cause. I will guarantee that BP and Transocean had volumes of safety procedures, as well as standard operating procedures. Unfortunately, it all boiled down to a human decision, made under time and money constraints, resulting in a gamble to take shortcuts. When you gamble big, sometimes you lose big.
Space Shuttle Columbia, Katrina, BP--all underpinned by a huge technological database of information, all subverted by the human decision-making process. The result of BP's Deepwater Horizon blowout will be the usual legislation, laws, standards, and regulations that always follow a disaster. But the next black swan disaster will be brand new, and likely not covered by any newly enacted legislation. Instead, I propose that we need to start paying attention to the "humanness" that makes future disasters inevitable. The bad news is: as our society becomes more complex, these disasters impact more people, impact more of the environment, and cost billions of dollars that could be spent on more important things. The good news is: we each have the ability to fine-tune our mental models for decision making, and hopefully avoid future catastrophes. We need to learn to control the hard-wired humanness that encourages us to take these disastrous mental shortcuts in our thinking process.
The BP disaster was not a failure of technology--it was a failure of human thinking patterns.
We can do better!
Dr. Bea is an engineer who investigates the technical aspects of these disasters. I am going to try to contact him, to see if I can interest him in investigating the "humanness" aspect of disaster prevention. By improving our technology systems along with our decision-making skills, perhaps we can start preventing future disasters.