Thursday, November 21, 2019
The Ethical Lessons of Deepwater
For engineers, playing it safe is never the easy way out.

Early December 2010 saw the release of two reports issued by groups tasked with deconstructing the deadly and devastating Deepwater Horizon spill that occurred in the Gulf of Mexico. The Deepwater Horizon Oil Spill and Offshore Drilling commission, appointed by President Obama, and the Deepwater Horizon Study Group (DHSG), formed by members of the Center for Catastrophic Risk Management (CCRM) at UC Berkeley, pointed toward many of the same failures. The DHSG's 60 university professors, accident investigators, petroleum engineers, social scientists, environmental advocates, and directors of research centers did go one step further, however, by directly linking mismanagement by the well's owner, British Petroleum, with its drive for profit: "Analysis of the available evidence indicates that when given the opportunity to save time and money (and make money), tradeoffs were made for the certain thing (production) because there were perceived to be no downsides associated with the uncertain thing (failure caused by the lack of sufficient protection)."

With oil and gas development in the deep waters of the Gulf, Arctic, and other new frontier areas set to continue, the DHSG also contends that the risks of such exploration and production pose "likelihoods and consequences of catastrophic failures that are several orders of magnitude greater than previously confronted."

Pondering the Worst Case

The prospects of failures far more severe are chilling. Yet Lehigh University Professor John Kenly Smith, a chemical engineer who specializes in the history of technology, believes that forcing stakeholders to ponder the absolute worst is the only way to grapple with what is really at stake. "If you are going to work in an environment where it's physically impossible to go down there and get your hands on the technology, you really have to think of the unthinkable, and nobody wants to do that," says Smith. "I'll bet every day on that platform there were engineers thinking, 'If we have a blowout on this thing, what will we do?'"

What have we learned in the months since the worst that could happen, in fact, did? Perhaps not much that is new, says Smith, who believes some of the safety failures that led to the disaster stem from what is predictably human and imperfect in all of us. What is also clear is that engineers who design and maintain complex systems are in a tough spot. Here, Smith cites a few lessons of the spill:

1. Numbers can be deceiving. There is tremendous pressure in the corporate and scientific worlds to convert uncertainty to risk, says Smith: take an uncertainty, assign it a probability number, then run it through a model to obtain data on how likely a failure might be. The problem, though, says Smith, who during his career in industry investigated a number of serious job-related accidents, is that "999 times, people get away with doing unsafe things, and it's only the 1,000th time that something horrible happens." (A back-of-the-envelope illustration of that arithmetic appears at the end of this post.)

2. Safety has to be hardwired into a firm's SOP. Smith cites the success of companies like DuPont (the subject of a book he coauthored, Science and Corporate Strategy: DuPont R&D, 1902-1980) with rewarding teams with the best safety records. "You have to really drill it into people and create counterincentives that make them stop and say, 'Will I cost everyone their prize if I get hurt?'" A hard-core safety-first stand also can relieve the tension between the line functions that bring in the money and the staff people (i.e., engineers who raise the red flags). This is where ethics come in, says Smith: the staff functions and engineers need to have the clout to make themselves heard.

3. Simplicity has its virtues (i.e., technical controls can create a false sense of security). The jury is still out on why the Deepwater Horizon blowout preventer failed. Even if results of the investigations lead to future fixes, blind faith in technology can be dangerous, Smith warns: "When facing a problem there's a tendency to add equipment, like a blowout preventer, and think, 'Problem solved.' In this case, it didn't work." Additionally, he cautions, adding complexity to a system can inject more ways the pieces of the system can interact and produce unpredictable outcomes.

4. Think broadly. As the saying goes, it's not enough to guard against the failures that have already occurred; those that haven't happened yet are the ones to fear most. Organizations need to prepare for the unthinkable, and when the unthinkable happens, go beyond devising ways to keep that particular failure from happening again. "Engineers are taught to be problem solvers rather than broad thinkers," says Smith. When something goes wrong, the focus should be on what was the thinking that got us into this situation.

5. Know where you work. Engineers, says Smith, have always faced one central dilemma: Are we independent professionals who provide objective assessments based on our training and ethics? Or are we employees who do what the boss says? Clearly society needs the former, and because of that, he contends, it's important to know an organization's history before you join it. "As a young engineer buried down at the bottom of an org chart, you might not see much or really know what a place is about." But studying up on who's running the company and the values it was founded upon can provide important clues about what to expect when it's time to take a tough stand.

Marion Hart is an independent writer.
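Smith's 1-in-1,000 remark is easy to make concrete. The short Python sketch below is not part of the original article; the per-operation failure probability and the assumption of independent events are illustrative only. It shows how a risk that feels negligible on any single job compounds toward near-certainty over many repeated operations.

# Illustrative only (not from the article): how a perceived "1-in-1,000"
# chance of failure per operation compounds over repeated operations.
# Assumes independent events and uses Smith's round rhetorical figure.

def prob_at_least_one_failure(p_per_op: float, n_ops: int) -> float:
    """Probability of at least one failure across n_ops independent operations."""
    return 1.0 - (1.0 - p_per_op) ** n_ops

if __name__ == "__main__":
    p = 1.0 / 1000.0  # the risk people "get away with" 999 times out of 1,000
    for n in (10, 100, 1000, 5000):
        chance = prob_at_least_one_failure(p, n)
        print(f"{n:>5} operations: {chance:.1%} chance of at least one failure")
    # Prints roughly 1.0%, 9.5%, 63.2%, and 99.3%: a hazard that is invisible
    # on any single job becomes close to certain over a long drilling program.

The numbers are hypothetical, but the shape of the curve is the point: each individual "we got away with it" reinforces the perception that the downside does not exist, even as the cumulative odds of catastrophe keep climbing.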