Tuesday, May 11, 2010

Offshore Oil Regulation: Who's To Judge?

As Transocean and British Petroleum try to lower yet another big box over the underwater gusher that threatens to turn many Gulf beaches into hazardous-waste sites, President Obama is talking about taking a hatchet to the U. S. Minerals Management Service, the agency that both oversees many aspects of mining and well-drilling and collects (or is supposed to collect) fees from private entities that have permission to mine or drill on government land or in government waters. Clear? Well, the conflict the President sees is that the people who stand to benefit (or at least to make their agency look good) from lots of drilling and the royalties derived therefrom are the same folks who are supposed to play policeman and make sure all this is done safely. Splitting the agency into enforcement and collection halves is a nice idea on paper and gives politicians a sense that they're doing something about the problem, but it may just paper over a deeper question: how do you regulate something so complicated that only the people who do it really understand it?

Time and again, reporters have shown how the government regulators of many industries, from petroleum to communications to finance, either are former employees of the very firms they are charged with regulating or (what seems even worse) rely on the companies they regulate to do the actual inspecting, taking their word that things are going well.

On the surface, this kind of thing looks bad. We all feel that a person who has depended on a particular organization or industry for their livelihood will be prejudiced in favor of that entity, even after entering government service to regulate the very business they used to work for. So what is the alternative?

The only way to get rid of all possible prejudice of this kind is to select regulators who have no association whatsoever with oil wells, or radio stations, or banks, or whatever the target of the regulator's scrutiny is. But right away we run into a problem: if you've never drilled a well, or run a radio station, or worked in a bank, can you know enough to regulate it?

Sometimes, maybe so. My father was a banker, and every year or two he'd come home complaining about the upcoming visit of the bank examiners. He never told me what their backgrounds were, but I imagine that back then, a degree in general accounting was probably okay. But if a banker was determined to pull a fast one, it seems like it would be better if the fellow trying to catch him in the act had actually stood in his shoes and learned all the little details of procedure and so on that allow clever nefarious schemes to succeed. It's the old "it takes a thief to catch a thief" idea (no aspersions on bankers intended).

The same problem arises to the nth degree in a highly technical field such as offshore oil production. As I learned long ago when I once thought my Ph. D. in electrical engineering qualified me to fix an oscilloscope, nobody knows a system better than the people who work inside it day to day. So handing my scope over to a technician with a two-year degree and five years of experience fixing just that kind of scope is going to work a lot better than my trying to fiddle with the thing. There is a lot of what chemist and philosopher Michael Polanyi called "tacit knowledge" out there: stuff that you can't find in books, but which is essential to the proper functioning of machines, systems, and organizations. And nothing teaches tacit knowledge like experience.

This same issue has arisen in questions about why the U. S. government or the Coast Guard or the Marines or somebody with a uniform hasn't been called in to fix the Deepwater Horizon oil spill, instead of letting the same doofuses who broke it in the first place try to fix it. The simple answer is that those "doofuses" happen to be the world experts on this kind of thing, and even experts foul up every now and then. Asking the government to shove the private owners aside and step in would mean pushing away the best expertise we have, and that would be simply stupid.

In the attempts to fix the Deepwater Horizon spill, we may be witnessing the working out of a kind of failure that results not just in shifts in government bureaucracies, but in fundamental technical changes that render a whole industry safer and better equipped to do its job in the future. Engineer and historian Henry Petroski has shown how certain failures in nineteenth-century iron bridges closed down whole avenues of design and opened up others. Despite what they (we?) teach you in school, you can sometimes learn more from failures than from successes. Once that well is capped, or plugged, or committed to perdition some way or other, and all the hearings are over and the reports written, we will know a lot more about how this accident happened, and how blowout preventers with double and triple backups can nevertheless fail. But the best people to learn this stuff are the very ones who are going to go out and do it better next time. All the government regulators you can hire straight out of school are never going to know quite as much as the experts they regulate, so the answer is not simply more regulation, but smarter regulation and smarter engineering. Let's hope we get both.

Sources: The New York Times carried an article about President Obama's plans at http://www.nytimes.com/2010/05/12/us/12interior.html?hp. Henry Petroski's To Engineer Is Human: The Role of Failure in Successful Design (Vintage, 1992) is still in print, and a good treatment of just what the title says.

1 comment:

  1. Great article! I would like to say two things:

    1) I believe this quote (I do not know the source) sums up the situation very well:

    "The profession of structural engineering has been characterized as the art of moulding materials we do not really understand into shapes we cannot really analyse, so as to withstand forces we cannot really assess in such a way that the public does not really suspect."

    The "in ways the public does not really suspect" part especially summarises the situation.

    The fact is, engineering is complicated, and whilst regulation is essential, it is impossible for anyone to understand all of the intricate modes of operation, failure, interactions and fixes as well as the designer does. The public have a right to regulation and protection, but we also have to accept that things go on behind the scenes that we are completely ignorant of. That is why engineering is a profession, and it is why we need ethics.

    2) Looking back at failure is absolutely essential. I have really enjoyed Petroski's books and his message is universal. I gave a lecture recently to a first-year class on "ensuring safety in design and operations", and one of my closing remarks on how to avoid creating disasters was "cases, cases, cases".

    Matt B