Quotulatiousness

July 3, 2023

Nuclear power

Filed under: Books, Bureaucracy, Government, History, Science, Technology — Nicholas @ 05:00

One of the readers of Scott Alexander’s Astral Codex Ten has contributed a review of Safe Enough? A History of Nuclear Power and Accident Risk, by Thomas Wellock. This is one of perhaps a dozen or so anonymous reviews that Scott publishes every year, with the readers voting for the best review and the names of the contributors withheld until after the voting is finished:

Let me put Wellock and Rasmussen aside for a moment, and try out a metaphor. The process of Probabilistic Risk Assessment is akin to asking a retailer to answer the question “What would happen if we let a flaming cat loose into your furniture store?”

If the retailer took the notion seriously, she might systematically examine each piece of furniture and engineer placement to minimize possible damage. She might search everyone entering the building for cats, and train the staff in emergency cat herding protocols. Perhaps every once in a while she would hold a drill, where a non-flaming cat was covered with ink and let loose in the store, so the furniture store staff could see what path it took, and how many minutes were required to fish it out from under the beds.

“This seems silly — I mean, what are the odds that someone would ignite a cat?” you ask. Well, here is the story of the Browns Ferry Nuclear Plant fire of March 1975, which occurred slightly more than a year after the Rasmussen Report was released, as later conveyed by the anti-nuclear group Friends of the Earth.

    Just below the plant’s control room, two electricians were trying to seal air leaks in the cable spreading room, where the electrical cables that control the two reactors are separated and routed through different tunnels to the reactor buildings. They were using strips of spongy foam rubber to seal the leaks. They were also using candles to determine whether or not the leaks had been successfully plugged — by observing how the flame was affected by escaping air.

    One of the electricians put the candle too close to the foam rubber, and it burst into flame.

The fire, of course, began to spread out of control. Among the problems encountered during the thirty minutes between ignition and plant shutdown:

  1. The electricians spent 15 minutes trying to put the fire out themselves, rather than sounding the alarm per protocol;
  2. When they finally decided to call in the alarm, no one could remember the correct telephone number;
  3. Electricians had covered the CO2 fire suppression triggers with metal plates, blocking access; and
  4. Despite the fact that “control board indicating lights were randomly glowing brightly, dimming, and going out; numerous alarms occurring; and smoke coming from beneath panel 9-3, which is the control panel for the emergency core cooling system (ECCS)”, operators tried the equivalent of unplugging the control panel and rebooting it to see if that fixed things. For ten minutes.

This was exactly the sort of Rube Goldberg cascade predicted by Rasmussen’s team. Applied to nuclear power plants, the mathematics of Probabilistic Risk Assessment ultimately showed that “nuclear events” were much more likely to occur than previously believed. But accidents also started small, and with proper planning there were ample opportunities to interrupt the cascade. The computer model of the MIT engineers seemed, in principle, to be an excellent fit to reality.
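
To make the arithmetic behind that claim concrete, here is a minimal event-tree sketch in Python. The initiating-event frequency and the safeguard failure probabilities below are invented placeholders, not figures from the Rasmussen Report; the point is only that a severe outcome requires every barrier to fail in sequence, so its estimated frequency is the product along that branch, and improving any single barrier shrinks the product.

    # Illustrative event-tree arithmetic in the spirit of Probabilistic Risk
    # Assessment. Every number here is an invented placeholder, not a figure
    # from the Rasmussen Report (WASH-1400).

    # Assumed frequency of the initiating event (cable-room fires per reactor-year).
    fire_frequency = 1e-2

    # Assumed conditional probabilities that each successive safeguard fails,
    # given that everything before it has already gone wrong.
    safeguard_failure_probs = {
        "operators fail to sound the alarm promptly": 0.1,
        "fixed fire suppression is blocked or fails": 0.05,
        "emergency core cooling controls are lost": 0.02,
    }

    # A severe outcome requires the initiating event AND every safeguard
    # failing, so the estimated frequency is the product along that branch.
    frequency = fire_frequency
    for safeguard, p_fail in safeguard_failure_probs.items():
        frequency *= p_fail
        print(f"... and {safeguard}: {frequency:.1e} per reactor-year")

    # Interrupting the cascade at any stage (driving any one failure
    # probability down) shrinks the final number proportionally, which is
    # why early, cheap interventions dominate the overall risk estimate.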

As a reminder, there are over 20,000 parts in a utility-scale nuclear plant. The path to nuclear safety was, to the early nuclear bureaucracy, quite simple: Analyze, inspect, and model the relationships among every single one of them.
