The Peltzman Effect: Don’t Get Run Over


[Photo: Road signs in Mistissini, Quebec]

I was driving to work one day and thought,
Me: "Self?"
Self: "Yes?"
Me: "You haven't written a blog post in a while."
Self: "Yeah" (He's not real talkative)
Me: "What's up with that?"
Self: "Dunno"
Me: "Let's try to think of something by the end of the day, mmmmKay?"
Self: "Uh-huh."

Not two minutes later my car and I were at a four-way stop intersection, where I spotted a woman walking with her young child and dog on the sidewalk to my right. She didn't even bother to look at me as she walked directly in front of me with her kid and dog in tow. Then things got real interesting when a car coming from the opposite direction entered the intersection early, right in her path. She didn't take notice until it was almost too late, but thankfully her reflexes took hold and she pushed everyone out of harm's way. It all seemed to happen in an instant. As the negligent driver rolled on by, she let out an impressive string of curses.

Now, at this juncture the righteous would say, "That driver was a maniac," and the cynical would say, "That woman must have spent time in the Navy." But the true safety engineer would say, "The Peltzman Effect." Named after famed economist Sam Peltzman (University of Chicago), it is the paradoxical result that rules and regulations meant to create a safer environment can lead to riskier behavior, which offsets the beneficial impact of the original regulation.

When More Safety Becomes Less Safety

Anybody who's spent any time in the process industries knows this to be absolutely true at sites where a good safety culture is not ingrained. The classic example is the guy who performs the same filling routine a thousand times a year. For years and years he follows the procedure properly, until one time he screws up and overflows the fill tote. So a lot of time and money is spent analyzing the incident, and the onsite safety professional says, "A-ha! We need to add an alarm on the fill tote that warns the operator when the level is getting near full." Hugs and high-fives to all, because the problem is solved.

But it isn't.

Why not? Because in a poor safety culture the operator now knows he has backup. And if he is not disciplined, and if he is not trained to understand risks and hazards, he will ignore the procedure and "just fill till the red light and beeper go off." So in essence the safety professional has just traded a very reliable operator for a very unreliable instrument.
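To make that trade concrete, here is a back-of-the-envelope sketch in Python. Every failure rate in it is an assumption made up for illustration (nothing in the story gives real numbers): suppose a disciplined operator overfills roughly one fill in 10,000, and the new alarm fails to trip on roughly one demand in 100. Independent safeguards multiply; remove one and you are left with the other.

```python
# Back-of-the-envelope look at the tote-overfill example.
# All numbers are illustrative assumptions, not data from the post.

OPERATOR_ERROR_PER_FILL = 1e-4   # assumed: a careful operator misses ~1 fill in 10,000
ALARM_PFD = 1e-2                 # assumed: probability the high-level alarm fails on demand

def p_overflow(operator_follows_procedure: bool, alarm_installed: bool) -> float:
    """Probability that a single fill ends in an overflow.

    An overflow requires every safeguard in play to fail at once,
    so independent failure probabilities multiply.
    """
    p = 1.0
    if operator_follows_procedure:
        p *= OPERATOR_ERROR_PER_FILL
    if alarm_installed:
        p *= ALARM_PFD
    return p

# Before the alarm: a disciplined operator is the only safeguard.
print(f"No alarm, procedure followed:   {p_overflow(True,  False):.0e}")  # 1e-04

# Alarm added AND procedure still followed: the layers multiply, risk drops.
print(f"Alarm + procedure followed:     {p_overflow(True,  True):.0e}")   # 1e-06

# Peltzman effect: operator 'just fills till the beeper goes off',
# so the unreliable instrument is now the only safeguard.
print(f"Alarm only, procedure ignored:  {p_overflow(False, True):.0e}")   # 1e-02
```

With these made-up numbers, the alarm plus a procedure-following operator is a hundred times safer than before, while an operator who just fills till the beeper goes off is a hundred times worse.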

Don't let Peltzman get the best of you. In the case of our near-flattened mom, she saw the stop signs, assumed all the drivers in the intersection were vigilant and sober, and blazed on ahead the way she had probably done a thousand times before. But in essence she traded away her own judgment and ability to recognize danger; in its place she simply put her faith in whoever randomly pulled up to the intersection. That is scary. People are dumb; while driving we eat, text, talk on the phone, put on makeup, look for that Carpenters song on the iPod, tweet about our latest blog post, and who knows what else.

It Begins and Ends with You

Rules, regulations and safety laws are just some of the many safeguards that help you in life. But never ignore the most important safeguard that protects you from harm: yourself.

Hear Sam Peltzman speak about the Peltzman Effect.

What's the latest example of the Peltzman Effect you witnessed?

Comments

Aurian

As a safety eng, I see Peltzman working his magic on almost every project, so this post is pretty relevant. Interesting to note, though, is that the effects aren't only evident in hands-on operations. In a design environment that has multiple engineering checks, it might be easy to fall into relying on the system of hierarchical/matrix sign-off to 'catch' any discrepancies. Assuming that 'the system' designs out any engineering errors is just as dangerous as assuming that the alarm will go off when the tank is full.

Peter

That's a great point, Aurian! Absolutely true! And the degree to which it happens depends on the work culture, as well as on personal habits.

Martin (Basel)

Peter, nice article. In economics there's a parallel theory on the impact of insurance on behavior, called moral hazard. See the Wikipedia explanation: http://en.wikipedia.org/wiki/Moral_hazard

Peter

Great comment and excellent insight, Martin. It's funny - I've seen the exact thing happen, with the insured party no longer believing themselves accountable: when my wife was in the hospital after our third was born, the nurses encouraged her to order steak and lobster off the menu because "it's free." Moral hazard, indeed!

Rich Byrnes

Peter, great illustration of the Peltzman Effect with the tote-filling alarm discussion. Perhaps an underlying root cause of the overfill is that the operator is being tasked with too many "manual" operations in the field, so the human failure rate could approach 10^-1 (much worse than standard instrumentation). This could happen even in a company with a good safety culture, since as processes naturally evolve over time through incremental change, the operator's total task load may never be judged for potential "overload" by the individual smaller Change Proposals or MOCs. In such a case I would suspect a well-maintained, routinely calibrated and tested, automated filling system with an independent high level alarm & safety shutdown (SIS/SIL perhaps, if required) would be effective at keeping Peltzman at bay. Is the "mouse trap" then getting too complicated for its own good? Thanks for the post. Rich
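For readers unfamiliar with the layer-of-protection (LOPA) style arithmetic behind Rich's question, here is a minimal sketch. Apart from the 10^-1 human error rate he mentions, every rate, the demand count, and the SIL band are assumed placeholders, not figures from any real system.

```python
# Rough LOPA-style arithmetic for the overloaded-operator scenario.
# Every number below is an assumed, order-of-magnitude placeholder.

FILLS_PER_YEAR = 1000     # assumed demand rate on the filling task
HUMAN_ERROR_PROB = 1e-1   # overloaded-operator error probability per fill (Rich's figure)
BPCS_ALARM_PFD = 1e-1     # assumed: basic high-level alarm the operator may ignore
SIS_PFD = 1e-2            # assumed: independent SIL 2 shutdown (PFD in the 1e-2 to 1e-3 band)

# Unmitigated overfill frequency: demands per year times human error probability.
unmitigated = FILLS_PER_YEAR * HUMAN_ERROR_PROB        # ~100 / year

# Each independent protection layer multiplies in its probability of failure on demand.
with_alarm = unmitigated * BPCS_ALARM_PFD              # ~10 / year
with_alarm_and_sis = with_alarm * SIS_PFD              # ~0.1 / year

print(f"Unmitigated overfills/yr:      {unmitigated:g}")
print(f"With alarm only:               {with_alarm:g}")
print(f"With alarm + independent SIS:  {with_alarm_and_sis:g}")
```

The multiplication only holds if the layers are genuinely independent and, as Rich says, well maintained and routinely tested; otherwise the extra hardware is just Peltzman with a nameplate.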