
Reliability Theory Explained...Part 1

  • srico08
  • Jul 3
  • 1 min read
Robert Lusser, c. 1956

People think my podcast is called The Man Who Calculated Death because—well—my grandfather was a Nazi engineer who led a secret project to build a flying bomb. 

Well, that’s wrong. 


It’s actually because he developed an engineering formula that could predict the failure of complex systems—machines with thousands of parts, each one with its own little chance to ruin everything. 


It’s called Lusser’s Law of System Reliability, and this picture is of my grandfather demonstrating the formula to an audience of fellow engineers in the 1950s. Here’s the concept: the reliability of a system is the product of the reliabilities of all its parts. If one component has a 99% chance of working, that sounds great, right? But chain it to nine others with the same 99% reliability, and suddenly the system only works about 90% of the time. Multiply that out to 100 components, and the odds of everything working at once drop to about 37%—you’re building a disaster waiting to happen, proven by math. Lusser’s Law doesn’t care about your budget or your intentions: it’s brutally indifferent, and that’s why it works.
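
If you want to check the arithmetic yourself, here is a minimal sketch in Python of the multiplication behind the law (the 99% figure and the component counts are just the examples above, not anything taken from my grandfather's papers):

# Lusser's Law for components in series: system reliability is the
# product of the individual component reliabilities.
def system_reliability(component_reliabilities):
    result = 1.0
    for r in component_reliabilities:
        result *= r
    return result

print(system_reliability([0.99] * 10))   # ten parts at 99% each -> ~0.904
print(system_reliability([0.99] * 100))  # a hundred parts at 99% each -> ~0.366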


In this three-part blog post, I'm looking at how this forgotten bit of engineering logic explains some of the world’s most high-profile failures—and what it says about the machines we trust with our lives. I'll start with the people who trusted their lives to a homemade submersible on a trip to explore the Titanic. That’s coming up in Part 2: The Sub That Should Have Stayed Surfaced. (See @suzannerico on Instagram for a video of this post.)

