Probability Series #2: The Bombers that Never Came Back

By TradeSmith Research Team

In an earlier probability series, we wrote about the medical doctor who beat roulette.

The lesson was that, if you can learn how to “put the odds in your favor” — to make use of probability, in other words — you can possibly find ways to consistently win where others lose.

And while investing is not a “game of chance” like roulette, the discipline of investing is very much about probability (though a lot of investors don’t realize it).

Probability can be a tricky topic. Many aspects of probability are not “intuitive,” meaning many people won’t guess the truth on their own.

Other aspects of probability can be trickier still, or even downright deceptive — to the point where experienced professionals assume the wrong thing (until they learn probability’s secrets).

A powerful example of this came not from a game of chance, but a matter of life and death: the survival rate of American pilots in World War II.

The United States entered World War II after the bombing of Pearl Harbor on December 7, 1941.

About seven months after that — on July 1, 1942 — the Statistical Research Group (SRG) was formed at Columbia University in New York City.

The purpose of the SRG was to serve the statistical needs of the U.S. war effort. America needed to understand probability and statistics to make its best war decisions, and the SRG focused on that.

One of the problems the SRG had to solve was how to minimize losses for American bomber planes.

The challenge with a bomber is using enough armor plating to protect the plane from enemy fire … while not using so much that the plane becomes too heavy to fly.

It’s possible to get the answer wrong in multiple ways. If you don’t use enough armor, too many planes are shot down by enemy fire. If you use too much, on the other hand, either the plane can’t fly at all or it burns too much fuel (in a time of war when fuel was rationed and precious).

One of the top minds at the SRG was a brilliant Hungarian mathematician named Abraham Wald.

In 1931, Wald earned a PhD in Mathematics from the University of Vienna. In 1938, when the Nazis invaded Austria, Wald came to the United States (and wound up at the SRG).

Wald was asked to examine the bomber problem. More specifically, the SRG wanted Wald’s recommendations on how to better protect U.S. bomber aircraft from enemy fire.

The SRG analyzed American bombers that returned home after flying missions over enemy lines. They recorded the frequency and placement of bullet holes on the planes. Then they created diagrams showing where the planes tended to get hit most often.

Based on the bullet-hole data, the U.S. military had a strong hunch as to what should be done. (The Air Force didn’t yet exist as a separate branch; it was formed in 1947.)

Military officials thought extra armor plating should be applied to the areas of the plane with the most bullet holes. Most people would make the same assumption.

After all, it’s pretty logical, right? If armor plating can only be used sparingly on a plane (because of weight and fuel issues), then more armor should go in the places that took on heavy fire.

Fortunately for the pilots, Wald understood probability … and he pointed out why this line of reasoning was backward.

Wald explained to military officials why extra armor should not be placed on the areas with the most bullet holes. In fact, it should be placed in the areas with no bullet holes at all!

At first, this seems crazy. Why would you add extra plating to the areas of the plane that showed little to no damage, while ignoring the areas that took heavy fire?

But Abraham Wald — being the brilliant guy that he was — understood the importance of thinking in probabilities … and visualizing the problem in a different way.

Wald wasn’t focused on the aircraft that returned home to participate in the study. He was thinking about the planes that never came back at all.

It was reasonable to assume that bombers flying over enemy lines would receive an equal distribution of enemy fire. There would be bullet holes in all the places where a plane can logically get hit.

The planes that got home safely, even while taking heavy fire in certain areas, thus carried a hidden message in their fuselage: These are the parts of the plane that can get hit and the plane will survive.

The areas of the plane that showed little or no damage, however, broadcast another hidden message: The planes that took heavy fire in these places never came back at all.

It was the aircraft that Wald and the U.S. military officials couldn’t see — because they never made it home — that deserved the most consideration.

And for the planes that did make it home, it was the areas with the fewest hits (or none at all) that provided a clue as to where the planes were most vulnerable.
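To see how this plays out, here is a minimal simulation sketch of the survivorship effect Wald spotted. Every number in it (the plane sections, the hits per mission, the survival odds) is a made-up assumption chosen only to illustrate the idea, not historical data: enemy fire lands uniformly across the airframe, but the bullet holes we count afterward come only from the planes that survived.

```python
# A minimal simulation sketch of the survivorship-bias idea described above.
# Every number here (sections, hits per mission, lethality) is an assumption
# chosen to illustrate the effect, not historical data.

import random
from collections import Counter

random.seed(42)

SECTIONS = ["wings", "fuselage", "tail", "engines", "cockpit"]

# Assumed chance that a single hit in a given section brings the plane down.
LETHALITY = {"wings": 0.05, "fuselage": 0.05, "tail": 0.05,
             "engines": 0.60, "cockpit": 0.60}

def fly_mission(num_hits=8):
    """One bomber takes uniformly distributed fire; return (survived, hits)."""
    hits = [random.choice(SECTIONS) for _ in range(num_hits)]
    survived = all(random.random() > LETHALITY[s] for s in hits)
    return survived, hits

observed = Counter()  # bullet holes counted on the planes that made it home
for _ in range(10_000):
    survived, hits = fly_mission()
    if survived:
        observed.update(hits)

# Fire was uniform, yet the survivors show few holes in engines and cockpit:
# planes hit there mostly never came back to be counted.
for section in SECTIONS:
    print(f"{section:<10} {observed[section]:6d}")
```

Run it and the surviving planes show plenty of holes in the wings, fuselage, and tail, and very few in the engines and cockpit … exactly the sections where a hit tends to be fatal, and exactly the pattern the returning bombers showed Wald.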

This paradoxical idea also applies to investing.

If you look at a winning investor’s portfolio, you might see nothing but winning stocks — names that have been trending higher in a healthy way for months or even years.

This might seem like a miraculous feat. An investor who only picks winning stocks! Is this some kind of genius or wunderkind?

Probably not, if you think about the stocks that are no longer in the portfolio … because the investor (wisely) sold out of the subpar investments quickly, before they had a chance to hurt.

Understanding probability can help us think differently about markets, and it can guide us toward smarter decisions in cases where the right thing to do isn’t obvious — as it did for Abraham Wald and the SRG in World War II.

TradeStops was developed with that sort of counterintuitive logic firmly in mind. Our algorithms work quietly beneath the surface, evaluating stocks for signs that an investment can take fire and still come back.

And just as important, we monitor for those that didn’t come back, so we can help provide the right kind of armor to your portfolio, in the right places … so you can potentially make the most while risking the least.