Katy Milkman: I’m Katy Milkman, and this is Choiceology.
Newscaster 1: You probably remember the story about the person who thought they had the image of Jesus on a grilled cheese sandwich? Well, now a London man is trying to claim that he found Prince William’s bride-to-be on a jellybean.
Newscaster 2: I see it.
Newscaster 1: Yeah, you see it already?
Newscaster 3: Yes, it’s the famous Illinois-shaped cornflake, plucked from a cereal box by two young women from Virginia.
Newscaster 4: What this man saw as he was driving was enough to make him hit the brakes.
Witness: Nobody’d believe me if I didn’t do this live, but I want you to look at this cloud in the sky. Is that not an angel, or what?
Newscaster 4: It sure does look like an angel.
Katy Milkman: The news and social media feeds are packed with stories of remarkable, unlikely events, surprising coincidences that seem to defy explanation. And while fakery and deception sometimes come into play, many of these stories are real, and really hard to explain. In this episode, we’ll examine a human tendency to attach meaning to incredible coincidences.
This is Choiceology, an original podcast from Charles Schwab. We explore all kinds of decisions, from simple, daily choices to far more significant, life-changing ones. Then we guide you through the hidden psychological forces that influence those decisions for better or for worse. And we do it all to help you avoid costly mistakes.
If you’re of a certain age, you might remember a time when the Bermuda Triangle was regularly featured as a plot point in TV shows and adventure novels.
Narrator: Your course is set to faraway ports, accessible only across the forbidding, lime green waters of ... the Bermuda Triangle. With its sinister, unforgiving mystery cloud, even the ports along the way …
Andy Marocco: Hi, I’m Andy Marocco, and I’m an aviation historian.
Katy Milkman: Andy Marocco is the director of special projects at the Naval Air Station Fort Lauderdale Museum. He has a particular fascination with the Bermuda Triangle.
Andy Marocco: Well, the Bermuda Triangle, it starts at Miami and goes all the way to the island of Bermuda and down to the Bahamas or Puerto Rico.
Katy Milkman: It’s an area of about one and a half million square miles. As you’ve probably heard, quite a few ships and airplanes have gone missing in the area, some without a trace. The mystery of these disappearances has led many to wonder if there’s something strange or sinister about this particular part of the Caribbean. The long list of unexplained disappearances includes the USS Cyclops.
Andy Marocco: In 1918, this Navy ship had a crew of about 309 people on board, and it disappeared. It’s never been found.
Katy Milkman: No distress signal, no response to radio calls from nearby ships. Another example, two planes that suffered a similar, mysterious fate.
Andy Marocco: The Star Tiger disappeared on January 30th, 1948, 25 passengers and six crew members. No bodies or wreckage were found. And the Star Ariel disappeared on January 17th, 1949, and again, no bodies, no wreckage.
Katy Milkman: The planes seemingly disappeared into thin air. There are still more unexplained accidents and disappearances. But one particular story caught Andy Marocco’s attention and captured his imagination. Many consider it the event that cemented the Bermuda Triangle’s reputation as one of the world’s most dangerous and mysterious places.
It’s the story of Flight 19.
Andy Marocco: Flight 19 comprised five Navy Avengers that flew out of Fort Lauderdale Naval Air Station on December 5th, 1945.
Katy Milkman: It was supposed to be a routine training flight, training mission number 19. Five Navy Avenger torpedo bombers took off from Fort Lauderdale, Florida. Five pilots, 14 airmen all told. The flight leader was Lieutenant Charles Taylor. The mission was to fly east to the Bahamas for a bombing drill. The weather report from Fort Lauderdale indicated strong wind gusts and substantial cloud formation. Not long into the flight …
Andy Marocco: Charles Taylor believes that his students are going in the wrong direction, and so what he does is he actually takes over the flight.
Katy Milkman: Taylor takes over command from his trainees.
Andy Marocco: They fly for a couple more minutes, and now he’s starting to hear back from the students that they think he’s going in the wrong direction.
Katy Milkman: At about two hours into the flight, Taylor started to wonder if his compass was malfunctioning. Compasses sometimes give bad readings due to local thunderstorms, but none of the men reported thunderstorms at this point in the mission. This is when things took a turn for the worse.
Andy Marocco: They got lost, and they tried to contact their base. They were able to have a little bit of communication.
Katy Milkman: The five pilots compared their compass readings but they couldn’t agree on a direction, and they were having trouble communicating.
Andy Marocco: Because of the distance that they were at and the frequency they were on, they were limited to what could be heard.
Katy Milkman: It soon became clear. None of the pilots knew where they were. Lieutenant Robert Cox was on a separate flight near the pilots’ home base at Fort Lauderdale. He overheard the worried communications from the pilots of Flight 19. He radioed the group with the suggestion that they fly north. Cox would attempt to meet them by flying south. But for some unknown reason, Taylor ignored the plan. Lieutenant Cox flew south but saw no sign of the Flight 19 planes, and their communications became intermittent and faint.
Several pilots in the formation made a last-ditch request to Taylor to fly west in hopes of finding land. Again, there’s no indication he accepted that plan. The fuel levels in all of these airplanes were dangerously low at this point, and so Charles Taylor made a crucial decision. He directed his pilots to attempt a water landing all together as soon as the first pilot got down to just 10 gallons of remaining fuel. It was five hours after takeoff.
Andy Marocco: The Navy says in their report that at about 7:02 in the evening came the last words from Flight 19, and it was the call sign FT, which was Fort Lauderdale's call letters.
Katy Milkman: Flight 19 disappeared. Shortly after that last communication, a naval air search-and-rescue plane with 13 airmen aboard was sent out to find Flight 19. Twenty minutes later, that plane also disappeared.
Andy Marocco: Flight 19 made no mayday or SOS request, at least that we know of or were heard by radio. Flight 19 became lost. They were in communication but ultimately were never heard or seen again. No wreckage was found after that evening, and for several days later, one of the biggest search operations by the Navy happened and didn’t find one shred of evidence, not an oil slick, not any parts, parachutes, canopies, nothing. It is a fantastic mystery.
Katy Milkman: What happened? Six planes disappeared. Where was the wreckage? Where did they end up? How did they get lost? In the absence of evidence about the fate of the planes and their pilots, speculation was inevitable.
Andy Marocco: Well, I think there’s some really interesting theories. Some people believe there’s mysterious powers within this area called the Bermuda Triangle. The lost city of Atlantis, having power in its ability to pull aircraft or ships underwater, or I’ve heard some of aliens and spaceships coming down using tractor beams to pull planes away. I even heard a fantastic story about Flight 19 flying around in ice around the moon.
Katy Milkman: Why are we so intrigued when we hear these stories? Why do they pull us in?
Andy Marocco: We hate to have things not answered in our lives. I mean, it’s about connecting the dots. It’s about finding a path. It’s the same thing with legend. It’s the same thing with mystery. You want a conclusion. You want answers. The problem is, people are more willing to take these stories and believe those than believe the facts that are out there, and I guess that’s just human nature. We believe it’s got to be bigger and better, when in reality, it can be something so simple.
And I think the story of Flight 19 really comes back down to human error. I truly believe that Taylor got lost and disoriented. I believe he led the flight out further east when he shouldn’t have been going east. He should have been going west. Simply put, they ran out of fuel. That’s all she wrote.
Katy Milkman: Of course, this doesn’t explain why no wreckage or oil slicks were found, but remember, the Bermuda Triangle is an area of about one and a half million square miles, and the ocean doesn’t just stop at the end of the triangle. Here’s the mind bender. If you take the actual area of the Bermuda Triangle and lay it over other busy shipping lanes and flight paths, you find that it’s, in fact, no more dangerous than many other equivalent areas.
In fact, it’s actually safer than lots of equally sized triangles.
Andy Marocco: Back in 2013, the WWF …
Katy Milkman: That’s the World Wildlife Fund. They study accidents at sea to track large oil spills.
Andy Marocco: The WWF put together an accident-at-sea report, and they basically came out with the ten most dangerous areas of the sea around the world, and the Bermuda Triangle wasn’t even one of them.
Katy Milkman: Not even in the top ten, despite its reputation. In a way, trying to find a common cause for every disappearance in the Bermuda Triangle makes no more sense than trying to find a common cause for every car accident in, say, Argentina, which is a similar-sized area. But the thing is, we like to see patterns.
Andy Marocco: If you’re looking for patterns in the Bermuda Triangle, I would argue there are none. There are no patterns.
Katy Milkman: There’s a powerful human need to make sense of patterns. After the series of unfortunate events that Andy Marocco listed, I wouldn’t blame you for imagining that the Bermuda Triangle was a particularly treacherous place. But in fact, it’s not. Improbable things happen all the time. That’s how probability works. But humans typically aren’t great at reasoning about probability as they go about their everyday lives. In fact, one of the most powerful insights to come out of the field of behavioral economics is that people are poor intuitive statisticians. 2002 economics Nobel laureate Danny Kahneman wrote about this with his frequent co-author Amos Tversky in some of their most influential research.
Specifically, when we see something that looks a bit out of the ordinary, we too quickly jump to the conclusion that it’s a meaningful abnormality rather than a simple consequence of randomness. You might be wondering, how did Kahneman and Tversky come to this insight? Well, famously, Danny was approached by the Israeli Air Force during the Yom Kippur War and asked to help solve a mystery of two similar squadrons that had left the same base but experienced very different outcomes.
When one returned, it had lost four planes. The other had lost none. The Israeli Air Force asked Kahneman to analyze the data to explain why one squadron had been so much more successful than the other. They noted that, for instance, the men in the more successful squadron had seen their wives more than those in the less successful squadron, and they wondered if that could explain the difference. Kahneman, who had excellent training in statistics, told the Air Force to stop wasting their time. These were small-number statistics, and it was immediately clear to him that they were looking for patterns in the noise.
But the generals who came to talk to Danny really wanted to find order. They wanted to learn from what was actually just bad luck, and without training as statisticians, they saw a pattern. Kahneman’s later research would go on to show that this is a basic human instinct that leads to predictable errors in judgment.
We’re going to see if we can demonstrate this tendency to see patterns where there are none. We’ve gathered a few people together to play a little game.
Game Master: And it’s going to go like this. You each have a piece of paper in front of you. What I want you to do is imagine that you have a coin, a quarter, and you’re flipping it. So you’re going to do 20 coin tosses in your head, and I want you to write down the results.
Participant: Like heads or tails?
Game Master: Heads or tails.
Participant: Like how many times we’re going to get heads?
Game Master: So you’re just going to see H, T, H, whatever.
Game Master: And then I’m going to leave the room, and my associate here is going to explain the rest of the experiment. All right?
Katy Milkman: Our game master has left the room, and the participants are busy writing down their imagined coin tosses.
Katy Milkman: His assistant collects these lists.
Assistant: Now I’m going to collect all of the sheets here.
Katy Milkman: And then starts in on the second half of the experiment, 20 real coin tosses.
Participant: Like a real coin toss.
Assistant: So we’re going to do an actual coin toss, now.
Participant: With a real coin.
Participant: Should we just call it in the air?
Assistant: With a real coin.
Katy Milkman: The assistant flips the coin 20 times and writes down a new list and then mixes it in with the lists of imaginary tosses.
Assistant: All right.
Participant: Is that 20?
Assistant: It is 20. Let me just double-check.
Katy Milkman: Finally, the game master returns to perform a mental feat.
Game Master: All right. So using my amazing powers of perception, I am going to determine which of these is the actual coin toss. Is it this one?
Participant: He got it.
Participant: That’s amazing.
Katy Milkman: How did he do that? How did he know which one was the real list?
Game Master: Most people will write down, you know, like heads, heads, tails, tails, heads, tails, heads, heads.
Participant: That’s what I did.
Participant: Do a mix, yeah.
Game Master: Do a mix. Real randomness will be bunchy, so you’ll get things like tails, tails, tails, tails, tails, tails, tails, heads, heads. Like that, looks more random.
Game Master: So who did this one?
Participant: I did that one. Yeah.
Katy Milkman: Weird, right?
Game Master: That’s a lot.
Katy Milkman: The experiment shows, kind of in reverse, that we have this tendency to see patterns where none exist. In this case, the participants used their intuitions about how randomness should look, and those intuitions led them to behave in a predictable way. They figured a random set of coin flips should have very little repetition of the same outcome, so they wrote down lists that looked random to them: lists with a lot of alternating between heads and tails, and not many long streaks of one outcome or the other.
But real randomness is lumpy. Similar events can occur in more bunches than you’d expect by random chance. Real coin flips are streaky sometimes. You might land heads four times in a row. That’s how the experimenter was able to find the real list of coin tosses. It was the one that had some streaks of the same outcome over and over again.
OK, just a quick bit of math. The actual chance that you’ll get three heads in a row if you flip a coin four times is three out of 16, or 18.75%. The chance that you’ll get three tails in a row is also 18.75%. If you ask a random person, “What are the chances that if I flip a coin four times, I’ll get the same outcome three times in a row?” most will say a number vastly lower than the right answer, which is 37.5%.
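For the skeptical, those figures are easy to verify by brute force: enumerate all 16 possible four-flip sequences and count the ones containing a streak of three or more identical outcomes. A minimal Python sketch:

```python
from itertools import product

def has_run(seq, length=3):
    """Return True if seq contains a run of `length` identical outcomes."""
    run = 1
    for prev, cur in zip(seq, seq[1:]):
        run = run + 1 if cur == prev else 1
        if run >= length:
            return True
    return False

flips = list(product("HT", repeat=4))       # all 16 possible sequences
streaky = [s for s in flips if has_run(s)]  # sequences with a 3+ streak

print(len(streaky), "of", len(flips))       # 6 of 16
print(len(streaky) / len(flips))            # 0.375, i.e., 37.5%
```

Six of the 16 sequences (HHHH, HHHT, THHH, and their tails mirrors) contain such a streak, which is where the 37.5% comes from.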
The thing is, often when people see those streaks in what’s just a coin being flipped over and over, they figure they must be connected in some way. It looks weird. It looks fishy or suspicious. This has to do with what scientists call our misconceptions of chance. Basically we think random things should look very random, and when we see anything resembling a pattern, we attribute meaning to it. We think there must be something funky going on in the Bermuda Triangle because accidents seem to have a pattern beyond random chance. But really, randomness is lumpy.
We tend to fool ourselves when we imagine that independent events are related in some way, that after three losses at scratch-and-win we must be due for a winner. I want to explore the reasons we do this, so I reached out to Tom Gilovich, a celebrated professor of psychology at Cornell University and an expert in these kinds of misconceptions of chance. Hi, Tom. Thanks so much for doing this.
Tom Gilovich: Sure, my pleasure.
Katy Milkman: So we wanted to get your take on misconceptions of chance.
Tom Gilovich: Randomness is lumpier than we expect. Statisticians refer to this as the clustering illusion, so if you imagine you got a bag of only yellow M&Ms and another bag of brown M&Ms, and you randomly mix them together and you show them to people, they’re just not going to look random to most people. They’ll go, “There’s a big cluster of brown over there, and there’s another cluster of the yellow ones over here.” That’s what randomness looks like. It’s very clustered and lumpy, more than we expect. So when we see the amount of clustering that chance provides, we reject it and say, “Oh, there’s something systematic going on.”
One of the best applied examples of this, when Apple Computer first came out with their iPod shuffle that would randomly choose songs from your collection of music, people objected, saying, “This isn’t random, I was just listening to the Rolling Stones and now they’re back on again.” Well, again, random selection is going to have more repeats than you expect. So Apple Computer looked at the algorithms and said, “No, these are producing random selections.” They just didn’t seem random to people, so they created an option where you could make your songs more random, and what they meant by that is actually less random but more random-seeming.
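Apple's actual algorithm isn't public, but the idea Gilovich describes, a shuffle deliberately made less random so it seems more random, can be sketched. Everything below (the function name, the artist-avoiding heuristic) is an illustrative assumption, not Apple's implementation:

```python
import random

def random_seeming_shuffle(songs, key=lambda s: s, rng=random):
    """Shuffle, then greedily break up adjacent tracks that share a key
    (e.g., the same artist). The result is *less* random than a true
    shuffle, but it looks more random to listeners."""
    order = songs[:]
    rng.shuffle(order)  # start from a genuinely random shuffle
    for i in range(1, len(order)):
        if key(order[i]) == key(order[i - 1]):
            # look ahead for a track with a different key and swap it in;
            # if none exists, the repeat is unavoidable and stays
            for j in range(i + 1, len(order)):
                if key(order[j]) != key(order[i - 1]):
                    order[i], order[j] = order[j], order[i]
                    break
    return order

playlist = [("Paint It Black", "Stones"), ("Angie", "Stones"),
            ("Hey Jude", "Beatles"), ("Imagine", "Lennon")]
print(random_seeming_shuffle(playlist, key=lambda s: s[1]))
```

A true shuffle would play two Stones tracks back to back about as often as chance allows; this version suppresses those streaks, which is exactly the kind of "fix" listeners asked for.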
Katy Milkman: Great. So they took out the repetition component, or something like that? It wasn’t a true random number generator?
Tom Gilovich: Yes. Yep, yep. It made the selections alternate, sampling from different bins more often than you’d get with true random selection. This is part and parcel of a broader tendency of people to structure their world and see more order in it than is actually the case, and there’s a whole perceptual phenomenon that anyone can sample just by going to Google Images. It’s called pareidolia: when we look at complex visual stimuli, we often see images that aren’t there. A famous example is what’s come to be known as the nun bun. One cinnamon bun produced in a bakery just had a remarkable resemblance to Mother Teresa. People noted that resemblance and said, “Hey, this is amazing. There must be something mysterious, magical, spiritual going on.” And so it sold for a fair amount of money.
Katy Milkman: I was thinking about the Jesus toast.
Tom Gilovich: The Jesus toast, there are all sorts. There’s a Kate Middleton jellybean. There’s a certain moment in the billowing clouds of smoke coming out of one of the Twin Towers where it really looks like there’s a demonic face, and people see the face of the devil, or even the face of Osama bin Laden. And of course, astronomers have known about this. The average person looking at the moon sees the Man in the Moon. You see canals on Mars and all sorts of ordered, structured things that, on inspection, aren’t there.
Katy Milkman: Do you mind hitting on why you think this happens? So you’ve talked a lot about orderliness, but if you had to encapsulate what you think is behind this tendency people have to look for orderliness in random numbers?
Tom Gilovich: Yeah, that’s a good question, and I only have the kind of glib, adaptive kind of answers that I don’t find very satisfying.
Katy Milkman: Fair enough.
Tom Gilovich: I used to say, we can do things when we find order. Our job is to find the order in the world and take advantage of it, and so evolution has built this machinery to do just that.
Katy Milkman: Tom, thank you so much for taking the time to talk to me today about this. I really appreciate it.
Tom Gilovich: It was a true pleasure.
Katy Milkman: I’m Katy Milkman, and this is Choiceology, an original podcast from Charles Schwab. One pitfall of seeing patterns where there aren’t any is the potential impact on your investments. You might be tempted to chase recent returns, for instance, overweighting a hot stock instead of sticking with your plan. Our sister podcast, Financial Decoder, digs into these kinds of issues to help you make more informed financial decisions. Mark Riepe hosts the show. Mark is head of the Schwab Center for Financial Research, and he unpacks some of the big financial choices you might be facing so that you’re better equipped to avoid mistakes. You can find it at schwab.com/financialdecoder, or wherever you listen to podcasts.
So now that we’ve talked about misconceptions of chance, you’re probably wondering what we can do about this bias. So first, as usual, you’re already better off because now that you’re aware of this tendency you can look out for it and try to artfully dodge it, though research suggests knowledge is not half the battle in this case. My best advice is to check your gut when you think you’ve found a pattern. If you have a friend who’s good at statistics, you might want to give them a call and tell them what you’re seeing so they can crunch the numbers and see, is it really an outlier or is it just something that you shouldn’t get too excited about?
But let’s get nerdy for a minute and I’ll tell you how I would think about it. Say three people on one of the sales teams at your office performed poorly last quarter and no other team had more than one weak salesperson. You’re tempted to investigate, but you should do the math. What’s the distribution of sales performance? Now that you have that information, you should calculate the likelihood of a team having three poor salespeople if performance were random and not correlated within a team. Was it one in five? One in 100? One in 1,000?
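If each salesperson's quarter were an independent draw, the calculation here is a simple binomial one. The numbers below, teams of eight, a 10% base rate of a weak quarter, twelve teams in the office, are hypothetical values chosen purely for illustration:

```python
from math import comb

def prob_at_least(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance of k or more 'poor'
    performers on a team of n, if each is independently poor with prob p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical numbers: teams of 8, and suppose 10% of all salespeople
# had a weak quarter. How surprising is one team with 3 or more?
p_one_team = prob_at_least(3, 8, 0.10)
print(f"{p_one_team:.3f}")  # roughly 0.04 with these numbers

# With, say, 12 teams in the office, the chance that at least ONE team
# looks this bad by luck alone is much higher:
p_any_team = 1 - (1 - p_one_team) ** 12
print(f"{p_any_team:.3f}")  # roughly 0.37 with these numbers
```

The second number is the one people forget: a roughly 1-in-25 fluke for a single team becomes better than a 1-in-3 event when you have a dozen teams to hunt for patterns in.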
This might sound highfalutin, but it’s an important question to ask, and usually you can use simple statistical tools to answer these kinds of questions. When you see events that had a one in five chance of occurring, don’t get too excited. They happen all the time. One in 1,000 events start to be worth more attention, but it also depends a bit on how hard you’re hunting for a pattern. Another key issue with our misconceptions of chance is that they lead us to see streaks when there aren’t any. You’ve outperformed the market three months in a row—you must be a financial genius. Your last three lottery tickets were losers—you must be due for a big win.
Again, these are actually pretty common outcomes, but they feel uncommon to us because we don’t expect lumpiness when luck is at play. Check the actual odds and remind yourself that things like lottery outcomes are entirely independent events. In spite of your strong intuition to the contrary, your next lottery ticket’s odds of paying out have nothing to do with the performance of your last few.
You’ve been listening to Choiceology, an original podcast from Charles Schwab. If you’ve enjoyed the show, leave us a review on Apple Podcasts. It helps other people find the show. And while you’re there, you can subscribe for free. Same goes for other podcasting apps. Subscribe and you won’t miss an episode.
I’m Katy Milkman. Talk to you next time.
Speaker 15: For important disclosures, see the show notes or visit schwab.com/podcast.