Katy Milkman: Spartacus, a name that rings through the ages. The most famous gladiator from ancient Rome. Spartacus was a Thracian soldier who was sold into slavery. As a slave, he trained as a gladiator in Capua, at the first gladiator school. There, he competed in harrowing battles, all to entertain Roman elites.
Spartacus survived those battles and eventually escaped captivity. He went on to lead a major slave uprising against the Roman Republic. His strength and bravery would inspire political and military leaders for generations to come.
The life of any gladiator was harsh. The fact that these combatants fought and killed each other as entertainment makes the gladiator an enduring symbol of the excesses of ancient Rome. And there were thousands of gladiators over hundreds of years who came from the Roman Republic and Empire. So now, I want you to take a minute and try to think of another gladiator, someone other than Spartacus.
Unless you're a serious Roman history buff, it's kind of hard to do, isn't it? In this episode of Choiceology, we're going to examine a bias that trips up our thinking about success and failure—and everything from how we raise our kids to how scientists once studied extra-sensory perception.
I'm Katy Milkman, and this is Choiceology, an original podcast from Charles Schwab. It's a show about decisions and the impact those decisions have on our lives. It's also a show about the subtle but systematic mistakes in reasoning that can push us in one direction or another, often without us even realizing it. We try to give you some tools to fight back against those forces and to help you avoid costly errors.
Speaker 2: Quiet, please, quiet. Now if you would please place your hands on the table, Mrs. Van Dyke, lightly, just the fingers touching.
Speaker 3: Mr. Lane, we are so excited.
Speaker 4: Please, Mrs. Van Dyke, the professor must have complete silence.
Speaker 3: I'm sorry, but what's he doing?
Speaker 4: Shh, he's going into his trance.
Speaker 3: Dr. Langley?
Speaker 4: Shh, he's contacted the outer circle.
Speaker 2: I feel your presence. Who are you?
Katy Milkman: That's a scene from an old radio drama, depicting a séance. In the 1920s, shortly after the First World War, interest in spiritualism was on the rise. Spiritualism was the idea that people could communicate with the dead.
Linda Rodriguez McRobbie: So at the time, spiritualism had been around in America for about 60 years. It got its start after the Civil War, and got a big boost after World War I.
Katy Milkman: This is Linda.
Linda Rodriguez McRobbie: Hi, my name is Linda Rodriguez McRobbie.
Katy Milkman: Linda is a journalist whose work regularly appears in the Boston Globe, Smithsonian Magazine, and other outlets, and she's written about a fascinating chapter in the history of scientific inquiry that begins in the 1920s and features a young American botanist named Joseph Banks Rhine.
Linda Rodriguez McRobbie: What was happening in science was just this phenomenal uncovering of the kind of invisible forces that were connecting our universe and making it possible. I mean, if atoms were a thing and microbes were a thing, why couldn't there be life after death? If all of these sort of invisible, unseen things were happening around us all the time, then perhaps there were some answers and some explanations that we just hadn't uncovered yet, and that could potentially link even science and religion, that could link these experiences that people sort of felt like they were having to actual scientific study.
Katy Milkman: Rhine understood that science had the potential to reveal hidden phenomena in the natural world. But then Rhine and his wife, Louisa, had an encounter with a famous author who had a profound influence over the direction of Rhine's studies.
Linda Rodriguez McRobbie: So in 1922, while J.B. Rhine was pursuing his PhD, they saw Sir Arthur Conan Doyle speak about spiritualism. Doyle at the time was on his lecture tour, and he was incredibly well known as the creator of Sherlock Holmes. And this had sort of established him not only as a popular figure but also as a figure of reason. But he'd also had a lifelong interest in the mystical, in things that weren't quite able to be explained by science just yet.
Katy Milkman: Sir Arthur Conan Doyle believed that the living could communicate with the dead, and even participated in séances himself in an attempt to communicate with his son Kingsley, who died in World War I.
Linda Rodriguez McRobbie: He was coming to America with a message of spiritualism, riding on his popularity as the creator of Sherlock Holmes. This message that he was bringing to the world was that psychic ability could pierce this sort of veil that separated the living from the dead. And that, to some degree, this could be studied by science. So in 1922 the Rhines go to see Sir Arthur Conan Doyle speak, and it was a tremendously exhilarating experience for J.B. Rhine. It was almost as if it cracked open a door for him that he didn't know was possible, and all of a sudden all those questions he was hoping to have answered by science and biology seemed to sort of coalesce in the potential study of what happens after we die.
Katy Milkman: So J.B. Rhine began to think that the scientific method might provide insight into the mysteries of life after death. But being a scientist, Rhine still took a skeptical approach to unexplained phenomena.
Linda Rodriguez McRobbie: Rhine and his wife went to visit Mina Crandon, the spirit medium who was incredibly popular in Boston, and they entered this dark and stuffy parlor of her apartment. They were all asked to sit down at a table and hold hands, while Mina was going to communicate with the spirit of her dead brother, Walter.
Over the course of the evening, Rhine witnessed Mina supposedly channeling the spirit of her brother: a bell rang, luminous objects floated through the air, the table shook. There was a spirit hand produced at one point, a sort of dead, horrible thing that touched people. And Rhine walked away from this experience feeling very, very skeptical. At several points when the luminous objects were being sort of tossed around the room, he was able to see some of what looked like her trickery. There were later reports that she was using a stick to ring the bell that was supposedly being rung by the spirits, and even weirder, the spirit hand turned out to be a piece of an animal liver.
Katy Milkman: J.B. Rhine saw right through this charade.
Linda Rodriguez McRobbie: Rhine eventually produced a report essentially declaring Mina Crandon to be a fraud. The report appeared in 1927, and in it he said, "If we can never know to a relative certainty that there was no trickery possible, no inconsistencies present, and no normal action occurring, we can never have a science and never really know anything about psychic phenomena."
Katy Milkman: Rhine wanted to use the scientific method to separate fact from fiction in matters of parapsychology, and that meant weeding out fraudulent claims.
Linda Rodriguez McRobbie: Rhine's declaration that Mina was a fraud definitely didn't make him popular with her supporters. However, it did establish him as a kind of authority in this area, as somebody who would be willing to investigate psychic phenomena with a real sort of sense of scientific rigor.
Rhine's first experiments with what he would later call extra-sensory perception were fairly informal. He would do things like try to get people to guess a number, or what playing card he was looking at, that sort of thing. But people tended to have favorite cards or numbers that they guessed more often. He wanted to make sure that people wouldn't have a prior association that would push them one way or the other.
Katy Milkman: That's when Rhine decided to develop a more scientifically rigorous test. He approached a colleague at Duke University, psychologist Dr. Karl Zener, to create a set of cards that could be used to test what J.B. Rhine was calling ESP, or extra-sensory perception.
Linda Rodriguez McRobbie: The cards were five symbols: a star, wavy lines, a circle, a square, and a plus sign. Zener purposely chose these cards so that people wouldn't have a preference. They wouldn't always guess the star, or they wouldn't always guess the wavy lines. There was no … hopefully no chance of a prior association with them.
Katy Milkman: Fun fact, you can see Zener cards used in a scene from the 1984 movie Ghostbusters, where Bill Murray's character is testing ESP.
Linda Rodriguez McRobbie: Initially, the test basically involved Rhine holding up one of these cards, facing him, and asking the test subject to guess what he was looking at.
The Zener card test proved to be incredibly popular with the Duke undergrads. By 1931, Rhine had conducted 10,000 Zener experiments with 63 participants. People were really excited about his work. They felt like they were onto something really big, really world altering.
Katy Milkman: J.B. Rhine remained skeptical, but there was a lot of enthusiasm around his lab as results started to come back, suggesting that some people actually could do better than chance when it came to predicting what was on the cards.
Linda Rodriguez McRobbie: His daughter recalls people literally jumping up and down with excitement over what they were finding, over what seemed to be verifiable proof that people were able to glean information in ways other than the norm.
Katy Milkman: One of Rhine's star subjects was a Duke University student named Hubert Pearce.
Linda Rodriguez McRobbie: Pearce seemed to have an ability beyond what he'd even seen in his other subjects who were consistently able to guess the cards. I believe he was typically scoring better than 40%, which was very good. So sometimes Pearce was able to score well above the 20% random chance, into the 40s, and other times he was well below. Some of that also depended on who was giving him the test. If it was Rhine giving him the test, it seems like he did a little bit better.
Katy Milkman: Here's the interesting thing: The Zener card test was actually easier than a test using regular playing cards. In a deck of 52 cards, you start out with a one in 52 chance of guessing the right card. The chance of guessing the correct Zener card was one in five. And if the volunteer knew all the symbols and could count cards, the test would get progressively easier as cards were eliminated.
Linda Rodriguez McRobbie: You know there are going to be 25 cards in the deck, and you know there are five symbols, which means there's an opportunity to see these cards five times. So it does get easier. Rhine did his best to make sure that there was no way that Hubert Pearce had access to the cards or could see the cards before the experiment began. So in one case, for example, he would have the research assistant shuffle the cards in a room 200 yards away, to make sure that there was no chance that Pearce would be able to have any sense of the order of the cards before he was asked to guess them.
Rhine was particularly interested in the people that he felt were exhibiting psychic, or psi, abilities, and so he would test them over and over again. So the people who did well tended to show up multiple times, like Hubert Pearce. People who didn't do as well tended to not come back.
Katy Milkman: Do you see what's happening here? Rhine eliminated the volunteers who scored near or below chance and kept testing the ones who scored above it. The thing is, in any reasonably large sample, some people are going to score below chance, some will score near what chance would predict, and some will likely score a little better than chance, purely by luck. But because only the above-chance scorers moved on to the next round of testing, it looked as though those people had extra-sensory perception.
To make matters worse, the test itself was a little shoddy. Once people were familiar with it, because they'd taken it before, doing well was a bit easier, and the methods for preventing card counting were far from foolproof. All of this reinforced the idea that there was something special about people who had survived a largely random selection process.
J.B. Rhine ended up with a small list of volunteers who consistently scored above chance, and he took this as proof of the existence of ESP.
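The selection effect at work here is easy to demonstrate. In this sketch (the 1,000 volunteers and the 9-of-25 cutoff are illustrative numbers, not Rhine's actual figures), every guesser is purely random, yet a handful of "stars" emerge, and their apparent gift evaporates the moment they are retested.

```python
import random

rng = random.Random(42)

def zener_score(n_cards=25, p_chance=0.2):
    """One purely random guesser's score on a 25-card Zener test."""
    return sum(rng.random() < p_chance for _ in range(n_cards))

# 1,000 volunteers with no ESP at all take the test once.
volunteers = [zener_score() for _ in range(1_000)]

# Keep only the "stars": anyone scoring 9/25 (36%) or better.
stars = [s for s in volunteers if s >= 9]

first_round = sum(stars) / (len(stars) * 25)
retest = sum(zener_score() for _ in stars) / (len(stars) * 25)

print(f"{len(stars)} 'stars' emerged from 1,000 random guessers")
print(f"their first-round hit rate: {first_round:.0%}")
print(f"their hit rate on retest:   {retest:.0%}")
```

The selected group's first-round rate is impressive by construction; on retest they regress to roughly 20%, because there was never anything to select for.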
Linda Rodriguez McRobbie: The culmination of Rhine's experiments with the Zener cards was his 1934 book Extra-Sensory Perception. Among non-scientists, this book was hugely popular. By 1937, you could buy Zener cards at the five-and-dime store. People talked about him in the press all the time. His books were part of the book-of-the-month clubs. Most importantly, it also meant that wealthy people who were similarly inclined and who believed in the research he was doing were now giving him money to study psi phenomena, enough money so that he could set up his own lab at Duke.
People were incredibly interested in what was going on in his lab. Famous people like Jackie Gleason, Brave New World writer Aldous Huxley, Carl Jung—big names were really interested in what he was doing and the kind of work he was putting out. Professionally, however, his fellow scientists were a bit less enthusiastic about his results. Essentially people in the scientific establishment began trying to find the holes in his research and his methodology, and they found them.
Katy Milkman: In science, the ability to reproduce experiments is key. J.B. Rhine's results didn't seem to hold up to the scrutiny of other researchers.
Linda Rodriguez McRobbie: On the heels of Rhine's popularity, several other investigators tried to replicate what he'd done. There was a man at Princeton, W.S. Cox, who in 1936 tried to replicate Rhine's findings and was unable to. And there were several people after that who weren't able to either, and this has been the biggest problem in parapsychology ever since: people have not been able to consistently replicate the findings.
Part of the problem people were seeing in Rhine's work was methodology. There were concerns that, for example, they couldn't be sure that the subject hadn't actually seen the cards. In some cases, the cards were so poorly printed that you could actually see through them. When Rhine began to tighten his scientific controls, fewer people were able to score above random chance.
Katy Milkman: It was at this point that J.B. Rhine's research really began to fall out of favor in the scientific community, but his project continues to this day.
Linda Rodriguez McRobbie: By 1948, the lab isn't really associated with Duke University in quite the same way that it had been before. When Rhine retired in 1965, the lab closed, even though it had been doing research in this area continuously. He founded a non-profit research center—the Foundation for Research on the Nature of Man—that continued to be involved in ESP research well into the 1980s. Now the Rhine Research Center still exists in Durham and still uses Zener cards.
I think Rhine felt like he'd never been given quite a fair shake, especially towards the end of his life. Especially as the funding started to dry up, as the accolades started to dry up, I think he felt like he brought a lot to the study of psychical research that was ignored by the scientific establishment or derided by the scientific establishment. And I think that probably hurt him.
Just to be fair to him, because he wasn't deliberately falsifying his results, he wasn't making things up, he did want to apply the most rigorous scientific method to what he was studying. But he could also be blinded by what he wanted to believe.
Katy Milkman: Linda Rodriguez McRobbie is a freelance journalist and writer whose work has been in the Boston Globe, The Guardian, Smithsonian Magazine, and more. I've got a link to her Atlas Obscura article on J.B. Rhine in the show notes and at schwab.com/podcast.
The process by which J.B. Rhine narrowed his field of ESP candidates was problematic, and it's a type of reasoning that rears its head in other contexts. For instance, we wanted to test people's ability to identify successful and unsuccessful musical artists. Keep in mind, unsuccessful musicians should vastly outnumber successful ones.
Speaker 6: Can you please name five successful bands or musical artists?
Speaker 7: Queen, Shania Twain, Elvis, Johnny Cash.
Speaker 8: The Beatles, Rolling Stones, Led Zeppelin, Pink Floyd, and …
Speaker 9: The Killers, the Beatles, the Who, the Rolling Stones.
Speaker 10: The Beatles, the Who, Fleetwood Mac, ZZ Top, and the Eagles.
Speaker 6: You banged those out really quick. OK, now name five unsuccessful bands for me.
Speaker 7: Unsuccessful … I don't know too many. I don't know any off the top of my head.
Speaker 9: Like long term? Rebecca Black, she did that "Friday" song.
Speaker 10: How would I know them if they're unsuccessful? I can't think of any unsuccessful bands.
Speaker 8: I don't think I can. I can't think of any because they're so unsuccessful, they've dropped out of sight.
Katy Milkman: This makes sense, right? It was relatively easy for people to come up with successful artists, because successful artists become famous. We hear about them way more often, but what about all the overnight successes that didn't happen? They're invisible to us.
We see the same phenomenon in the world of business. There are several notable examples of entrepreneurs who dropped out of college to become great successes. We listed some of these examples and asked people whether dropping out of college seemed to be a useful strategy for success.
Speaker 6: You know, Steve Jobs, Mark Zuckerberg, Bill Gates, they're all massively successful people, and they all dropped out of college. So do you think that that indicates that college isn't necessary for that type of success?
Speaker 7: Yeah, I do actually. I think that's entrepreneurship. Those guys are all entrepreneurs, so you don't really need college for that kind of stuff because that's a creative mind.
Speaker 10: Yeah, college is not necessary to get that kind of success. Because I think that wealth or a really good job has as much to do with creativity as anything else.
Speaker 8: Oh, absolutely. I don't think that college as an institution, the way that we use it, can determine success as an entrepreneur or in any field that you have to create your own opportunity in.
Katy Milkman: The thing is, the data shows clearly that more education generally equals better career outcomes. Bill Gates and Steve Jobs are the rare exceptions to the rule. You don't hear about all the college dropouts who worked on new technologies or founded start-ups and failed, but they vastly outnumber the few success stories.
The business section of your local bookstore is filled with stories of successful entrepreneurs offering advice on how to make it big. What's missing is a section filled with stories from the many people who made a go of it and failed for one reason or another. That's a problem, because it might be helpful to get advice from people who didn't succeed. They may have useful insights about what to avoid, or you might discover that they did the very same things as the people who had great outcomes. But those people whose start-ups flop, they don't get book deals. They're casualties of something called survivorship bias.
Survivorship bias is a logical error. We tend to concentrate on success. We focus attention on the people or things that survive, the ones who make it through a selection process that is often at least somewhat random. We tend to overlook the people or things that failed, because their failure makes them invisible, and this can lead us to draw incorrect conclusions about the world.
J.B. Rhine became a victim of survivorship bias when he focused on the people in his pool of volunteers who guessed correctly more often than chance would predict. Those people who scored higher than chance survived his experiment, and he looked at them as proof that ESP existed, but every distribution has above average and below average performers. When J.B. Rhine ignored the larger number of volunteers who scored at or below chance, he was making a logical mistake.
A classic example of another type of survivorship bias comes from World War II. American bomber planes were returning to base riddled with bullet holes and damaged from flak. Military leaders wanted to know where they should reinforce their planes with armor to help more of their airmen survive the missions. The intuitive answer was to reinforce the areas of the planes where the most damage was being sustained, but statistician Abraham Wald took survivorship bias into account when making his recommendations. He realized that the planes returning to base were the ones that didn't crash. The damage they sustained wasn't catastrophic. They were the survivors.
Wald argued that the planes that didn't return should be of greatest concern. They were likely hit in places where the surviving planes were not. Therefore, the least damaged parts of the surviving planes became the focus for additional armor. Funny thing: the Wald story itself is a product of survivorship bias. It's one of the most referenced examples of the bias, while many other examples go ignored.
I've invited economist Sendhil Mullainathan of the University of Chicago Booth School of Business to talk about how survivorship bias affects decisions in all different settings, ranging from medicine to investing to hiring.
Katy Milkman: Hi, Sendhil. Thanks so much for coming on the show.
Sendhil Mullainathan: Hi, Katy.
Katy Milkman: OK, first ask, could you just define survivorship bias for our listeners?
Sendhil Mullainathan: I think what I would say is, it's an error that arises because we look at the data that we have but ignore some sort of selection process that led us to have that data, and then we treat it as if it's reflective of the underlying truth.
So "Oh, wow, yeah, everyone I hired and interviewed and liked turned out to be good. So I'm 100%, I've got a great hit rate of hiring." Well, we don't have the data on the people you didn't hire, so we're ignoring the process that generated the data you have.
Katy Milkman: What are some of your favorite examples of survivorship bias, Sendhil?
Sendhil Mullainathan: So one fun example is actually from the world of investing. Imagine that you got a letter—this is from the 1940s when people still had letters—so imagine you got a letter in the mail that says, "Hey, Katy, I have a new stock-picking trick. And since I know you don't trust me, I just want to tell you, tomorrow look at Acme (whatever) Incorporated, and I'm going to tell you, it's going to go up." Meh, you kind of look, you say, "Wow, I don't know, I don't believe this, whatever." But you look at it and it went up. But anyone can get lucky.
The next week you get another letter saying, "I told you Acme would go up. Tomorrow I want you to look at Johnson Incorporated, and I'm going to tell you it's going to go down." So now you're intrigued. You look at Johnson Incorporated, and in fact it does go down. Now you're waiting for the third letter. In fact, it does come. There is another pick. And in fact it's exactly right. On the fourth pick, the person is right, and they say, "If you're interested in having me as your advisor, now you should call me."
Do you see where all this is headed?
Katy Milkman: I do.
Sendhil Mullainathan: And this is actually a thing that was run in the '40s—I think it was '30s or '40s. What they did was, they obviously just sent a bunch of random guesses to 10,000 people. Half the times they are right, so then to those half people, they send another bunch of random guesses. Half the times they're right, to those people. So you start with the 10,000. After four guesses, you're down to … I guess two, four, eight, 16, still a pretty big population of the original 10,000 who now thinks you're amazing.
And what the population is suffering from is they're suffering from survivorship bias. They're in the set of people who happened to survive, and so now they have this entirely false belief driven by the fact of survival. But that principle applies in so many places.
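A quick sketch of the arithmetic behind the scam Sendhil describes: send "up" to half the list and "down" to the other half, so half the recipients always see a correct prediction, and repeat.

```python
# Each week, half the remaining recipients receive a wrong prediction and
# drop out; the other half have still seen nothing but correct picks.
recipients = 10_000
for week in range(1, 5):
    recipients //= 2
    print(f"after pick {week}: {recipients:,} people have seen only correct picks")
# → 5,000 then 2,500 then 1,250 then 625
```

After four picks, 625 people are convinced the sender is a flawless stock picker, even though every prediction was a coin flip.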
I'll tell you the place where it really applies in spades. It applies to people like you and me. Anybody who's had a set of positive lucky events that lead them to be successful in life, they don't think of themselves as people who happen to get Acme Incorporated right and happen to get Johnson Incorporated right. They think of themselves as talented people. And if you look at people who are, for example, billionaires, no one says, "That's a person who won a lottery ticket." People are like, "I would love advice from that person." Well, maybe. I mean, surely they're slightly more talented than people who aren't billionaires. But there are a lot of people who were exactly in the same boat who just didn't happen to have lucky draws.
Katy Milkman: Do you have any words of wisdom for people who are now enlightened about survivorship bias and might want to dodge it?
Sendhil Mullainathan: Yeah, absolutely. I think that this is one of the easiest biases. I like biases where you can just have a trigger that makes you aware and alert: "Oh, wait, this might be happening" and then something you can do when it's triggered. Here, for me, the trigger is always someone gives me a statistic or a number or an inference. It could be, "I'm good at blank." It could be, "Wow, we have a high hit rate when we do this." Any sort of assessment. Then you say, "Great, what's the data that that comes from? … OK, your past successes." And now this is the key to de-biasing. What's the process that generated that data? What are all the other things that could have happened that might've led me to not measure it?
So in other words, if I say—the interview one is perfect—someone says, "I'm great at interviewing," you just say, "OK, well, what data are you basing that on?" "My hires." Great. What's the process that generated it? Hires and not hires. You're missing the not hires. So it's almost like a very simple thing, where you just need to ask the question, "What's the data that's not present?"
So if you ask, "What's the thing that led you to make the statements you make?" Fine. What's the other data that could be there in the denominator that we don't have?
Katy Milkman: Thank you for doing this. I so appreciate you joining me, Sendhil.
Sendhil Mullainathan: Well, this was fun, Katy. Thank you.
Katy Milkman: Sendhil Mullainathan is the Roman Family University Professor of Computation and Behavioral Science at the University of Chicago Booth School of Business. He's also the co-author of the book Scarcity: Why Having Too Little Means So Much. There's a link to the book in the show notes and at schwab.com/podcast.
I thought it would be interesting to look at this bias in a totally different setting—parenthood. There are all sorts of books on strategies for helping your kids sleep, stay healthy, and succeed in school, and many parents develop strong opinions about which approaches work best. But sometimes parents, myself included, can miss the bigger picture.
I've asked Emily Oster, an economist and expert on the science of parenting, to join me for today's show. Emily agreed to talk about how survivorship bias can affect the way we think about raising our kids. Emily's a professor of economics at Brown University and the best-selling author of the book Cribsheet, which demystifies scientific research on parenting for a general audience.
Emily, thank you so much for joining me.
Emily Oster: Thank you for having me.
Katy Milkman: OK, so let's start by talking a little bit about survivorship bias. Given the wonderful books you've written about pregnancy and the early years of child rearing, and that you cut through a lot of these fallacies, I thought you might have interesting examples from that arena. I was wondering if you have encountered survivorship bias when you've read any of the literature in those spaces.
Emily Oster: Yeah, I mean I think there are tons of examples of this in the parenting space. One that I sometimes think about is antibiotic usage. It used to be very common to prescribe kids antibiotics for everything. You know, if you came in with a cold, they were like, "Yeah, here's some penicillin, just take it." And of course there's a survivorship bias in the kind of outcome evaluation there, because you only see the kids who came in and got treated, and they basically all get better, because everybody gets better when they have a cold.
But what you don't see is the people who didn't appear in your sample, and those people also got better. So we're sort of attributing causality to the effects of these antibiotics because we're only seeing this sort of selected survivor sample.
Katy Milkman: The way you described it sounds a little bit innocuous. Like, "Oh, who cares." We attribute causality to medications and we give them out too much. Do you feel like this is a harmful bias? Like, can you think of important examples where it really does matter? Antibiotics may be one, actually.
Emily Oster: Yeah, so antibiotics is one where it actually really does matter, because overuse of antibiotics is a huge problem, both for individuals and, more importantly, for the population. So there's actually a lot of push against this kind of use of antibiotics in policy. But it's hard. Because of this bias, I think it's proved to be quite intractable, because doctors, particularly doctors who have worked for a long time, have seen, "OK, well, I give people antibiotics, and they get better, so I'm going to keep doing that because that seems like it's working," without thinking about the other piece of this that you don't see. So even though from a policy standpoint it would be really good if we prescribed fewer of these, we have not seen that change.
Katy Milkman: What advice would you have for our listeners about situations in their lives where they might be victims of survivorship bias and might want to change their thinking? Are there natural settings where you think in parenting we make biased decisions that we should try to avoid?
Emily Oster: This may be a more general point, but I think for this reason and a bunch of other reasons, we are very likely to attribute causality to things that happen because we are sort of looking for causal links. I always talk about this in the context of early parenting when you are trying to figure out why your baby is doing things. This time they slept a lot, so let me back-engineer, what are the things that I did before that one incident where they slept a lot? I opened the window like one inch but not two inches. I fed them for 13 minutes on one side, and I used the pink swaddle. So like, maybe if I do all those things again, that will work again.
I mean, I think whether you exactly refer to that as survivorship bias, there's a piece that falls in that category, because the survivor aspect there is that you've survived the seven hours of sleeping. And of course, making those causal leaps is not … that is not why your kid slept; it's just that sometimes kids sleep more. So I think we can sometimes get a little bit too anxious in early parenting around trying to figure out what is going on, and not be willing to just accept that this is something over which you don't actually have that much control, and you probably shouldn't really try to think about causality that much. Like, sometimes your baby just does random stuff for no reason.
Katy Milkman: If you had to say what some of the biggest mistakes are that you feel like people make—when they're using that sample of one instead of actually stepping back and using the right statistical tools to make inferences—what do you think those are?
Emily Oster: It is easy to infer that you have done the right thing in terms of, like, how you get your kid to sleep through the night, whether you use this technique or that technique, or to get very wedded to the idea that one of these techniques—leaving them to cry, not leaving them to cry, coming in every three minutes, coming in every seven minutes—is the right one. You know, the truth is that when you look at the data, basically any kind of cry-it-out technique—coming in, not coming in, staying in the room, leaving the room—anything where you have a consistent approach and you stick to it, any of those is going to work pretty well.
The problem is that, number one, when you get to your second kid, you may inappropriately think that something is going to work that isn't so special. The other place this comes up is when we see that something works, we often then want to explain to other people that it is the only thing that works. And I think one of the central messages of the book is that learning what works for your kid and then using that to boss other people around is not super productive, because actually there are many good ways to parent.
Katy Milkman: So I feel comfortable also calling that survivorship bias, because it feels like, as you said, you sort of survived, and this person survived.
Emily Oster: You survived. That is how parenting feels, yes, "I've survived this."
Katy Milkman: "This worked!" Like, you observe only the success. You survived, the kid survived, and it worked. You know, you celebrate and assume this is the one and only way.
Emily Oster: Yeah, and I mean I think a more direct comparison is to look at kids who are successful and to then look back and say somehow that is the only way to be successful. You know, this kid, their mom breastfed for three years and did this parenting strategy and this parenting strategy, and then I'm going to infer from that that those are the right strategies. I mean that's like sort of more true survivorship bias, where you've seen one or two examples of success and you're trying to back-engineer what worked there, as opposed to saying, "Yeah, a lot of people got that treatment, and only a few of them ended up successful."
Katy Milkman: At least it's reassuring to hear that there's no one right way to be a parent. Emily, thank you so much. I really appreciate you taking the time to talk with me.
Emily Oster: Thank you. It was such a pleasure.
Katy Milkman: Emily Oster is a professor of economics at Brown University. Her most recent book is called Cribsheet: A Data-Driven Guide to Better, More Relaxed Parenting From Birth to Preschool. I've got a link in the show notes and at schwab.com/podcast.
Survivorship bias can affect how we judge investment opportunities too. Funds close, companies go out of business or merge, the stocks that make up an index change over time, and if we only evaluate investment strategies using those companies and fund managers who've been successful and we don't look at companies and funds that have failed or underperformed, we're not getting a complete picture. That's why our sister podcast, Financial Decoder, digs into issues like this—so that you can make more informed financial decisions. You can find it at schwab.com/financialdecoder or wherever you listen to podcasts.
Survivorship bias is tricky, because the things that don't make it through the selection process, the failed gladiators or the downed bombers, the unsuccessful entrepreneurs, they become invisible. Combating survivorship bias is a challenge because you're looking for things that aren't obvious. In seeking career advice or business advice, you may want to look beyond the voices and signals that happen to rise above the noise. People who succeed in business may attribute some aspect of their success to a particular workout regimen or their habit of wearing the same outfit every day or to a particular diet or meditation practice. But what you don't hear about is all the other people who followed that same workout or habit or diet or practice but didn't have the same successful outcome.
Try to seek out those stories of failure as well. They're far more important than we intuitively realize. They are, in fact, the key to overcoming survivorship bias. In short, survivorship bias means we miss out on learning from people or products or ideas that didn't succeed. Some of the most important information on the road to success often comes from a balanced evaluation of what really differentiates the good outcomes from the bad.
You've been listening to Choiceology, an original podcast from Charles Schwab. If you've enjoyed the show, leave us a review on Apple Podcasts. You can subscribe to the show for free in your favorite podcasting apps. That way you won't miss an episode. I'm Katy Milkman. Talk to you next time.
Speaker 13: For important disclosures, see the show notes or visit schwab.com/podcast.