Transcript of the podcast:
Peter Bergen: It was in the middle of the night in a mud hut in Afghanistan, which was then controlled by the Taliban. It was March of '97, so it was cold.
Katy Milkman: This is Peter.
Peter Bergen: I'm Peter Bergen.
Katy Milkman: Peter is an author and journalist, best known as a national security analyst on CNN. In March of 1997, Peter sat down for an interview with a man who would forever alter the course of American history.
Peter Bergen: He appeared out of the darkness and sat down and did the interview and basically declared war on the United States.
Katy Milkman: Peter had his suspicions that Osama bin Laden had planned the 1993 bombing of the World Trade Center in New York. By September 11th of 2001, bin Laden's involvement was no longer in question.
Speaker 3: Oh, my God. Another plane has just hit another building.
Speaker 4: You are looking at live pictures of what appears to be an attack on the Pentagon.
Speaker 5: Osama bin Laden, according to this newspaper editor, warned three weeks ago that he would attack American interests and …
Speaker 6: All the indicators do seem to point to Osama bin Laden being responsible for this attack.
George W. Bush: Osama bin Laden and other terrorists are still in hiding. Our message to them is clear, no matter how long it takes, America will find you, and we will bring you to justice.
Katy Milkman: After 9/11, a date that now haunts every American, bin Laden became the most wanted man in the world, and for good reason. But he proved incredibly elusive. The full force of U.S. military and intelligence was deployed in the hunt for this terrorist leader, and he was tracked to the mountain caves of Tora Bora in Afghanistan, but somehow he managed to escape; and then the trail went cold.
In this episode, we'll share the story of an incredibly difficult prediction made during the hunt for bin Laden and some key lessons from behavioral science about forecasting that you can apply to your own decisions.
I'm Dr. Katy Milkman, and this is Choiceology, an original podcast from Charles Schwab. It's a show about the psychology and economics behind our decisions. We bring you true stories of high-stakes decisions, and then we examine how these stories connect to the latest research in behavioral science. We do it all to help you make better judgements and avoid costly mistakes.
Peter Bergen: The CIA was the lead on finding bin Laden. By 2005, they realized there was no single person in American detention, or in the detention of a foreign country, who was going to lead them to bin Laden. They got a lot of false leads—bin Laden is in Rio de Janeiro or bin Laden is here—and all of these were nonsense as it turns out, but all of them had to be chased down.
Katy Milkman: After four years of searching, the top U.S. intelligence agency was no closer to apprehending the mastermind behind the September 11th attacks than when they'd started. The CIA would have to change its approach.
Peter Bergen: They realized they're not going to find bin Laden directly. They're going to have to find him with a bank shot, to use a pool term.
Katy Milkman: That bank shot was to focus on the courier network that relayed messages to and from bin Laden.
Peter Bergen: The intelligence world is dealing with imperfect, fragmentary information; you're trying to put things together.
Katy Milkman: One fragment of information led to an Al-Qaeda operative who had recently rejoined the network as a driver in the Pakistani city of Peshawar. Intelligence agents picked up that he was driving a white Jeep.
Peter Bergen: They followed the Jeep to the city of Abbottabad, which is a very sleepy city in Pakistan, and he drives into a compound with one large house, one small house, set on about an acre of property. That compound was surrounded by 20-foot walls, and the inhabitants are burning trash. They're not connected to the internet. They don't have a phone system. Kind of a strange thing for somebody that clearly has some money.
At that point, the people who are looking for bin Laden brief the director of the CIA, who is Leon Panetta.
Leon Panetta: I'm Leon Panetta, former director of the CIA. I had established a task force at the CIA to basically focus on whatever evidence might tell us where bin Laden was located.
Katy Milkman: The evidence that bin Laden might be in Abbottabad was thin. But there were several clues that something was quite suspicious about this particular compound.
Leon Panetta: We identified a family that was living on the third floor that never came out, always stayed within the compound. And we soon found out that in order to make a phone call, the couriers would go 90 miles away from this compound. So they were exercising very high security. So that's what gave us the best evidence that perhaps bin Laden might be located there.
Katy Milkman: Director Panetta briefed the president on this lead in August of 2010.
Leon Panetta: When I reported what we had found to the president, he said obviously what we needed to do was to conduct surveillance on the compound to try to determine if bin Laden was actually there.
Katy Milkman: But surveillance was difficult for a number of reasons, including the high walls around the compound. The only views they had were from satellites, and those images were largely inconclusive. But there was one tantalizing piece of visual evidence.
Leon Panetta: There was one moment when we were able to see somebody who was a little older come out into the yard of that compound and walk in circles. This happened usually each day: he would come out, walk in circles, and go back into the compound, like a prisoner in a prison yard.
Peter Bergen: The pacer would pace around this garden, but he was wearing a cowboy hat and he was pacing underneath trellises. There was no overhead imagery that showed that this was bin Laden.
Leon Panetta: So at that point I said to the CIA team, "That could be bin Laden. Can you get me a telescope or get me a camera or something close to see if we could get a facial ID on that individual?"
And they said it was very difficult to do because of the walls on all sides. I remember saying at the time, "I've seen movies where the CIA could do this," and we laughed about that. But we were still unable to really get a facial ID.
Peter Bergen: At one point, Leon Panetta said, "Well, let's measure his shadow since we know he's six foot four." And the analyst from the National Geospatial-Intelligence Agency came back and said, "Well, he's between five foot four and six foot eight." Well, that's not a particularly useful piece of information.
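To see why that shadow measurement produced such a wide range, here's a minimal back-of-the-envelope sketch in Python. Every number in it is an assumption for illustration, not anything from the actual analysis: height follows from shadow length times the tangent of the sun's elevation, and small uncertainties in those inputs fan out into a very wide height estimate.

```python
import math

# Hypothetical illustration; all numbers are assumptions, not from the episode.
# Estimating a pacer's height from his shadow:
#   height = shadow_length * tan(sun_elevation)

shadow_m = 1.4        # assumed measured shadow length, meters
elevation_deg = 52.0  # assumed sun elevation at the time of the image
elevation_err = 3.0   # assumed +/- uncertainty in elevation, degrees
shadow_err = 0.15     # assumed +/- uncertainty in shadow length, meters

def height(shadow, elevation):
    return shadow * math.tan(math.radians(elevation))

low = height(shadow_m - shadow_err, elevation_deg - elevation_err)
high = height(shadow_m + shadow_err, elevation_deg + elevation_err)
print(f"height estimate: {low:.2f} m to {high:.2f} m")
# Small input errors compound through tan(), so the plausible height range
# spans tens of centimeters, consistent with an unhelpfully wide estimate.
```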
Katy Milkman: So there was no photo confirming their suspicions, and there were plausible alternative explanations for the behavior of the people in the compound.
Peter Bergen: This was a drug trafficker. This was somebody in Al Qaeda who had retired. This was somebody who just wanted to keep a low profile, so there were other explanations. Nobody could say for sure that he was there. It was an entirely circumstantial case.
Katy Milkman: Despite the lack of concrete evidence of bin Laden's location, the compound in Pakistan was the best lead they had. By this point, it had been nearly 10 years since 9/11. The pressure was on to take action. But the CIA was still chastened by intelligence failures in Iraq.
Peter Bergen: The CIA had taken a huge hit off the weapons of mass destruction fiasco in Iraq, which was built on a circumstantial case that Saddam had weapons of mass destruction. Well, it turned out to be 100% wrong.
Katy Milkman: Intelligence and military leaders faced a high-stakes decision in the face of inconclusive evidence.
Peter Bergen: At a certain point, there's no more information to provide. It becomes a political question about what to do with the information. There were, first of all, elaborate discussions of the evidence that bin Laden was there.
Leon Panetta: At one point I asked the key people at the CIA to come to my office, and I had my chief of staff there as well as my deputy, and I just basically asked all of them, "What do you think? You know what the evidence is. You know what we've been looking at. What is your best advice here as to whether or not we should conduct a mission?"
Peter Bergen: But then it became more of a, "Well, OK. We don't know if he's there, but we should maybe have a plan to do something about it," and there were a bunch of ideas on the table.
Leon Panetta: I called Admiral Bill McRaven, who was head of special forces. By that time we had done a model of the compound, showed him that model. I asked him to come up with several options. He did.
Peter Bergen: A B-2 bombing raid, launch an experimental drone, gather more intelligence, do a joint operation with the Pakistanis.
Katy Milkman: All of these proposals were fraught. Bombs might cause collateral damage and destroy evidence of success. The experimental drone had never been used in this type of operation and might fail. A joint operation with Pakistan might risk tipping off the target.
Peter Bergen: There were a bunch of things that could go wrong; there could be dangers if SEALs did a ground operation. People could be killed, captured. Bin Laden might not be there. We'd completely blow up our relations with the Pakistanis.
Katy Milkman: The discussion now turned to whether or not to proceed at all. The decision hinged on everyone's level of confidence that bin Laden was actually at the compound.
Leon Panetta: Interestingly enough, it varied. Some thought 60% that we should go ahead. Some were below 50%, thought it was too risky of an operation. And there were a couple people who felt very strongly that bin Laden was there. I remember one analyst saying, "I think it's 90% that bin Laden is there."
So it varied a great deal, but they all gave their opinions. And based on their input, I ultimately had to make up my own mind as to whether or not I would recommend to the president whether we should proceed with the mission.
Katy Milkman: In addition to considering the varied opinions, there was also a concerted effort to interrogate important intelligence to rule out alternative explanations.
Leon Panetta: We did some red team stuff where we looked at other possibilities.
Peter Bergen: After the Iraqi weapons of mass destruction fiasco that the CIA was deeply involved in, in 2003, they set up these red teams to kick the tires of any circumstantial case or any big kind of intelligence question. They would always red team it and say, "What are the alternative explanations for the same set of facts that we have?" so that we're not just all drinking our own bath water and all agreeing that we're right.
Katy Milkman: With potential alternative explanations explored and the limited evidence thoroughly analyzed, it came time to make a decision.
Leon Panetta: Ultimately, it came down to a meeting of the National Security Council to determine whether or not we would go ahead with that operation.
Peter Bergen: The last meeting was on April 28th, 2011.
Leon Panetta: There were a number of key people at the National Security Council meeting: the vice president, secretary of state, secretary of defense, head of the CIA, director of national intelligence, chairman of the Joint Chiefs of Staff, and a number of others.
Peter Bergen: These meetings were conducted at a very high level of secrecy. It's a relatively big corporate-style conference room. There are clocks on the walls for Kabul, Afghanistan; Baghdad, Iraq; and Washington, D.C.
Leon Panetta: And the president went around the room.
Katy Milkman: Like the CIA analysts, members of the National Security Council had very different levels of confidence around what should be done.
Leon Panetta: The vice president thought we needed to gather more intelligence and a little more time to be able to verify if it was bin Laden. The secretary of state recommended that we should go. Bob Gates, who was secretary of defense, was concerned, and there were others that shared the same concern. They thought back to when the helicopters went down during the Carter administration going after the hostages being held at the embassy there.
Jimmy Carter: Late yesterday, I canceled a carefully planned operation. Equipment failure in the rescue helicopters made it necessary to end the mission. Two of our American aircraft collided on the ground.
Katy Milkman: Operation Eagle Claw was a failed attempt on April 24th, 1980, to rescue 52 people held captive at the U.S. Embassy in Tehran. Eight U.S. servicemen were killed. A helicopter and a transport aircraft were destroyed. It was the type of scenario that the National Security Council desperately wanted to avoid. Despite that history, the president was increasingly leaning towards a ground operation.
Peter Bergen: There are some important reasons for that. One, you could prove to yourself that you had actually captured or killed bin Laden. If you drop a big bomb, you can't. If you do the drone strike, you can't. Two, you might gather a lot of intelligence from the site.
Leon Panetta: The chairman of the Joint Chiefs, the director of national intelligence, myself, obviously all recommended that the president proceed with the mission. The national security advisor made the same recommendation.
Peter Bergen: President Obama went around the table, he listened, he just wanted to hear what everybody had to say. It wasn't a vote. He was asking for advice.
Leon Panetta: I said, "Mr. President, when I was in Congress and I faced a tough decision, I would pretend that I was talking to an average citizen in my district and saying if you knew what I knew, what would you do?" And I said, "If I told the average citizen in my district that we had the best evidence on the location of bin Laden since Tora Bora, I think that citizen would say we have to go. And that's what I'm recommending to you."
So in the end, to the president's credit, he let everybody say their piece, and then, as always, it comes down to the decision by the president. The president took it all in and spent that evening thinking what his decision would be.
Peter Bergen: President Obama went to his private office in his residence at the White House and sat up pretty late and thought about the victims of 9/11. Thought about what the United States knew about this situation. He came down the following morning at 8:30 a.m., met with his national security team.
Leon Panetta: I got a call from the national security advisor, who told me that the president had decided to go with the mission. It was a gutsy decision, because there's no question it was risky, and there was no question that we did not have 100% confirmation that bin Laden was there.
If something happened, if the helicopters went down, if bin Laden was not there, we would have to accept responsibility, and there would be repercussions as a result of that.
Katy Milkman: A decision was made. A team of Navy SEALs, SEAL Team Six, would fly by Black Hawk helicopter, under cover of darkness, about 120 miles into Pakistan from Afghanistan to raid the compound in Abbottabad.
Leon Panetta: The directions I gave to Bill McRaven, I said, "I want you to go in, get bin Laden, and get out of there. And if you go in and you can't find bin Laden, get out." That was the direction.
So a lot of risks, a lot of risks. But at the same time, probably the most important covert mission we had ever undertaken.
I went to Mass on Sunday morning and prayed a lot. When I got to CIA headquarters, and we were running the operation out of the CIA, I really felt pretty confident that we were doing the right thing.
Katy Milkman: On May 2nd of 2011, SEAL Team Six took off from their base in Afghanistan at approximately 10:30 p.m. local time. Operation Neptune Spear had begun.
Leon Panetta: When you're going through the operation, and when we were following the SEALs into Pakistan, you're very consumed by the moment and wanting to make sure that the mission is going right.
But I have to tell you, when one of the helicopters got above the compound, because it was hot that day, the air came up and stalled the engine, and the warrant officer who was in charge of that helicopter had to settle it down with its tail up on one wall. And it's one of those moments where your stomach is in your mouth, because it raises some of the worst concerns about what could happen.
And I remember asking Bill McRaven, "What's going on?" I think I said, "What the hell's going on?" And he didn't miss a beat. He said, "Helicopter's down, but everybody's OK. They're going to continue with the mission. They're going to breach through the walls, and they're going to continue to go after bin Laden." And I remember saying to him, "God bless you. Let's do it." And he did. Again, another very important decision that had to be made in order to complete the mission.
We heard gunfire, which meant that they encountered some resistance. They then turned to go into the compound itself and go after bin Laden. And it was about 20 minutes of silence, the longest 20 minutes of my life, where you really didn't know what was happening.
And it was at the end of that 20 minutes that Bill McRaven came back on and said, "I think we have Geronimo," which was the code word for finding bin Laden. And it was at that moment where all of us in headquarters embraced one another knowing that all of the information, all of the intelligence, all of the work on the mission had proven right.
The trip back was another hour and a half, but they finally, finally landed, and that was the moment when we really were relieved that the mission had been successful.
Barack Obama: Tonight, I can report to the American people and to the world that the United States has conducted an operation that killed Osama bin Laden, the leader of Al-Qaeda, and a terrorist who's responsible for the murder of thousands of innocent men, women, and children.
Leon Panetta: And it was a real source of pride that we had accomplished something special and that the world was safer as a result of what we had done. I also thought about the victims of 9/11, those who had died and their families, and thought we were finally delivering justice to those families who lost loved ones.
I often say that leadership requires that you take risks. If leaders were always facing decisions that were 100%, it would be an easy job. But there are many decisions that involve questions as to whether it's the right or the wrong step. And you've got to decide whether you think there's enough evidence to justify that kind of operation and weigh the risks that are involved. That's why we elect presidents: to ultimately make that kind of decision. And the president, to his credit, not only made a gutsy decision, he made the right decision.
Katy Milkman: Leon Panetta is the former director of the CIA. He has also been the secretary of defense, the chief of staff to the president, and a member of Congress. More recently, he's the author of Worthy Fights: A Memoir of Leadership in War and Peace.
Peter Bergen is a journalist and producer who is CNN's national security analyst. He's the author of several books, including Manhunt: The Ten-Year Search for Bin Laden From 9/11 to Abbottabad. His most recent book is called The Rise and Fall of Osama bin Laden. You can find links in the show notes and at schwab.com/podcast.
We all grapple with a degree of uncertainty when we make predictions, like how long it will take to complete a project, what will happen with the stock market this year, or what the weather will be like on Thursday.
In the case of intelligence analyst predictions about whether or not bin Laden was present at the compound in Pakistan, you could argue that they just got lucky, and it's very easy to see how the raid could have turned out quite differently. But the officials involved in the decision to proceed employed a number of strategies and techniques that improved their chances of success.
They worked in groups composed of highly experienced, enormously qualified analysts. They used red teaming, essentially playing devil's advocate, to interrogate ideas and find errors in their thinking. Conflicting opinions were encouraged and considered. And the range of probabilities from different intelligence analysts was articulated and combined, increasing the likelihood of an accurate prediction.
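As an illustration of that last step, here's a minimal sketch of one way to combine independent probability estimates. This is not the CIA's actual method; the inputs simply echo the range of analyst views described in the episode, and the extremizing transform is an adjustment reported in the forecasting-tournament literature, shown here with an assumed exponent.

```python
# A minimal sketch of combining independent probability estimates.
# Illustrative only; the inputs echo the range of analyst views
# described in the episode, not any real intelligence data.
estimates = [0.60, 0.45, 0.90, 0.75, 0.50]

mean = sum(estimates) / len(estimates)

# Extremizing pushes an averaged probability away from 0.5 to undo the
# dilution that averaging causes; the exponent a is an assumed tuning value.
def extremize(p, a=2.0):
    return p**a / (p**a + (1 - p)**a)

print(f"simple average: {mean:.2f}")
print(f"extremized:     {extremize(mean):.2f}")
```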
My next guest has done incredibly important research in the field of forecasting, and she co-designed an elaborate multi-year study involving tournaments where participants were randomly assigned to face different forecasting conditions and then have their forecasting abilities put to the test.
The study was done to determine what conditions lead to the best forecasts. Barbara Mellers is my colleague at the University of Pennsylvania, and she is the I. George Heyman University Professor of both marketing at the Wharton School and of psychology at the School of Arts and Sciences.
Hi, Barb. Thank you so much for taking the time to talk to me today.
Barbara Mellers: Hi, Katy. It's great to be here.
Katy Milkman: I am really excited to get into some of your research on psychological strategies for becoming a better forecaster. And I wanted to start by asking you to define a few strategies that your work has proven can add huge value before talking about the research where you tested the impact of each of these techniques. So first, could I ask you to talk about probability training? What is that exactly?
Barbara Mellers: Probability training. We created an interactive module for people that lasted about 45 minutes, and it essentially gave them a lot of tips. We didn't teach them much in the way of mathematics, but we told them how to look for professional forecasts on the internet and, if they found multiple forecasts, to average them. There are a handful of cognitive biases that are important when it comes to forecasting, and those include overconfidence and base rate neglect and probably the confirmation bias.
Katy Milkman: We've done previous shows about two of those topics.
Barbara Mellers: OK. All right.
Katy Milkman: Some of our listeners will be familiar with them.
Barbara Mellers: Great. Forecasters who were assigned to the condition with probability training had to do that training module prior to making any forecasts.
Katy Milkman: OK. So could you also tell us a bit about teaming? What does that look like?
Barbara Mellers: Teaming was an intervention in which we placed people together in virtual teams that consisted of, say, 15 people or fewer. And they worked together on a website, but they weren't in the same room.
Katy Milkman: Great. And then what is tracking, and what does it mean in the context of trying to achieve great forecasting results?
Barbara Mellers: Well, tracking is well known in the educational context where we talk about putting kids with similar abilities in the same classrooms. And that's all good if the kids have high abilities; it's controversial when they don't.
We decided to put forecasters together, those who were in the top 2% in terms of forecasting accuracy. We placed them into teams of 10 or 15 people and allowed them to work together, because we had already learned that teaming works. Tracking means being elitist about who works together and allowing a very enriched environment for those who do extremely well.
Katy Milkman: That's really interesting. OK. Thank you for defining those different techniques, and I would love it if you could talk a bit now about your research evaluating the effects of these different techniques on people's forecasting ability. Could you describe how you put together this really incredible tournament and how you assessed the benefits of probability training, teaming, and tracking?
Barbara Mellers: We were one of five research groups that were funded by IARPA, which is the Intelligence Advanced Research Projects Activity, sort of the little sister to DARPA in the research-funding world. And we had a variety of people who worked with us: computer scientists, economists, political scientists, psychologists, statisticians, you name it. And when we read the literature about how to improve human forecasting, we had no idea what the best method would be.
So our group decided to run experiments, which is the thing we do that comes naturally. And we recruited a lot of people and then randomly assigned them to conditions in which they either got a training module on probabilistic reasoning or did not get one. And we did that with teaming as well. We had people who worked independently, we had people who worked in groups, and then eventually after the first year of the tournament, we decided to take this elitist approach and put the best forecasters together, and that was not randomly assigned. We took the top 2% from each of our conditions and put them together in teams and allowed them to work with each other.
Katy Milkman: Could you talk a little bit about what the forecasting activities were that you had people doing over this sort of two-year experiment?
Barbara Mellers: Actually, it was four years. This paper just talks about the first two years. So we replicated things four times, believe it or not. Some of the questions—I was just looking back at them—one of them was about whether Berlusconi would be elected by a particular date. Would he be the leader of Italy? Would Putin be elected in Russia in 2012? Believe it or not, there was actually uncertainty about that at the time among a number of people. Would there be an international incident in the South China Sea?
Katy Milkman: There were basically all …
Barbara Mellers: Geopolitical.
Katy Milkman: Geopolitical events, and were people doing this weekly or monthly, and were they doing dozens or hundreds of forecasts?
Barbara Mellers: Well, each year there were roughly 100 plus questions that we launched, and we launched a few of them each week. They were questions of interest to the intelligence community, economic questions, "What will the price of Brent crude be?" "Will a particular UN decision go through?"
So they were all over the map, and we gave people feedback after each question was resolved. People jumped on the website, made forecasts about as many questions as they wanted to throughout the course of nine months. And these nine-month tournaments occurred during four years, so we could fine tune what we'd learned the previous year and test it further and build on it and so forth.
Katy Milkman: I love this experiment and I really appreciate you describing it. It's so creative and important. I would love it if you could talk a little bit about the key things that you found worked and why you think they added so much value to forecasting quality.
Barbara Mellers: It is effective to give people simple training modules that don't require too much in the way of mathematical knowledge or statistical knowledge. They benefit from tips. It's also useful in this kind of situation to allow people to work in teams where they can share knowledge, point each other to different articles, motivate each other when someone's not talking, and correct each other's errors. And that was more beneficial than working alone.
Now, the rationale for working alone is statistical. You can obtain independent forecasts and average them, and then the idea is that the errors will average out and you'll get a better estimate of the true score or the best forecast. And allowing people to talk and work together proved to be far more beneficial than allowing them to have independent errors.
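To make that statistical rationale concrete, here's a minimal simulation with made-up numbers: each forecaster reports the truth plus independent noise, and averaging many such forecasts largely cancels the noise.

```python
import random

# A minimal simulation of the statistical rationale for averaging:
# independent forecast errors tend to cancel in the mean.
# All numbers here are assumptions for illustration.
random.seed(0)
true_prob = 0.70  # assumed "true" probability of the event
noise = 0.15      # assumed spread of individual forecast errors

def forecast():
    # Each forecaster sees the truth plus independent noise, clipped to [0, 1].
    return min(1.0, max(0.0, random.gauss(true_prob, noise)))

individual_errors = [abs(forecast() - true_prob) for _ in range(1000)]
avg_individual_error = sum(individual_errors) / len(individual_errors)

crowd = [forecast() for _ in range(50)]
crowd_error = abs(sum(crowd) / len(crowd) - true_prob)

print(f"typical solo error:       {avg_individual_error:.3f}")
print(f"50-person average error:  {crowd_error:.3f}")  # usually much smaller
```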
Katy Milkman: It's really interesting, and it does contradict a lot of things we thought we knew about the importance of independent evaluation, and of course you have to worry about things like groupthink, people becoming echo chambers …
Barbara Mellers: Herding.
Katy Milkman: Yeah.
Barbara Mellers: Yes, social loafing.
Katy Milkman: But it's really interesting that here you find this benefit from teaming that's so clear. And what about tracking?
Barbara Mellers: Tracking was, most surprisingly, the best intervention by far. It was sort of like putting forecasters on steroids. Once they worked together, they were really interested in helping each other and providing articles and knowledge and tracking down esoteric kinds of information on the internet. And then they debated. And we monitored their interactions, and they were much more likely to ask each other questions than regular teams. They were much more likely to answer questions that people had asked, and they were much more likely to express gratitude for the interaction and sharing, saying thank you and so forth.
So they actually became really good friends, and believe it or not, 10 years later there are still groups of superforecasters getting together each month to talk about political affairs and so forth in San Francisco, New York, Washington, D.C., Los Angeles, and so on.
Katy Milkman: That is so interesting. And so the teaming intervention is great, but when you team up these top performers, that's when absolutely incredible things started to happen.
Barbara Mellers: Yeah, right.
Katy Milkman: How do you apply what you've learned about superforecasting in your life?
Barbara Mellers: Ah, yes. I am very aware of how poorly I make forecasts. I am bad when it comes to planning. How much time will something take? Planning fallacy, of course. I am bad at predicting how much I'm going to like a movie or a book or whatever, and I have learned to change my mind. I've learned to, say, have a three-episode rule for a TV series: give it three episodes and only then decide whether to reject it.
I think just very simple lessons have pervaded my life about the importance of base rates and taking the outside view and not being overwhelmed with all of the details of, say, the Ukrainian war or Chinese interventions in Taiwan or whatever it is.
Katy Milkman: Are there important mistakes that you think people ever make when they're trying to apply the key findings from your research to make better forecasts in their own lives or in their organizations? What do people get wrong?
Barbara Mellers: Well, I think there are two things that people should be doing much more often than they do. One is to actually make forecasts. We don't really put ourselves out on the line and make a numerical prediction very often. We say, "Oh, it's likely, or it's unlikely," and that vague verbiage lets us get out of lots of tricky situations that may not end the way we want them to.
Second, we need to keep records so that we can start to learn how good we are at forecasting, and where we're bad, where we could learn something, where we're pretty solid.
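One simple way to keep those records is to score each resolved forecast with the Brier score, the accuracy measure commonly used in forecasting tournaments like the one described above. Here's a minimal sketch; the logged forecasts are invented examples.

```python
# A sketch of a personal forecasting log scored with the Brier score:
# the squared error between your stated probability and the 0/1 outcome.
# Lower is better. The entries below are invented examples.
log = [
    # (question, forecast probability, outcome: 1 = happened, 0 = didn't)
    ("Project done by March 1?", 0.80, 0),
    ("Team wins the title?",     0.30, 0),
    ("Offer accepted?",          0.90, 1),
]

def brier(p, outcome):
    return (p - outcome) ** 2

scores = [brier(p, o) for _, p, o in log]
print(f"mean Brier score: {sum(scores) / len(scores):.3f}")
# 0.0 is perfect; always guessing 0.5 scores 0.25. Reviewing which
# questions drive your score up shows where your judgment needs work.
```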
Katy Milkman: If you could leave our listeners with a key suggestion about what they might want to do differently in their own lives, especially now that they know about your findings on probability training and teaming and tracking, what suggestion would you offer them?
Barbara Mellers: I'd say the best way to become a better forecaster is to practice, and there's no better way to do that than to write down forecasts. But force yourself to make predictions about events that are resolvable, where you don't need a lawyer to determine what happened. And then start seeing where your skills are, where you might need help.
In business, there's a company that spun out of this particular intelligence forecasting tournament. It's called Good Judgment Inc., and it does consulting and helps financial institutions with forecasts and so forth. So there are places that help people become better forecasters as well. But just test yourself: make forecasts, keep track, keep records, and see how well you do.
Katy Milkman: So you can learn from the good and the bad.
Barbara Mellers: And the ugly. Yes. Right.
Katy Milkman: Yes. Barb, thank you so much for taking the time to do this. I really appreciate it, and I learned so much.
Barbara Mellers: Thank you, Katy. It was fun.
Katy Milkman: Barbara Mellers is the I. George Heyman University Professor at the University of Pennsylvania, where I also work. Barb is jointly appointed as a professor of marketing at the Wharton School and as a professor of psychology at the School of Arts and Sciences. You can find links to her research on forecasting in the show notes and at schwab.com/podcast.
Forecasting is an unusual activity because the uncertainty inherent in it means that even the world's best forecasters are often wrong. Barb's work with her partner and Penn faculty colleague Phil Tetlock has revolutionized our understanding of what it takes to generate what they call superforecasters.
Simple probability training can add tremendous value. A key ingredient in that training is to encourage people to look at relevant reference points when making estimates, like looking at the price of other houses that have recently sold in your neighborhood if you're forecasting the sale price of your own home.
Another key ingredient is to teach people to first search for and then average independent forecasts made by others, like the price estimates offered by several different Realtors who have all expressed an interest in listing your house. These are simple steps we can all take to make better forecasts.
When you're forecasting how long it will take to refinish your basement, your forecast shouldn't be made in a vacuum. First, collect relevant data points. How long did it take other people to do similar projects? Maybe your project is unusual, but those reference points will still be useful. You can also collect multiple independent forecasts from people with decent knowledge of the building business and average them together.
Do you need to do this kind of research to make every forecast in your life, like whether it will rain tomorrow or what TV show you'll like most in the year ahead? That probably wouldn't be a great use of time, but for big decisions, it's a great idea.
You might also pick up a copy of Superforecasting: The Art and Science of Prediction, a terrific book on this topic that was co-authored by Barb's partner and collaborator Phil Tetlock and the journalist Dan Gardner.
You've been listening to Choiceology, an original podcast from Charles Schwab. If you've enjoyed the show, we'd be really grateful if you'd leave us a review on Apple Podcasts, a rating on Spotify, or feedback wherever you listen.
You can also follow us for free in your favorite podcasting app. And if you want more of the kinds of insights we bring you on Choiceology about how to improve your decisions, you can order my book, How to Change, or sign up for my monthly newsletter, Milkman Delivers, at katymilkman.com/newsletter.
That's it for this season, but we'll have new episodes for you in late summer. In the meantime, check out the growing catalog of Choiceology episodes in our archive, or follow some of Schwab's other great podcasts, including Financial Decoder and WashingtonWise. I'm Dr. Katy Milkman. Talk to you soon.
Speaker 12: For important disclosures, see the show notes, or visit schwab.com/podcast.
After you listen
- Our sister podcast, Financial Decoder, takes an in-depth look at important financial decisions and how to guard against the cognitive and emotional biases that might affect them.
There are moments in life where it seems as though everything is riding on one important decision. If only we had a crystal ball to see the future, we could make those decisions with greater confidence. Fortune-telling aside, there are actually methods to improve our predictions—and our decisions.
In this episode of Choiceology with Katy Milkman, we look at what makes some people “superforecasters.”
In 2010, the United States government had been looking for Al Qaeda leader and perpetrator of the 9/11 attacks, Osama bin Laden, for nearly a decade. Years of intelligence gathering all over the world had come up short. It seemed every new tip was a dead end. But one small group of CIA analysts uncovered a tantalizing clue that led them to a compound in Pakistan. Soon, the president of the United States would be faced with a difficult choice: to approve the top-secret mission or not.
We will hear this story from two perspectives.
Peter Bergen is a national security commentator and author of the book The Rise and Fall of Osama bin Laden. He interviewed Osama bin Laden in 1997.
Former CIA director Leon Panetta led the United States government’s hunt for bin Laden and describes the night his mission came to a dramatic conclusion.
Next, Katy speaks with Barbara Mellers about research that shows how so-called superforecasters make more accurate predictions despite facing uncertainty and conflicting information.
You can read more in the paper titled "Identifying and Cultivating Superforecasters as a Method of Improving Probabilistic Predictions."
Barbara Mellers is the I. George Heyman University Professor of both marketing at the Wharton School and of psychology at the School of Arts and Sciences at the University of Pennsylvania.
If you enjoy the show, please leave a rating or review on Apple Podcasts.
All expressions of opinion are subject to change without notice in reaction to shifting market conditions.
The comments, views, and opinions expressed in the presentation are those of the speakers and do not necessarily represent the views of Charles Schwab.
Data contained herein from third-party providers is obtained from what are considered reliable sources. However, its accuracy, completeness or reliability cannot be guaranteed.
The policy analysis provided by the Charles Schwab & Co., Inc., does not constitute and should not be interpreted as an endorsement of any political party.
All corporate names are for illustrative purposes only and are not a recommendation, offer to sell, or a solicitation of an offer to buy any security.
Investing involves risk, including loss of principal.
The book How to Change: The Science of Getting from Where You Are to Where You Want to Be is not affiliated with, sponsored by, or endorsed by Charles Schwab & Co., Inc. (CS&Co.). Charles Schwab & Co., Inc. (CS&Co.) has not reviewed the book and makes no representations about its content.
Apple Podcasts and the Apple logo are trademarks of Apple Inc., registered in the U.S. and other countries.
Google Podcasts and the Google Podcasts logo are trademarks of Google LLC.
Spotify and the Spotify logo are registered trademarks of Spotify AB.
0623-3UG1