KATY MILKMAN: 1935. Wright Field in Dayton, Ohio. It was late October. Boeing was demonstrating the prowess of their next-generation long-range bomber to the U.S. Army Air Corps. Their Model 299 Flying Fortress was a massive plane, with four powerful engines and a giant 100-foot-plus wingspan. The plane lifted off with a thunderous sound, climbed to several hundred feet, and then all of a sudden it stalled, dipped to one side, and fell back to Earth in a fiery explosion. Two of the five crew members died in the crash, including the pilot, Major Ployer “Pete” Hill. An investigation found the cause of the crash to be pilot error. He’d forgotten to release a lock on the controls. The thing is, Major Hill was an exceptionally qualified and experienced pilot. He was the chief of flight testing for the Air Corps. How and why had he forgotten this step?
Today on the podcast, we’re looking at a simple, common tool that came into widespread use due to that tragic event, one that can improve outcomes on everything from a college application …
SPEAKER 2: Argh, so many forms.
KATY MILKMAN: … to a perilous voyage to the moon.
JACK SWIGERT: OK, Houston. We’ve had a problem here.
KATY MILKMAN: I’m Dr. Katy Milkman, and this is Choiceology, an original podcast from Charles Schwab. It’s a show about the psychology and economics behind our decisions. We bring you true stories involving high-stakes, make-or-break moments, and we explore the latest research in behavioral science to help you make better choices and avoid costly mistakes.
ANDREW CHAIKIN: With Apollo 13, the real focus was on exploring the moon and finding out what it could tell us about the earliest history of the solar system.
KATY MILKMAN: This is Andrew Chaikin.
ANDREW CHAIKIN: I’m Andrew Chaikin, space historian and author of the book A Man on the Moon: The Voyages of the Apollo Astronauts.
KATY MILKMAN: 2020 marked the 50th anniversary of Apollo 13. It was NASA’s seventh crewed mission in the Apollo program and the third to attempt to land astronauts on the moon, after successful landings by Apollo 11 and 12.
ANDREW CHAIKIN: And on the crew patch for the mission was the motto Ex Luna, Scientia, which means, “From the moon, knowledge.” So expectations were very high that Apollo 13 would open this new chapter in the scientific exploration of the moon.
KATY MILKMAN: You may know this story already. Maybe you saw the Hollywood movie, or maybe you experienced the event as it happened. It remains one of the most dramatic illustrations of the complexity and danger that astronauts and engineers face during space travel. We’re going to focus on a few crucial moments in the mission and a procedural tool that made a huge difference to the outcome. But first, some context.
Apollo 13 launched from Kennedy Space Center on April 11th, 1970, with three astronauts aboard.
SPEAKER 5: Three, two, one, zero. We have commit and we have liftoff at 2:13.
ANDREW CHAIKIN: Jim Lovell, the commander, was absolutely unflappable, always even keel, never got ruffled.
KATY MILKMAN: Tom Hanks played the Jim Lovell character in the movie Apollo 13.
ANDREW CHAIKIN: The command module pilot, Jack Swigert, knew the command module systems as well as just about anybody, and finally, the lunar module pilot, Fred Haise, was also just an incredible expert on the lunar module, one of the top two or three astronauts in the program in terms of his knowledge about the lander.
KATY MILKMAN: The takeoff and flight went smoothly, but then …
ANDREW CHAIKIN: You get to 55 hours and 56 minutes into the flight of Apollo 13. Mission Control sends up a request to the astronauts, “We’d like you to stir the tanks.” What they said was “Stir the cryo tanks,” cryo- meaning cryogenic liquid oxygen and liquid hydrogen. That made it easier for the flight controllers to gauge the quantity levels inside the tanks. Swigert flipped the switch to stir the tanks, and moments later, the astronauts heard a loud bang and felt the spacecraft shudder, and that’s when all the problems started.
KATY MILKMAN: Fred Haise, the lunar module pilot, had a sinking feeling in the pit of his stomach.
ANDREW CHAIKIN: Because he looked at the gauges, he saw that the oxygen reading in tank two was going down to zero and the fuel cells were starting to fail. He knew in that moment that they were not going to land on the moon, because the mission rules forbade it.
KATY MILKMAN: Despite this crushing disappointment, not to mention the deep concern for their own safety, the Apollo astronauts got straight to work.
ANDREW CHAIKIN: They have the phrase, “Work the problem.” That’s what they do. They work the problem, and so, even as they were still trying to figure out what was going on, they were just doing the right procedures, they were talking to Mission Control, they were not losing their cool, and that’s an extraordinary thing.
As soon as the bang happened, Fred went back to his couch on the right side of the command module to start investigating what was happening, and at one point pretty shortly after that, he glanced out his side window and he saw all kinds of sparkling particles, debris, and frozen gas and so forth. He knew at that moment that there had been a serious physical problem, that this was not just some kind of instrumentation glitch. He also knew that from looking at the instrument panel because different sensors were telling him the same thing, that the oxygen tank was losing its contents into space.
KATY MILKMAN: That’s when Mission Control in Houston heard the iconic words from Apollo.
JACK SWIGERT: OK, Houston. We’ve had a problem here.
SPEAKER 6: Flight, guidance.
SPEAKER 7: Go guidance.
SPEAKER 6: We’ve had a hardware restart. I don’t know what it was.
JIM LOVELL: Houston, we’ve had a problem. We’ve had a MAIN B BUS UNDERVOLT.
FRED HAISE: And we had a pretty large bang associated with the caution and warning there.
ANDREW CHAIKIN: Neither he nor Lovell nor Swigert said anything about the debris and the shiny little droplets out the window to Mission Control for quite a few minutes, and it wasn’t until they heard Lovell finally say, “Looking out the window, we appear to be venting something into space,” that’s when they really had to confront the reality that, no, this was not an instrumentation problem. This was a really serious something that had happened to the spacecraft.
KATY MILKMAN: The explosion in the cryo tanks meant the crew was rapidly losing the ability to generate power in the fuel cells for the command module. The command module is the nerve center of the craft, and it was designed to be home to the crew during the long voyage to and from the moon. Now, that module was failing fast.
ANDREW CHAIKIN: The first thing they had to do was get into the lunar module and start getting it powered up to serve as their lifeboat. The explosion made it only a matter of time before the mothership was dead.
KATY MILKMAN: There are three main modules on the Apollo spacecraft: the command module; the lunar module, used for landing on the moon; and the service module. The service module contained the damaged cryo tanks. The command module is the only part of the craft that returns to Earth, so the battery power in that module had to be preserved at all costs. The only option was for the astronauts to retreat to the lunar module while they shut down the command module and planned their return trip. At this point, they only had a few minutes of power left in the command module.
ANDREW CHAIKIN: So there was incredible time pressure here, and normally the procedure, the checklist to power up the lunar module, all of those steps take a couple of hours, because you’re not under time pressure. You’re in orbit around the moon. You’re still attached to the mothership. You’re going through a lengthy checklist to turn on all the systems before you undock from the mothership and go land on the moon. Well, now they had to take that two-hour procedure and boil it down to about 15 minutes.
KATY MILKMAN: Imagine the pressure, and the stakes couldn’t have been higher. The astronauts needed to get this right, and fast.
ANDREW CHAIKIN: And so this is one of the most remarkable things in the whole episode. The guys who were experts in the lunar module systems in Mission Control, they were busy working on a shortened checklist to read up to the astronauts. The knowledge that they had of the systems and how they worked and what had to be done when, that was as crucial as anything else. They got that checklist read up to the astronauts bit by bit, a few lines of instructions here, a few lines there, over the space of many minutes. Fred Haise said that he was in the spacecraft with the checklist in front of him, and he had a felt-tip pen, and he was just crossing out big sections that they weren’t going to do and focusing on the stuff that they did have to do.
Probably the most tense moment in this whole process was Jim Lovell copying down the numbers from the command module computer, which were called down to him by Jack Swigert.
KATY MILKMAN: These numbers included the ship’s orientation. The problem was, for technical reasons, the command module’s orientation didn’t match the lunar module’s, so as they were copying these coordinates from one checklist to the other, they had to perform arithmetic to correct the difference.
ANDREW CHAIKIN: And you know, this was simple arithmetic, but Lovell wanted to be sure that he didn’t screw up in the heat of the moment, and he asked Mission Control, “Hey guys, can you double-check my arithmetic?”
KATY MILKMAN: The orientation was absolutely key, as they needed those numbers to calculate their position in space. They would be lost if there were any errors.
ANDREW CHAIKIN: The lunar module guys on Mission Control were sitting there adding and subtracting the numbers themselves by hand, just to double-check what Lovell had done, and they managed to get it into the lunar module computer in time before the command module had to be shut down to conserve some amount of power in the batteries, and they took it from there.
KATY MILKMAN: Of course, they weren’t out of the woods yet. The crew still had to get their damaged craft back to Earth. They needed to conduct two major engine firings. The first firing was meant to get them back on a free-return trajectory, using the moon’s gravity to slingshot them around and back on a flight path to Earth. The second firing was meant to increase their velocity and get the ship back home faster. You see, the crew was rapidly running out of consumables: power and oxygen. The lunar module was designed to support only two astronauts for a brief landing on the moon, not three astronauts on a multi-day trip back to Earth.
ANDREW CHAIKIN: The first basic need was to get back on the free-return trajectory. They didn’t have to think, “Oh gosh, how do we do that?” They already had in their hip pocket a procedure for using the lunar module engine to propel the two spacecraft docked together, and that was what they knew as soon as they realized that the lunar module was going to have to be the lifeboat. As soon as they realized that, they knew that that was the first thing they had to accomplish.
KATY MILKMAN: That basic plan in the form of a procedural checklist gave the astronauts step-by-step direction for this contingency, but it also freed up their mental bandwidth to concentrate on those issues that had not been addressed in the contingency plans.
ANDREW CHAIKIN: Fred Haise said that it occurred to him in real time that there was no book to pull off the shelf that would cover everything they would have to do to get home, and he knew that the people in Mission Control would be working around the clock to figure out many of these procedures.
KATY MILKMAN: It was a combination of real-time ingenuity and a solid foundation of contingency planning. NASA engineers and astronauts spent a great deal of time in advance thinking through what-if scenarios and codifying procedures to deal with them, and now, they were dealing with several worst-case scenarios all at once, from a tiny ship hurtling through space.
ANDREW CHAIKIN: It was a tight fit for two astronauts, and for three, of course, it was even closer quarters. For much of the trip home, the only systems that they had powered up were the cabin fans to circulate oxygen and the radio. So they’re in this tiny little cabin in the dark. It got down to the low 50s in the lunar module, and up in the command module, where of course everything had been turned off since the accident, it got much colder. At one point, Lovell went up to the food pantry in the command module to get some hot dogs and brought them down to the lunar module, and they were pretty much frozen solid.
KATY MILKMAN: The final and most dangerous portion of the journey was still to come: re-entry. To re-enter the Earth’s atmosphere, they had to essentially reboot the command module.
ANDREW CHAIKIN: Nobody had ever turned a spacecraft off in flight and then powered it back up. How the heck do you do that? Can it even be done? And to complicate matters, it had to be done with a very limited amount of electrical power, because all they had to go on were the batteries in the command module.
KATY MILKMAN: It had taken several days for Mission Control to come up with a solution and test it.
ANDREW CHAIKIN: And at the end of that whole process, they came up with a checklist that didn’t bust the power budget, and it took about two hours just to read it up to the astronauts. It was a long thing with many, many, many steps, and of course, when it came right down to it, would it work? Well, they wouldn’t know until they did it.
KATY MILKMAN: To everyone’s relief, the command module flickered back to life, and they jettisoned the dead service module. As it floated past, the crew was able to see that an entire panel had been blown off during the cryo tank explosion. Next, they jettisoned the lunar module and prepared for re-entry into the Earth’s atmosphere, hoping against hope that the command module’s heat shield had not been damaged.
The descent began, and the astronauts were buffeted by incredibly intense heat and physical forces. The ionization of the air around an Apollo command module during re-entry typically resulted in a four-minute communications blackout, but Apollo 13 was on a shallower re-entry path, which lengthened the blackout. Two whole minutes of additional radio silence led Mission Control to fear that the command module’s heat shield had failed.
SPEAKER 10: Odyssey, Houston. We show you on the mains. It really looks great.
SPEAKER 11: An extremely loud applause here in Mission Control.
SPEAKER 12: Mission Control, pretty good.
KATY MILKMAN: The crew regained radio contact and splashed down safely in the South Pacific Ocean. They had done it. Fred Haise, Jim Lovell, and Jack Swigert had beaten the odds, and with the help of their colleagues at Mission Control in Houston, they had brought a damaged ship back to Earth and back from the brink of disaster. Apollo 13 came to be known famously as a successful failure.
ANDREW CHAIKIN: It’s funny, I talked to Jim Lovell about how he looked back on Apollo 13, and he said, “You know, if I had to do it all over again, I still would like to have landed on the moon,” but after that, he said, “I would do Apollo 13 all over again the way it happened, because it was the ultimate test of a test pilot.” That right there was pretty remarkable.
KATY MILKMAN: Andrew Chaikin is a space historian and author of the book A Man on the Moon: The Voyages of the Apollo Astronauts. You can find links to his book and to more information on the Apollo program in the show notes or at schwab.com/podcast.
Space travel was and continues to be incredibly complex and dangerous. The Apollo 13 crew along with their colleagues at NASA Mission Control demonstrated remarkable ingenuity and grace under pressure during their perilous journey around the moon, and while long hours of intense training played a crucial role in their survival, the crew also relied heavily on step-by-step procedures in the form of checklists.
Checklists helped the team simplify complicated tasks and avoid errors in a situation where the consequences of a mistake were extreme. Good astronauts understand that they’re human and subject to mistakes, which makes checklists a critically important tool. Checklists can also help prevent disagreements. When that cryo tank exploded in the service module, there was no debate about whether or not the astronauts could continue with their plan to land on the moon. The mission rules checklist stated that if you lost a tank or fuel cell on the way to the moon, a landing was forbidden.
Incidentally, in 2011, the lunar module activation checklist from aboard Apollo 13 drew bids of over $375,000 at auction before NASA halted the sale due to an ownership dispute. Its value gives you an idea of how important checklists were to the outcome of that mission.
Checklists have been a standard part of operating procedures in aviation since Major Hill made the critical mistake that led to the Boeing bomber crash of 1935. The Army Air Corps determined that he’d had too much on his mind and simply forgot to release the control lock. The Air Corps subsequently instituted the use of pre-flight checklists written on 3x5 cards that helped ensure pilots and crew members followed the correct procedures for a safe takeoff and flight.
As airplanes have become increasingly complex, the likelihood of human error has also increased dramatically. Checklists became a way of breaking down complex processes into manageable actions, and offloading cognitive effort so that pilots, and eventually astronauts and others faced with complex tasks, could focus on the most pressing concerns without fear of missing crucial steps.
If you’ve read Atul Gawande’s bestselling book The Checklist Manifesto, you’ll be familiar with just how important checklists have become to the successful practice of medicine. Research by Gawande and others suggests that surgical checklists, for instance, can cut complications and mortality rates by more than a third.
My next guest has conducted important research showing how checklists can affect worker productivity—and why some workers are nonetheless reluctant to use them. Kirabo Jackson is the Abraham Harris Professor of Education and Social Policy at Northwestern University.
Hi, Kirabo. Thank you so much for joining me today.
KIRABO JACKSON: Thank you for having me, Katy.
KATY MILKMAN: So I’ve shared the story with our listeners of Apollo 13 and how important checklists were to facilitating a safe return home for the astronauts onboard that mission, and obviously saving lives in space is important, but it feels pretty far removed from most people’s day to day. You did a study about the effects of checklists in a much more down-to-earth setting, and I was wondering if you could describe that research.
KIRABO JACKSON: Sure. So my coauthor, Henry Schneider, and I ran a very small experiment in auto mechanic shops. The idea behind the experiment was that perhaps checklists would be a really helpful way to improve the productivity of the mechanics. The idea came from the checklist literature, a lot of which has been in medicine: checklists can provide a sort of external memory device that makes sure that people stay on track, don’t lose concentration when they’re going through things, and don’t miss steps. So that’s one way in which a checklist can help. Another is that it provides some external accountability. If you have a checklist, everyone has to go through the checklist. Everyone understands that once the checklist is filled out, the work has been done, so it also provides some external accountability to ensure that whoever’s doing the task is doing it properly.
So the idea was that we’re going to roll out this experiment in a small number of auto mechanic shops. What we did was we provided checklists to the shop owners, and we asked them to introduce them in the shops that they had. The checklist was basically a very standard auto mechanic list that listed the kinds of components you would check, and it basically goes through all the various components of the car that a mechanic would want to look at when they want to just make sure they’re doing a thorough once-over of the car to identify any problems that they would want to potentially fix.
So we provided them to the owners, the owners offered them to some of the shops for a three-week period, and we looked at the data to see what happened during those three weeks, using a set of shops that were not offered the checklist as a basis for comparison.
KATY MILKMAN: And what did you find?
KIRABO JACKSON: So what we found was that, at the end of the period, total revenues went up in these shops. They went up by about 25%, which is actually a pretty big increase in the overall revenues generated by the shops, so during these three weeks, the mechanics basically made more money for the shop owners. We were able to use data from before and during the experiment, where some mechanics were actually paid by commission, and the finding there was that just providing these checklists listing the components of the cars that the mechanics should check increased revenue by about the same amount as a 1.5 percentage point increase in the mechanic’s commission, which is a pretty large effect, and it’s relatively cheap. To increase someone’s commission by 1.5 percentage points, you have to pay for that, whereas the checklist was relatively cost-free. So it was actually a very cost-effective approach to increasing revenue for the shop.
KATY MILKMAN: That’s really interesting, and I’m wondering if you have any thoughts on why checklists aren’t used much more widely given how incredibly beneficial you found they were, and at such a low cost.
KIRABO JACKSON: So in our context, what we basically concluded based on the data is that there’s essentially a moral hazard. Moral hazard relates to a whole set of situations where an individual bears the cost of some action, but the benefits accrue to somebody else. In this context, the mechanic is the one who has to exert the effort to go through the checklist. It could actually be kind of a pain in the butt. They may have to check some components they don’t necessarily want to check, and they’re not getting paid additional money to use the checklist. So it does potentially impose an additional cost on the mechanic, and all the benefits of this more thorough inspection are essentially going to the owners. Ideally, you’d want to compensate the workers for the additional effort that the checklist entails, and split the gains in such a way that the owner gets more money because they’re getting more revenue, and the mechanic is happy doing the additional work because they’re being compensated for doing so.
KATY MILKMAN: That’s really helpful, and I also just want to draw … we’ve talked a little bit about why you think checklists work, and it seems like what you’re highlighting is that they reduce forgetting and create accountability, and you’ve also mentioned that you think they’re particularly likely to come in handy in complex situations. Is that sort of your rule of thumb for when you think a checklist is most likely to be helpful, when the situation is complex, or are there other variables that you think matter too?
KIRABO JACKSON: I mean, my sense is that, certainly in terms of the memory aid component, when things are complicated, those are the kinds of scenarios where people forget things, so I think that’s exactly right. The other time where I think it’s really valuable is when things have to happen step by step, and they’re multi-step processes. It can just be difficult to keep track of a whole bunch of different steps unless you write things down and sort of make sure you’re staying on task. So I think multi-step situations that are complex are exactly the kind of scenario where you’re going to want to use those things.
One of the nice things to think about with checklists is that many jobs have a piece that is complicated or diagnostic, and another piece that requires exerting effort, acting on the checklist. By externalizing the keeping-track-of-things piece, you can focus a lot more of your effort on making sure you do the things that you need to do, if that makes sense.
KATY MILKMAN: That makes so much sense. Kirabo, thank you so much for joining me. This was great.
KIRABO JACKSON: Oh, it’s my pleasure. Thank you.
KATY MILKMAN: Kirabo Jackson is the Abraham Harris Professor of Education and Social Policy at Northwestern University. I have a link to his paper on checklists and worker behavior in the show notes and at schwab.com/podcast.
Checklists are a tool for anyone worried about what happens to the quality of their decisions when systems and situations become complex. Checklists help simplify and streamline our approach to solving problems that might otherwise prevent us from using airplanes, spaceships, operating rooms, businesses, and more, safely and successfully. We need them because we have a limited capacity to process and remember key information, particularly under stress.
But checklists aren’t the only tool you can use if you recognize the need for simplification. Cass Sunstein, my next guest and author of the books Simpler and Too Much Information, knows that better than most. Cass has devoted enormous time and energy to simplifying government programs and regulatory policies to improve outcomes for Americans. He’s done that by adding checklists, reducing paperwork, and streamlining systems. Cass joined me from his home in Massachusetts.
Hi, Cass. I really appreciate you joining me. Thank you so much for taking the time.
CASS SUNSTEIN: Thanks to you, Katy. It’s a great pleasure to be here.
KATY MILKMAN: First, can you talk a little bit about why simplifying processes by reducing paperwork and writing checklists can be so important when we’re in government or organizations, or generally?
CASS SUNSTEIN: Yes, well, let’s take an old-fashioned idea about people, which is that they run the costs and benefits, and they think, “Is it worth it?” They make a rational, intuitive judgment. “Should I do the thing?” If the thing is complicated, the rational judgment might be, “No, I’m not going to do that thing, because it’s too hard or because it’ll make me sad.” That might mean that they won’t sign up for something, maybe economic benefits or a health program that would really be great, but they’re rationally deciding it’s just too much bother.
Now, if we add to it that human beings often suffer and sometimes benefit from foibles, little behavioral findings, we can call them foibles if you like, such as, people might be present biased. They might think today really matters, and next year is a foreign country, Laterland, I’m not sure if we’re ever going to be there, or they might be unrealistically optimistic. They might think, “Well, I won’t do it this week, but surely I’ll do it next week,” or they might have a kind of scarcity in their head of things they can attend to, especially if they’re poor or really busy or impaired in some way. Maybe they’re older. Maybe they’re depressed. If those things are reducing their bandwidth, then the fact that something is complicated may mean that they’ll think some version of “Forget about it.”
KATY MILKMAN: I totally am with you on this and think this is so important. I was wondering if you could talk a little bit about some of your favorite research that illustrates the power of simplification to improve lives and outcomes.
CASS SUNSTEIN: Yeah. So we know that to get financial aid, if you’re a high school senior, you have to fill out a form, and for a long time, the form was crazy complicated, including requiring high school juniors and seniors to find out their parents’ tax information. At that time, a lot of people just didn’t get financial aid to which they were entitled, not because of present bias or unrealistic optimism, just because the form was not feasible to navigate. A study showed that simplifying the form can have the same effect on college enrollment as increasing the subsidy by thousands of dollars.
So pause over that: two ways to increase poor students’ access to college. One is to simplify the form. The other is to give each of them several thousand dollars more. Now, the form simplification is a lot more cost-effective.
KATY MILKMAN: It’s just so interesting that giving people thousands of dollars more is actually less cost-effective than simply making the forms easier to complete.
CASS SUNSTEIN: Yeah.
KATY MILKMAN: So another way of dealing with the issue of complexity is giving people checklists. Is that something you’ve worked on at all?
CASS SUNSTEIN: Yes, so I have a story. It’s not quite a study, but it has drama associated with it, and I’m going to tell it for the first time here, so here you go.
When I was in the U.S. government, all agencies of the federal government were governed, then as now, by a document from the Office of Management and Budget, which is over 50 pages and is about producing regulatory impact analyses. Now, that might seem like a dull and kind of technical matter, but if any agency, whether under President Obama or President Trump, President Bush, or President Clinton, is regulating, let’s say, highway safety or food safety or occupational safety or climate or something involving immigration, it has to do this analysis.
The analysis is often decisive to the outcome, but 52 single-spaced pages of technical material is pretty hard to absorb, even if this is your business, so I produced a checklist. It’s called the Regulatory Impact Analysis Checklist, and it’s available online and it’s really short. I produced it with my little fingers just summarizing the highlights of the 52-page document. It’s used every day by agencies in the federal government to figure out what they’re supposed to do when they decide what a rule should be involving, let’s say, occupational health. While some of these issues are really high-profile things that end up on the front page of the newspaper, and the technical analysis may or may not be crucial to what happens, often, in ways that will never make the newspapers, the well-being of people all over the country is improved because there’s a checklist there, which allows bureaucrats to figure out what they’re supposed to attend to.
KATY MILKMAN: So I want to take this from the realm of government to actually people’s personal lives, because I bet many of our listeners are going to be wondering about not how to better regulate government, but how to better regulate themselves. So if someone were worried about making mistakes in their own life when they’re facing complex systems or forms, what kind of advice would you have for them about how to set themselves up for success?
CASS SUNSTEIN: First, take a deep breath. Second, if it’s a complex matter, put it into component parts, and it gets a lot less scary that way, and maybe put a little list of the five or six things that are the components. So if one has a task, it might be … in a company, it might be … in a nonprofit, it might be … as a student, the task itself probably has a name, which is somewhere between daunting and terrifying, and if you break it up into the six things that the name is camouflaging, then it can be much more attractive.
If those six things, let’s say, or maybe it’s a little harder than that, maybe it’s 14 things, can be broken up in that way, maybe there’s one that can be done on a Tuesday, another on a Wednesday, another on a Thursday. Once you’ve done the two on the first days, you’re one-seventh of the way through. That’s a very cheering thought and can motivate continuation. So even an informal, in-the-mind checklist, with respect to daunting tasks, can be liberating.
KATY MILKMAN: That’s really interesting, and it’s also interesting to me, Cass, that the way you’ve talked about simplification is actually a little different than the way I’ve thought about it, so I want to dig in for just a second there. I used to think, before this conversation, “OK, simplification is really about trying to reduce forgetting,” but it seems like you’re also using it as a way to avoid present bias and procrastination and feeling overwhelmed by large tasks. Am I thinking about that right?
CASS SUNSTEIN: Yes. I mean, your way of thinking about it, of course, is completely fundamental, but I would include the others also. There’s a kind of architecture that your electricity provider or your internet provider might offer you, which can be complicated and messy and hard to navigate or not. There’s an architecture that, let’s say, a motor vehicle department might provide when you’re renewing your license, or that the criminal justice system might provide. Those can all be really complicated, and the way of simplification just involves scaling back or redoing the architecture, but we all in our individual lives can alter architecture. We can decide to automate things—like so that all the bills are paid automatically, and we don’t have to think about them—or we can do something like cover the next half-hour with respect to our schedule so that we don’t have to think as much about our schedule, because every meeting that’s going to occur on Wednesday at 2:00 p.m. is automatically in the schedule.
That’s a kind of trivial example, but there are big examples that I think each of us can probably identify that involve changing our architecture that can pay really long-term dividends. That does involve something like reminders, but it can involve something like just overcoming present bias.
KATY MILKMAN: I love that you brought up architecture given your important work on choice architecture and the way we structure our decisions and how much that matters. This was so wonderful. Thank you very much for taking the time to talk with me about this.
CASS SUNSTEIN: Well, these have been very challenging questions, but at least I haven’t had to put words on a page.
KATY MILKMAN: You’re pretty good at that, so you would have been fine either way. Thank you, Cass. I appreciate it.
CASS SUNSTEIN: Thank you. Great pleasure.
KATY MILKMAN: Cass Sunstein is the Robert Walmsley University Professor at Harvard Law School. He’s also the founder and director of the Program on Behavioral Economics and Public Policy at Harvard Law School, former administrator of the U.S. government’s Office of Information and Regulatory Affairs, and author of many books. His latest is called Too Much Information: Understanding What You Don’t Want to Know. You’ll find links to his book and more in the show notes or at schwab.com/podcast.
Breaking down your financial goals into clear, discrete tasks is an important step in creating an effective financial plan. This also helps you think about what could go wrong—and how you might respond to stay on track. Check out the Financial Decoder podcast episode titled “Are Financial Plans Just for the Wealthy?” to hear more about overcoming common barriers to getting started with a plan. You can find it at schwab.com/financialdecoder or wherever you listen to podcasts.
The topic of today’s episode hearkens back to the origins of the field of behavioral economics. It’s arguable that 1978 Economics Nobel laureate Herbert Simon laid the groundwork for everything that followed when he highlighted the limitations of people’s reasoning capacity in the 1950s. Simon’s background in computer science helped him see the brain as akin to a computer, with limited memory and limited processing power. Once these limitations were recognized by other economists, it was easier to start thinking about the systematic and predictable ways in which people deviate from making optimal decisions. Checklists, and simplification more generally, are critical to helping people who have limited time, memory, attention, and willpower. So the next time you encounter a complex task, consider whether and how you can simplify it, and if it’s one you or others might have to repeat, ask yourself whether creating a checklist could help reduce errors and improve productivity.
You’ve been listening to Choiceology, an original podcast from Charles Schwab. If you’ve enjoyed the show, we’d be really grateful if you’d leave us a review on Apple Podcasts. You can also subscribe for free in your favorite podcasting apps, and if you’d like to get my newsletter to learn more about behavioral science, you can sign up at katymilkman.com/newsletter. In the next episode, we’ll look at a curious phenomenon where people avoid information that may actually help them. I’m Dr. Katy Milkman. Talk to you soon.
SPEAKER 15: For important disclosures, see the show notes or visit schwab.com/podcast.