Speaker 1: White and gold.
Speaker 2: It's gold and white.
Speaker 3: That's what I see anyway.
Speaker 4: No way. That's black and blue.
Katy Milkman: You may remember an internet phenomenon from a few years back. It was a viral debate about whether a dress was blue and black or white and gold.
Speaker 6: Are you team black and blue or are you team white and gold?
Speaker 7: Before you get in a punch up and turn someone's arm black and blue, let's settle the crucial question.
Katy Milkman: People just could not agree, but many were convinced that the colors they saw were the true colors of the dress and anyone who disagreed just wasn't seeing the world accurately. They were sure the other people were wrong, ignorant, or maybe even lying. Each group believed their own perception was the objective truth.
Speaker 8: It is blue and black.
Speaker 9: It's not. It's white and gold. You can see it's white and gold. It's white and gold.
Speaker 8: No, it's not.
Katy Milkman: But of course, context, differences in perception, even differences in the screens people use to view the image all had an effect on what they saw. In this episode, we'll look at how we can easily be fooled into believing that our subjective experience of the world is objective and believing that anyone who disagrees with us must be biased or wrong.
Speaker 10: It's a blue and black and yellow and white dress.
Speaker 11: If I look at a different color for long enough …
Speaker 10: Oh my gosh. No, I'm seeing it differently now.
Speaker 12: Oh my gosh, it hurts my head.
Katy Milkman: I'm Dr. Katy Milkman, and this is Choiceology, an original podcast from Charles Schwab. It's a show about the psychology and economics behind our decisions. We bring you true and surprising stories about high-stakes choices, and then we examine how these stories connect to the latest research in behavioral science. We do it all to help you make better judgments and avoid costly mistakes.
Ken Adelman: Following Ronald Reagan was an army colonel holding a briefcase. In that briefcase were the nuclear codes that the president would use to blow up the Soviet Union.
Katy Milkman: This is Ken.
Ken Adelman: Hi, I'm Ken Adelman.
Katy Milkman: Ken was a U.S. ambassador to the United Nations and, later, arms control director in the Reagan administration. He's describing a scene in Reykjavík, Iceland, in 1986.
Ken Adelman: And following Mikhail Gorbachev was a Soviet colonel carrying a briefcase with the nuclear codes to blow up the United States.
Katy Milkman: Ken was in Iceland at the second of four summits held between Reagan and Gorbachev between 1985 and 1988. The purpose of the summits, which were held for the leaders of the United States and the Soviet Union, was to try to ease Cold War tensions and reach an agreement around nuclear arsenals.
Ken Adelman: It was a come-as-you-are summit. It was a surprise summit. It was going to be what we call "grip and grin," that the president would have these photo opportunities with Mikhail Gorbachev. Gorbachev would be happy about it, Reagan would be happy about it, and we'd go home.
Katy Milkman: But what started as a surprise summit quickly turned into something more substantial, more serious. If you lived through the 1980s, you'll remember that the threat of nuclear war loomed large. Movies like The Day After put devastating images of a nuclear holocaust in the public consciousness, and songs like "Russians" by Sting or "Land of Confusion" by Genesis kept global tensions top of mind.
Ken Adelman: And there was a real worry that the world was going to blow up on nuclear weapons, and when Ronald Reagan took over in 1981, it was a very, very tense time in the Cold War. And Reagan came in, and that was the number one issue in the world. Will we prevent any kind of nuclear holocaust? And is Ronald Reagan, this kind of wild cowboy from the West, capable of dealing with the Soviet leader?
Katy Milkman: Reagan, the film star turned politician, was vehemently anti-Soviet.
Ken Adelman: He couldn't stand their ideology of communism because he thought that freedom was a God-given right to every person on Earth.
Katy Milkman: According to historians, Reagan was extremely confident in his views. The way he saw the world, particularly how he viewed the Soviet Union, it seemed to follow that any objective person would see it the same way. Reagan thought that the Soviet Union would eventually collapse because communism was such an obviously flawed and unsustainable system, and he believed that the people who ran that system must be evil.
Ken Adelman: Reagan had said beginning in his first press conference as president of the United States in the White House right after he took office that the Soviets will lie, cheat, steal to expand their power everywhere in the world. He said that the Soviet Union was an evil empire. He said that it's the focus of evil in the modern world. It was a relentless public affairs and public diplomacy pitch of Ronald Reagan that the Soviet Union was destined to fail.
Katy Milkman: Gorbachev had his own set of deeply held beliefs. While he felt the Soviet system was imperfect, he thought it could be reformed, and he saw Reagan's rhetoric as dangerously misguided and simplistic. According to experts on Gorbachev, he felt Reagan's rhetoric threatened the fragile peace between the Soviet Union and the United States and endangered the world.
Ken Adelman: His feeling was that the United States was implacably against the Soviet system and that Ronald Reagan was a dumbbell who really didn't understand the issues very well at all.
Katy Milkman: And just like Reagan was anti-Soviet and anti-communist, Gorbachev was anti-American and anti-capitalist. These men were at opposite ends of the ideological spectrum. Their differences were profound, and they brought these differences with them to their very first meeting, which was in Geneva in 1985.
Reporter: Mr. Reagan, how's the meeting going?
Ronald Reagan: We haven't started.
Reporter: Well, how'd it go yesterday?
Ronald Reagan: Fine.
Reporter: Are you getting along?
Ronald Reagan: You can see that. Can't you?
Reporter: Now that's a picture. Tell us.
Katy Milkman: That first summit was a highly produced, highly scripted encounter.
Ken Adelman: We had spent about six months getting ready for the summit and preparing everything.
Ronald Reagan: Well, again, anything. As I say, we've agreed that we won't be doing any reporting until the meetings are over.
Katy Milkman: But the second summit in Iceland was a very different affair.
Ken Adelman: For Reykjavík, in October of 1986, our planning took about 12 or 14 days, so our preparation was very slim. You could say it was almost irresponsible that so little was done to prepare the president for the summit, but we didn't think he needed to be prepared.
Katy Milkman: Remember, this was supposed to be largely ceremonial, a photo op that might lay the groundwork for future summits, but it turned out to be much more.
Ken Adelman: The Reykjavík Summit was very unusual. It was in a windblown part of the city at a time when the weather changed almost hourly. It was done in a very small house called the Hofdi House. It was thought to be a public affairs event for Gorbachev to elevate his standing in the Kremlin, not a very serious affair.
Katy Milkman: But when they arrived in Iceland, the two leaders decided to take advantage of the time they had together, to see if they couldn't make some progress.
Ken Adelman: The basement of the Hofdi House was divided in half between the CIA on one half and the KGB on the other half, and they were all crammed together in this little house with their equipment to convey the nuclear orders around the world and whatever other message the president or the general secretary wanted to use.
Katy Milkman: Over that weekend, in October of 1986, the two most powerful men in the world met and talked together for over 10 hours without notes, without much preparation or guidance from their staff. Just two men in direct and candid conversation.
Ken Adelman: These were two men who were really flying solo, each of them. It really revealed to me what their inner beliefs were on the most important issue of their time, in a way that was unvarnished.
Katy Milkman: The stakes were incredibly high.
Ken Adelman: Reagan and Gorbachev were in a very small conference room talking about nuclear weapons. Outside the room were these two army officers holding their briefcase full of the nuclear codes, and when each session broke, they followed their Soviet leader or the American leader. So that was pretty eerie and kind of spooky.
Katy Milkman: Declassified transcripts later revealed that …
Ken Adelman: Ronald Reagan constantly made comments and statements about the nuclear issue that were quite contrary to the American position, including the position that was taken in the Reagan administration, and I know that Mikhail Gorbachev made comments that were absolutely different from what people in his Politburo, or especially the Ministry of Defense, thought. So these are the two men, unabashed and undisguised, getting to the real core of what they were about.
Katy Milkman: Reagan, who had always been distrustful of the Soviets, started to realize that Gorbachev wasn't just an ideologue. He was someone who genuinely wanted peace and stability. And Gorbachev in turn recognized that Reagan's fear of nuclear weapons wasn't just rhetoric. It was a deeply personal conviction shaped by his own worldview. By engaging in direct communication and allowing themselves to understand each other's motivations, they made significant progress. Of course, dialogue between the two men could only get them so far. The next steps fell to the Soviet and American diplomatic teams. Ken and his colleagues worked through the night with their Soviet counterparts.
Ken Adelman: I met the president at 8:00 in the morning, and I reported to him, "We accomplished more on nuclear weapons just last night than we had in seven years of constant negotiations with the Soviet Union."
Katy Milkman: The summit was supposed to end at noon on Sunday, October 12th, 1986, but sensing an opportunity for progress, the two sides let negotiations go into overtime. The parties decided to come back that afternoon to see if they could agree to something. At around 3 p.m., the American delegation had drafted a proposal for Reagan to take to Gorbachev.
Ken Adelman: We were very exhausted and really on edge. We wished Ronald Reagan good luck, and he left the room. About 10 seconds, 15 seconds, 20 seconds later, the door opens up and back comes Ronald Reagan, and all of us are surprised. Then he says, "Hey, fellas," and he has this piece of paper in his hands. "I'm just wondering if we're sure that this is good for America." And he went around taking attendance of all of us, to see if there was any disagreement with what he was going to propose to Gorbachev. There being none, he thanked us all, and he went out. We again wished him the best of luck, thinking this was the key moment.
Katy Milkman: Unfortunately, Gorbachev did not accept the proposal. This could have been for several reasons, but the impasse was likely due in part to how each leader viewed key points in the document. For example, the U.S. had deployed intermediate-range missiles in Europe. Reagan saw this as a necessary counterbalance to missiles deployed in the Soviet Bloc countries. He believed this didn't require negotiation and that the U.S. missiles zeroed out the Soviet arsenal, but Gorbachev viewed those U.S. missiles as "a gun pressed to our temple." You may also be familiar with Reagan's plan known as the Strategic Defense Initiative, SDI for short, also often referred to as Star Wars.
Ken Adelman: A very high-tech system protecting the country against incoming ballistic missiles.
Katy Milkman: Reagan saw this plan as purely defensive and key to securing peace. He said, "The point is that SDI makes the elimination of nuclear weapons possible." The problem for him was that the earlier Anti-Ballistic Missile Treaty prohibited the development of large-scale defensive systems. Gorbachev saw Reagan's Star Wars program as anything but purely defensive. He believed the system could be used to develop offensive capabilities and start another arms race in space, so Gorbachev wanted to keep the Anti-Ballistic Missile Treaty and tighten it further. Gorbachev said to Reagan, "You are proposing to renounce it. We want to preserve it. You want to destroy it. We just don't understand this."
Ken Adelman: Gorbachev said, "Oh my God, if we're not going to get them to eliminate SDI, then we have to compete."
Katy Milkman: These are prime examples of how two people looking at the very same issues could have fundamentally different perspectives, and it was those seemingly intractable differences that scuttled the deal.
Ken Adelman: Gorbachev would not give in. He would not accede to the paper, the very reasonable paper that Reagan had presented. Reagan was awfully mad. Reagan seldom, seldom got angry, but when he did, his Irish face would get all red, and he would get very disturbed. I could see him in the hallway with Gorbachev in Hofdi House. Gorbachev walked Reagan to the car, and Gorbachev was trying to make nice and said, "Mr. President, I don't know what I could have done different." Reagan was having none of that. He poked his finger right in Gorbachev's chest and said, "Well, you could have said yes," and then got in his car and slammed the door.
Katy Milkman: Reagan's disappointment was on full display as he left the summit. They had come incredibly close to a significant arms reduction agreement. While Reykjavík was at first considered a bust, we now look back on it as a landmark, historic meeting. The summit may not have resulted in an immediate arms deal, but it marked real progress in humanizing the two men to each other. They improved their understanding of each other's political and cultural contexts and laid the foundation for the Intermediate-Range Nuclear Forces Treaty just a year later, in 1987.
Ronald Reagan: We've made this impossible vision a reality.
Katy Milkman: By May 1991, the nations had eliminated 2,692 missiles, followed by 10 years of on-site verification inspections.
Speaker 15: The signing of the first ever agreement eliminating nuclear weapons has a universal significance for mankind, both from the standpoint of …
Katy Milkman: It's a testament to the power of perspective-taking in diplomacy. Instead of each side seeing the other as fundamentally wrong and dangerous, they started to understand each other's core concerns. It was through this mutual understanding that they were able to take the first steps toward what seemed impossible: de-escalation and arms control. Two men, each with their own perspective and strongly held beliefs, took those first steps to bridge a seemingly unbridgeable divide, and the connection they formed endured long after each was gone from power.
Ken Adelman: My wife and I were really honored to go to the funeral of Ronald Reagan in Washington, D.C., in 2004. It was a very big event, and we were very moved by it. What moved me the most happened the day after that. At the U.S. Capitol, under the dome, there was the coffin of Ronald Reagan. Unbeknownst to any of us, and I think unbeknownst to the FAA at that time, a private plane comes to the airport, now named Ronald Reagan Airport, and out comes Mikhail Gorbachev. He then goes to the Capitol. The guards recognize him. The guards let him in front of the roped off section, so he's a few feet from the coffin.
He stands there for a good period of time thinking over how much they had accomplished, how much they grew to like each other during all their negotiations, how much he grew to respect Ronald Reagan. And then Gorbachev approaches the casket, which is covered with the flag of the United States, and takes his fingers and goes back and forth on the stripes of the flag of the United States and looks at the casket. It was a very sad view. And that night, I remember the press asked him, "Boy, that was an amazing moment." And he said, "Well, that was an amazing person, and I felt so strongly about him. I just wanted to say goodbye."
Katy Milkman: Ken Adelman was U.S. ambassador to the United Nations before serving as arms control director for the Reagan administration. He's the author of Reagan at Reykjavik: Forty-Eight Hours That Ended the Cold War. You can find more details about Ken's work in the show notes and at schwab.com/podcast.
In a polarized world, it can sometimes feel impossible to bridge the divide between opposing views, even when people are operating in good faith. But the example of Reagan and Gorbachev demonstrates that putting yourself in someone else's shoes, attempting to understand the context and constraints and realities of the other side, can lead to understanding and compromise.
My next guest is an expert in a bias called naive realism, which helps explain why we have such a hard time accepting opinions that are different from our own. Julia Minson is an associate professor of public policy at the Harvard Kennedy School of Government. She's a decision scientist with research interests in conflict management, negotiations, and judgment and decision-making.
Hi, Julia. Welcome to Choiceology. I'm so glad to have you here today.
Julia Minson: It's great to be with you, Katy.
Katy Milkman: I'm really excited to talk about naive realism, which is a fascinating topic, and I was hoping we could just start with what it is.
Julia Minson: Sure. Naive realism is basically the belief that most of us walk around with most of the time, that our perceptions of the world and our reactions to the things we observe are reasonable and objective and basically unbiased. I see the world as it really is.
Katy Milkman: Julia, are you telling me that my perceptions of the world are not correct? Because I'm going to have trouble with that.
Julia Minson: Well, so it's really funny. So naive realism is an idea that actually comes out of philosophy, so we kind of stole it from the philosophers. We process all of our encounters with the physical world through a set of perceptual organs, and so what we experience in our brains is not actually what's out there. So when you and I look at a couch of a particular color, we're actually seeing slightly different colors because we have slightly different eyeballs, and so in a physical sense, we don't perceive reality as it really is. We perceive a version of it that's funneled to us through our organs. And then that's even more true when we think about our social reality. We think that we understand what's just or what's fair or what's reasonable in a particular setting, and we don't account for the filtering that happens through our background or our ideology or our level of education or whether we're tired or cranky or sleepy or had a glass of wine. We generally don't stop to consider how all of those factors influenced what we think of as the completely unmediated perception of just how the world is.
Katy Milkman: Could you talk about a few situations where you think it would be really natural to see someone exhibit naive realism? What are some of the contexts where this comes up?
Julia Minson: It is just the way we relate to information in the world, and it's not even necessarily a bad thing, right? If I am walking around the world assuming that I'm a rational, reasonable person, that saves me a lot of time, and most of the time, I am doing rational and reasonable things. Where naive realism tends to become more problematic is when we are dealing with more contested opinions and issues, areas where people could disagree. Ironically, the first thing that naive realism tends to do is make us not anticipate disagreement. One of the biases that is closely related to naive realism is the false consensus effect, this idea that because I'm a rational, reasonable person and I see how things really are, I expect other rational, reasonable people to agree with me. So people tend to overestimate the proportion of other people who share their tastes and preferences and attitudes, and that is thought to be related to naive realism.
But the bigger problem is when we see clear evidence of disagreement. I'm going around the world expecting people to agree with me, but then somebody says, "No, that's not actually how I see this." If we start considering the causes of that disagreement through a naive realism lens, it might lead us to say, "Well, I get it, because I'm seeing the situation in an objective way, and you disagree with me; ergo, you are not being objective. You're not being a reasonable person in this situation." And then, because humans are very good at making up stories for the events around them, you can spin up a whole story about what exactly is wrong with you that's making you be unreasonable in this setting.
Katy Milkman: OK. So if I were to extrapolate from what you just said, it sounds like anytime I'm interacting with someone who has a different background than I do, and we're talking about our opinions, say, on world events or on which stocks are good buys or on where to work or how to raise my kids or how they should raise their kids, these are all going to be situations where naive realism could cause conflict. Is that an accurate assessment?
Julia Minson: Yeah. You've basically just listed every situation where a human encounters another human, and I think that's exactly right.
Katy Milkman: OK. I want to get to the science. You've given us a really great tutorial on what naive realism is, but could you tell us about a favorite study demonstrating that people exhibit naive realism, or a classic way of measuring this?
Julia Minson: So this is actually one of my favorite studies that got me very excited about going to graduate school, and I participated in this research as a research assistant. So this was a study that was conducted by Lee Ross, who came up with the idea of naive realism in collaboration with Andrew Ward. When I was Lee's research assistant, he taught an applied social psychology course at Stanford, and so we had this large room of undergrads who were all in this class, and we asked them to fill out a survey about their opinions on a bunch of different policy issues that were at play at the time and you had to say whether you agree or disagree with a bunch of different things. And then we collected all these surveys, basically shuffled them around, and then passed out the surveys back to the students, except for you ended up with the survey of some other random student in the class. So you no longer have your own survey. You are looking at somebody else's responses.
And the question before you is: considering your own responses, which you had just filled out, and the responses that you are now looking at, how much do you think you overall agree or disagree with this other student's worldview? And so they get this seven-point scale of how much they agree or disagree, and then they have a bunch of questions about what is driving your own views versus this other person's views. And we give them a set of considerations about normative facts, normative considerations that could influence your beliefs. So things like a good understanding of American history, engagement in world events, the desire for fairness and justice. Those are things that you would want to influence your own policy opinions. And then we give them a bunch of things that we thought of as biases, so political correctness, the self-interest of the group you belong to, reading partisan media.
And so what we find is that people generally attribute their own views to be more driven by normative considerations than biases, and then to the extent that their counterpart disagrees with them, this other student whose questionnaire they're looking at, to the extent that they see that questionnaire as reflecting a different view of the world, they think that that worldview is primarily driven by biases rather than normative considerations. And what's interesting is that it's not simply like, "I think that my views are normative and other people's views are biased." It's that other people's views are biased to the extent that they happen to disagree with mine. And so I thought that was just a really neat demonstration, and I still teach a version of it.
Katy Milkman: I want to talk about one more research project before we move on to talk a little bit about the implications of this, because you have a great paper from 2012 where you looked at how naive realism affects the way we update our beliefs to incorporate other people's opinions, and I was hoping you could tell us a bit about that work.
Julia Minson: Yes, absolutely. So quite a lot of the early work I did, again, with Lee Ross and Varda Liberman, who is a researcher in Israel, had to do with making quantitative estimates. The great thing about quantitative estimates is that there are lots and lots of them in the world, and there are correct answers, so you can actually measure what strategy would be more effective if you're going for accuracy. And sometimes in research, we use silly examples, jelly beans in jars, but in the real world, those silly examples often map on to profoundly important things, so how many people would support this particular policy or how many inches of snow will fall in Boston next year, and therefore, how many snow plows do we need to prepare? So we make quantitative estimates in the real world constantly that actually matter.
And the thing that's interesting is that we often make quantitative estimates with the input of other people. So I have some estimate about the world, and you might have some estimate about the world, and if I have invited you into a conversation about what the right answer is, the question that is interesting from a naive realism perspective is how much weight am I going to give to your estimate? If I've made an estimate independently, and so have you, I should average those two estimates.
What we find instead is that people pretty consistently give more weight to their own judgment than to the other person's judgment, and in fact, they do so to such an extent that it's tantamount to saying that I am twice as knowledgeable as you are about this thing that I really have no idea about. So we have people estimate all kinds of things that they know nothing about, and they very consistently give twice as much weight to their own judgment as they give to the other person's judgment. So we know mathematically that averaging works better. The question is why don't people do it? An explanation that you would get from naive realism is that the further away you are from my estimate, the more I think that you're just wrong.
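To make the arithmetic concrete, here is a minimal sketch of why equal averaging tends to beat the two-to-one self-weighting Julia describes. It assumes both people's estimates are independent and equally noisy around the true value; the specific quantity and noise level are made up for illustration, not taken from the research.

```python
# A minimal sketch: comparing equal averaging with over-weighting your own estimate.
# Assumes two independent, equally noisy estimates of the same quantity (hypothetical numbers).
import random

random.seed(0)
TRUTH = 100.0      # the quantity both people are trying to estimate
NOISE = 10.0       # standard deviation of each person's independent error
TRIALS = 100_000

avg_err = ego_err = 0.0
for _ in range(TRIALS):
    mine = TRUTH + random.gauss(0, NOISE)    # my independent estimate
    yours = TRUTH + random.gauss(0, NOISE)   # your independent estimate

    averaged = 0.5 * mine + 0.5 * yours          # equal weighting
    ego_weighted = (2 / 3) * mine + (1 / 3) * yours  # twice the weight on my own view

    avg_err += (averaged - TRUTH) ** 2
    ego_err += (ego_weighted - TRUTH) ** 2

print(f"Mean squared error, equal weights:        {avg_err / TRIALS:.1f}")
print(f"Mean squared error, 2-to-1 own weighting: {ego_err / TRIALS:.1f}")
# Under these assumptions, equal weighting comes out with the lower error
# (roughly 50 vs. 56), which is the mathematical sense in which averaging works better.
```

Under those assumptions, the equal-weight average is reliably more accurate, which is why discounting the other person's estimate costs you accuracy whenever they are as well-informed as you are.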
Katy Milkman: What's the central reason that you think naive realism exists in the human mind?
Julia Minson: I think naive realism exists because it's efficient. There are lots of decisions that we have to make very quickly and urgently, decisions that have to do with safety or decisions that have to do with deciding if a person is a friend or an enemy, and in those cases, it makes sense to act fast. Now, it's impossible to tell if, on balance, it's a good thing or a bad thing. What I do think is that because it has benefits, it's very, very hard to overcome. If we wanted to just stop being naive realists, we'd have to accept being extremely slow in getting anything accomplished.
Katy Milkman: No, that makes sense. Like many of the heuristics and biases we've covered on the show, these things are useful on average but can be problematic in certain situations, and we want to be on the lookout for them. Julia, I'm curious if there's anything that this research has led you to do differently in your work or in your life?
Julia Minson: Yeah, so absolutely. It's funny because I've been doing this work for a long time, and I've been married for a long time, and occasionally my husband will call me a naive realist right in our kitchen, and that's an unusual experience. But I do think that this has tremendous impact, especially for relationships where there are two people. So if there is a team or a group, when there's a majority of people who agree on something, that helps sort you out. But when there are exactly two people, both of them can endlessly point the finger at each other and say, "You're the naive realist." "No, you're the naive realist."
Katy Milkman: Let's be honest, most people aren't using that terminology. That's unique to your kitchen counter. They're saying, "You're wrong. No, you're wrong."
Julia Minson: That's right. And I think in reality, all of us are naive realists almost all of the time, and that's incredibly important to keep in mind, so what I like to tell my students when I teach is that this is professional advice. This is also marital advice. When you disagree with a smart person, you are wrong 50% of the time. That's just a very good heuristic to remember. Half the time, you are the one who is wrong. You would do well to remember that, especially in either marital arguments or business partnerships.
The other piece of advice would be to actively seek out people who you expect to disagree with you, because I think we have such a strong instinct against it that unless you really make it a point to look for those people and get their input, you will just either avoid it or you will discount it when you come across it.
Katy Milkman: That's a great piece of advice to close on, Julia, because I think most of us find it uncomfortable to disagree with other people, and too often, we'll seek out advice from someone who we know is like-minded, but I think this research highlights that that is not setting us up for success and not setting us up for making the best judgment. So I think it's a great place to wrap, and I really appreciate you taking the time to talk about this important topic with us. Thank you.
Julia Minson: Of course. Thank you so much.
Katy Milkman: Julia Minson is an associate professor of public policy at the Harvard Kennedy School of Government. You can find links to her work in the show notes and at schwab.com/podcast.
Whether you agree or disagree with what's happening in Washington, there's no question that government policy can have a profound impact on the markets and your finances. On Schwab's WashingtonWise podcast, host Mike Townsend focuses a nonpartisan eye on the stories that matter most to investors. Check it out at schwab.com/WashingtonWise, or just search for "WashingtonWise" in your podcast app.
Being aware of naive realism is as important as ever in a hyper-polarized world, and since we opened with a story about political disagreement, you're probably thinking politics is the arena where this bias is most relevant. But actually, it's an important bias to remember whenever you disagree with anyone about anything. When your boss has a completely different assessment of how to value and pay a new employee, when your partner feels differently about the right bid to make on a new house, or when your neighbor suggests a different likely cost for remedying a problem with your property, remember that you're a naive realist and so are they, and most likely the right answer lies somewhere in the middle, statistically speaking.
When it comes to making go/no-go decisions and not just estimates of value, naive realism matters too. There and elsewhere, it's important to recognize that your own instincts aren't the only ones you should factor in. Of course, this doesn't mean you should ignore expertise. When your doctor has one view on your medical condition and your plumber has another, you should probably listen to your doctor. But try adding a dose of humility when you find yourself in disagreement with someone who's just as well-informed about a topic as you are. Consider pausing and asking yourself, what if they're right? These small shifts will go a long way towards improving most of your decisions and taking the temperature down when there's no real need for things to get heated.
You've been listening to Choiceology, an original podcast from Charles Schwab. If you've enjoyed the show, we'd be really grateful if you'd leave us a review in Apple Podcasts, a rating on Spotify, or feedback wherever you listen. You can also follow us for free in your favorite podcasting app. And if you want more of the kinds of insights we bring you on Choiceology about how to improve your decisions, you can order my book, How to Change, or sign up for my monthly newsletter, Milkman Delivers, on Substack.
Next time, I'll speak with University of Delaware professor Jackie Silverman about the power of tracking streaks when we're trying to build habits and how you can avoid becoming demotivated if you break a streak. I'm Dr. Katy Milkman. Talk to you soon.
Speaker 16: For important disclosures, see the show notes or visit schwab.com/podcast.