KATY MILKMAN: A warm hat is a good idea on a cold winter day. That's obviously true, but you've probably heard that we lose most of our body heat through our heads. A U.S. Army survival manual from 1970 suggested that 40 to 45% of body heat is lost from the head. Maybe your parents and teachers told you something similar. The thing is, it's not accurate. We only lose about 7 to 10% of our body heat through our heads, which is roughly the share of our body's surface area that the head accounts for. In short, any exposed part of your body releases heat at about the same rate. The idea that you lose a large portion of your body heat through your head is one of many supposed facts that we hear over and over again.
How about the idea that bats are blind? Nope. In fact, their vision may be better than human vision during low-light conditions at dawn and dusk. Or this one: Eating sugar makes children hyperactive. It certainly seems that way. Every time I take my son to a birthday party and let him eat cupcakes, he goes wild. But several high-quality studies have failed to show any reliable changes in children's behavior following sugar intake. So if these statements are false or inaccurate, why are they so pervasive, and why do they feel so true?
In this episode, we look at one possible explanation, a phenomenon that can cause us to believe inaccurate information more than we should. And it can also lead us to trust reliable information less than we should.
I'm Dr. Katy Milkman, and this is Choiceology, an original podcast from Charles Schwab. It's a show about the psychology and economics behind our decisions. We bring you true and surprising stories, and then we examine how these stories connect to the latest research in behavioral science. We do it all to help you make better judgments and avoid costly mistakes.
JENNIFER LEMESURIER: A little bit of MSG in scrambled eggs is really good. I tend to do fusion cooking at home, so I don't necessarily use it in all of my dishes, but you definitely notice when you have both salt and MSG. So for example, if I'm doing a batch of chicken broth, I'll do a little salt and a little MSG at the end.
KATY: This is Jennifer.
JENNIFER: My name is Jennifer LeMesurier. I am an associate professor of writing and rhetoric at Colgate University.
KATY: We haven't invited Jennifer on the show for her expertise in rhetorical theory, though I suspect we could have a really interesting conversation on that topic, too. Instead, we've asked her here to talk about her expertise in food and culture, and her experience with a much-maligned cooking ingredient, monosodium glutamate, also known as MSG. She started looking into MSG on a whim.
JENNIFER: I was watching a TV show, and a scientist said very casually, "Oh yeah, the whole controversy over MSG started because of one letter to The New England Journal of Medicine in 1968."
KATY: That statement piqued Jennifer's interest. She wanted to know more.
JENNIFER: I went to the University of Washington Medical Library, and I'm wandering through the stacks. I tracked down the giant journal, and I found the letter, and it's Dr. Ho Man Kwok writing in and asking, "Hey, I've noticed this problem when I go to certain Chinese restaurants." It says, "To the editor: For several years since I've been in this country, I have experienced a strange syndrome whenever I have eaten out in a Chinese restaurant, especially one that served northern Chinese food. The syndrome, which usually begins 15 to 20 minutes after I have eaten the first dish, lasts for about two hours without any hangover effect. The most prominent symptoms are numbness at the back of the neck, gradually radiating to both arms and the back, general weakness, and palpitation. I have not heard of the syndrome until I received complaints of the same symptoms from Chinese friends of mine, both medical and non-medical people, but all well-educated."
And then he ruminates a little bit on the cause. He says, "After some discussion, my colleagues and I at first speculated that it might be caused by some ingredients in the soy sauce to which quite a few people are allergic. Some have suggested that these symptoms may be caused by cooking wine. Others have suggested that it may be caused by the monosodium glutamate seasoning used to a great extent for seasoning in Chinese restaurants. Another alternative is the high sodium content of the Chinese food. …" Monosodium glutamate is only one possible cause that he lists.
KATY: Dr. Kwok's letter was an invitation to fellow medical professionals to dig into why some people seem to have unpleasant symptoms after eating Chinese food. But the letter didn't generate the kind of reaction he was expecting.
JENNIFER: I wanted to see what doctors said in response. So I kept looking at future issues, and it just sort of unfolded into the saga of doctors essentially calling him stupid, in more professional terms, and saying, "What are you talking about? MSG? That's the stupidest thing." Several doctors wrote satirical responses. One medical student wrote a sonnet making fun of the idea that MSG could be harmful to one's health. And then I was curious to see, OK, we have this group full of doctors making fun of this idea. How in the world did it become a panic?
KATY: The medical community wasn't convinced that MSG was a problem, but if you're over a certain age, you might remember friends and family worrying about MSG, particularly in American Chinese food. You might have heard that it caused headaches or even migraines, that it was bad for your health, even in small quantities. It turns out that this common notion grew out of the media response to Dr. Kwok's letter. News outlets picked up on his message and misinterpreted the medical community's skeptical response.
JENNIFER: I started looking at newspapers that had picked up this story, and what I found so interesting is that they were quoting the satire, and they were quoting the poetry, and they were quoting the mockery as serious. And they were saying, "Hey, these doctors are legitimately scared of this substance. We need to be scared about this substance." It was this weird game of telephone that spiraled into the public consciousness.
KATY: Let's step back for a moment and talk about what MSG, or monosodium glutamate, actually is. You're probably familiar with it as a flavor-enhancing ingredient in some Chinese food, but you might be surprised to learn that it's also naturally occurring in many foods from many different cultures.
JENNIFER: MSG is an amino acid. It can be manufactured scientifically, but it's also in lots of different foods that we eat all the time. It's in sun dried tomatoes. It's in Parmesan cheese. It's a dominant presence in a lot of Italian foods, a lot of processed meats. If you look at pepperoni, pepperoni is an umami-MSG bomb. And again, that's not bad. Just sort of the composition of the ingredients.
KATY: MSG has a fascinating history.
JENNIFER: It was isolated by a chemist back in 1908, Ikeda Kikunae, in Japan, and he was looking for something that would help people with nutrition. He was thinking, "OK, it's not just about flavor. It's about how do we help the citizens of Japan develop a more nutritious diet?" And so he isolated this substance, monosodium glutamate, and he was the one who originated the term "umami" as a label for the savory depth and richness of flavor that you get when you add monosodium glutamate to a lot of foods.
KATY: MSG was initially produced by extracting the amino acid from seaweed broth. Later it was produced by fermenting starch, sugar beets, sugar cane, or molasses.
JENNIFER: And then through various twists and turns, post-World War II, it made its way to the U.S., and it was used in a lot of Chinese restaurants.
KATY: But manufactured MSG didn't start off in Chinese restaurants. It first came to the U.S. in food served to members of the military.
JENNIFER: The military is one of the driving forces of food industrialization. If you think about something like canned soup, MSG is an easy way to smuggle in a little extra bit of flavor. And then if you think about the Army MREs, the ready-to-eat meals. Those are notoriously not great food. And so if you have an ingredient that can amplify a little bit of the savoriness and give it a little bit more flavor, you can see how that's an ingredient that would quickly get picked up by those sorts of companies and industries.
KATY: Interestingly, the general public didn't give much thought to MSG until after the publication of Dr. Kwok's letter.
JENNIFER: There was a long period of time where MSG was in household cupboards and on people's plates. I'm going to skip ahead a little bit to the 1960s, and all of a sudden, people were aware of the harm of pesticides and the fact that what you add to crops can eventually enter your body. There was also a scare around this time about artificial sweeteners. So it wasn't until the cultural ground was primed in the 1960s that people really started to be more aware of things like, "Oh, this has a very suspicious chemical sounding name: monosodium glutamate. Do I want to be putting that in my body?" And I think that's a fair question to ask, but it very quickly tipped into this panic that we see in the newspaper coverage from the time.
KATY: About a year after the Dr. Kwok letter was published, journalists began to pick up on the story. We asked Jennifer to read some of the headlines from the time. A warning: Many of these headlines are xenophobic or outright racist. One unfounded claim invokes former Chinese Chairman Mao Tse-tung.
JENNIFER: For example, later that year in The Washington Post, an article came out and the title was "Chinese Food Jinx Is Identified." A number of headlines used Mao even though he had absolutely nothing to do with it. A Washington Post article in 1968 was called "Mao's Thoughts on Wonton: Bad Elements in China are Blamed for Tainted Soup." An article in the LA Times was called "The Chinese Mystery." Probably the worst, most overt, racist headline was from the Chicago Tribune, where they used broken English to say, "Chinese Food Make You Crazy? MSG Is Number One Suspect."
So there were a lot of articles that, even if the headline was somewhat more neutral, would very quickly pun on wontons or talk about chop suey. It was all packaged in this stereotypical, othered, you-can-hear-the-accent-as-you-read-it sort of way.
KATY: These headlines fueled a panic about Chinese food in America.
JENNIFER: Chinese restaurant owners who are of Chinese descent have always had a hard row to hoe here in America. If you look at the history of Chinese food, they had to cater to American taste very early on. Chinese cooks who were from China had to pretty severely alter their recipes for American tastes. They had to add a lot more sugar. They had to really reduce the spice. They had to up the meat content and reduce the amount of vegetables according to the American palate. So MSG and the fear of it is just one more layer in terms of what Chinese restaurant owners have to be aware of.
KATY: And the fallout from this panic about MSG still reverberates today. I don't know about you, but I've certainly experienced this. I remember a recent Chinese lunch I went to with a fellow professor who politely asked our waiter to make sure that the kitchen left the MSG out of all of his dishes.
JENNIFER: Even today, you go into Chinese restaurants, and a lot of them will say "no MSG" on their menus. And if they don't, there's often a paragraph on the menu that explains why. And the fact that people have to pre-emptively try and educate their consumer if they choose to use this one ingredient, that shows how deep the fear still is in the public consciousness. And it goes beyond Chinese restaurants. I've also seen it at pizza places. You go to the supermarket, and so many items in the health food section will say "no gluten, no dairy, no MSG." It's still one of those ingredients that if you're concerned about health, you're probably concerned about MSG, even if you don't really know why.
KATY: That's the interesting thing. Despite MSG's supposed negative health effects being largely debunked over the years, the idea of MSG as harmful has persisted. One of the reasons it persists is that early rumors of its dangers were repeated widely and often by the media and by word of mouth. I remember hearing from friends and family as a kid that MSG was no good. You probably did, too. Through that repetition, the MSG myth came to be accepted as MSG truth.
Of course, debunking the claim that MSG is bad for you is not the same as saying it's good for you. Like any other salty compound, it's surely not healthy to eat too much of it. But this particular amino acid is not the health hazard it was purported to be for many years.
JENNIFER: Culturally, it's a lot more accepted now. I think a lot more chefs have MSG shakers in their pantry. And as far as I'm aware, there's not an uptick of people with random spells of dizziness or nausea or numbness at the back of their head.
KATY: The FDA considers MSG to be safe to eat and has since the 1990s. But the myth of its dangers is so persistent that there are still ongoing efforts to counter false information about MSG and to remove the stigma surrounding its use.
JENNIFER: A lot of the reason that people now are starting to realize that MSG is OK is because of explicit activism. For example, the chef David Chang is well-known for talking, on his TV show and on his Instagram account, about how MSG is just an ingredient. It's like any other ingredient. He's got an episode in his show Mind of a Chef where he goes and buys the giant gallon tub of it and is like, "Yeah, this is great. I use this all the time."
KATY: Jennifer's main takeaway from her explorations of MSG and fad diets and nutrition is that a lot of Americans fear food without scientific justification.
JENNIFER: I understand. I'm a '90s kid. I grew up in the time when E. coli was big, and you really had to cook your burger until it was dark, dark brown, or you were going to die. So I understand, and I think we should of course be smart and be cautious and think about food safety. But I also think that a lot of the myths that we believe are the ones that are most repeated that we don't question. And for me, it's really sad to think about the ways in which that makes us scared of our food, scared of something that should be about taste. It should be about pleasure. It should be about community. It should be about culture.
And instead, for a lot of people, it turns into restriction. And no, you can't have that, and only those people eat that sort of food. I really hope that as people reconsider their stance on something like MSG or Sweet'n Low or something else you've been told a hundred times is bad for you, you also take the opportunity to look for more joy in your food and on your plate.
KATY: Jennifer LeMesurier is an associate professor of writing and rhetoric at Colgate University in Hamilton, New York. She's also the author of the book Inscrutable Eating. You can find links in the show notes and at schwab.com/podcast.
The story of MSG illustrates a flaw in the way we judge the trustworthiness of information. To explain that flaw, I want to take you back in time to 1977, when Lynn Hasher, David Goldstein, and Thomas Toppino brought a group of college students into a laboratory on three separate occasions, with each visit made two weeks after the last. During these sessions in the researchers' laboratory, participants were tasked with reading a list of statements, some true, some false, and rating their perception of each statement's accuracy. Some of the statements, about things like politics, sports, and the arts, appeared repeatedly across the sessions.
Other statements about the same types of topics weren't repeated. Interestingly, across the three sessions, participants progressively rated both true and false statements that were repeated as more accurate than statements that appeared just once. This was the first study to scientifically showcase the influential role of repetition in shaping our beliefs about the validity of information. The researchers dubbed this phenomenon the "illusory truth" effect. It remains a hot topic of study today.
A surge of interest in the illusory truth effect coincided with the rise of social media. More recent research by my next guest shows that the illusory truth effect plays a pivotal role in people's judgments of blatantly false stories circulating online.
Tali Sharot is a professor of cognitive neuroscience at University College London and an affiliated professor in MIT's Department of Brain and Cognitive Sciences.
Hi, Tali. Thank you so much for taking the time to talk to me today.
TALI SHAROT: Thanks, Katy. It's great to be here.
KATY: First, I was hoping you could just describe for us what the illusory truth effect is.
TALI: Yeah. So the effect is basically if you hear something repeatedly, you're more likely to believe it regardless if it's true or if it's false. That's one reason that we believe things that are not true. A lot of people believe that you use 10% of your brain or that vitamin C can prevent the common cold. So the more you hear it, the more you believe it.
KATY: Could you describe your favorite classic study that illustrates the illusory truth effect? Is there a favorite research study you have that shows this happens?
TALI: Yeah. So I think maybe the first study is the best to illustrate it. So the first study was published in 1977. They gave people 60 different statements. Some of them were true, and some of them were not, and 20 of them were repeated across the three sessions, and the others were not repeated. What they found is that the statements that were repeated were more likely to be believed.
So that's the first demonstration. But since then, this illusory truth effect has been shown so many, so many times. So it's definitely something that is easy to replicate. It's been shown in different populations, different gaps of time between the first time and the second time you hear it. So it's a very, very strong effect.
KATY: What do we know about why this arises? Why is it that when I see the same statement multiple times, I am more likely to believe it's true?
TALI: So there's probably a few reasons. So one reason is that when you hear something again and again, your brain processes it less and less. It makes sense. The first time you hear something absolutely new, like, for example, a shrimp's gut is in its head, your brain has a lot to process. It maybe comes up with an image of a shrimp or the last time you ate a shrimp. So it does a lot of processing.
But the second time I tell you a shrimp's gut is in its head, then the brain doesn't process it as much. So less and less processing. And what happens is when we process something less, it feels more familiar, right? It's less surprising. And we're used to familiar things being true. And so the result is that you're more likely to accept something, and the more familiar it is, the truthier it is. We don't stop to think about it. It's just like we've heard it before, the brain doesn't respond, and we just accept it as it is.
Now, a related reason is that we're quite good at remembering what we've heard before, but not necessarily where we've heard it. So you can hear statements from a lot of different places, and some of the sources can be quite dodgy, but you don't necessarily remember what the source was. So it's harder for you to figure out, "Well, I've heard this before, but was it a reliable source, or wasn't it a reliable source?"
KATY: It's so fascinating and fits into the heuristics and biases literature because it seems like familiar things probably are more likely true than new things, on average. It's just that this is going to lead us astray in situations, particularly where someone is deliberately trying to mislead us or where there's an often-repeated rumor that isn't true. I would love if you could also tell us a little bit about your recent research suggesting that the illusory truth effect can drive the spread of inaccurate information.
TALI: So it's been shown so many times that things that you've heard more than once, you believe more. And so we thought, "Well, if you believe it more, are you going to share it more?" Because it turns out there's a lot of research that shows that people actually want to share true information. Yes, people sometimes go online to share misinformation. There are all these malicious actors. But most of us, we want to share true information. So we thought, "Well, let's see if we show people statements more than once, not only will they believe them more, they'll want to share them more."
So we did something quite simple. We showed people 60 different statements. We actually did two experiments. One experiment was all health statements. The other experiment was really a combination of statements from history and biology and so on. Half of the statements were repeated, and half were not repeated.
KATY: How were they repeated? Was it just in the course of taking a survey experiment, or were they repeated over the course of time?
TALI: It was all in a short amount of time. It was all within one hour.
KATY: Even more interesting.
TALI: So you see 30 statements once, and then you see the whole 60 statements, one at a time. And for each statement, we asked two questions, and the order was random. Either we asked you how accurate it is first, or we asked you, "Do you want to share it?" Sometimes we said, "Do you want to share it with the subjects that are coming later on?" Sometimes we asked, "Do you want to share it on your Twitter account? Imagine that you have a Twitter account that's about general information. Would you like to share it?" And we found that statements that were repeated more than once, people believed them more, and they wanted to share them more. We did what's known as a mediation analysis, and that suggested that the reason people share them more is because they believe they're more accurate.
KATY: That's really interesting. I'm curious what gave you the idea to do this research? You mentioned the concerns about what we share and accuracy, but what made you think the illusory truth effect would be an interesting thing to study in this way?
TALI: I guess I think a lot about misinformation and misinformation spread and why we believe things that are untrue. And especially now with social media platforms, you have so much misinformation, and that can cause a lot of really bad things: polarization, believing health information that's not true. And so the idea that just seeing something more than once makes you share it more suggests that there's this vicious cycle. What we think is happening is some misinformation is actually constructed to grab your attention, make you want to share it more. Maybe it enhances emotion and so on. OK, so people share it to a certain extent, and then the user sees this statement, this piece of misinformation, more than once, a few times. So that makes them believe it and want to share it more. And so they share it more. And then the other users see it more and more and more.
So it's highlighting this mechanism that's probably happening on social media platforms that is causing us to believe things that are not true. It also suggests that we really need to try to address statements that are not true as fast as possible. I think in this case, it's all about stopping misinformation from being repeated if it's online or a social media platform. So that really goes into the greater question of how do we address misinformation and fake news and so on online?
There are a lot of different solutions that people have suggested. To my mind, there are perhaps two ways that I think would be helpful. A lot of the work is about the user. How do we educate the user to figure out what's true or false? And that's very hard, and I don't think it has to be up to the user to a large extent. It has to be up to the platform. And of course, that's also difficult. How does a platform figure out what's true and false? But it's not impossible to at least reduce it, right?
I'm not saying you're not going to have misinformation at all. So that's one thing. And I think the other thing is to enhance the likelihood that misinformation won't be shared and will be highlighted by the users themselves, by incentivizing users to want to share true information and to avoid sharing false information. So one kind of solution that we have suggested and tested is relatively simple. It's not going to solve everything, but it helps, which is to add two buttons, a trust button and a distrust button.
The reason that we think something like a trust button will help is because a like button is by definition unrelated to accuracy. That's just what I like, and I can like something that's not true, and that's perfectly fine. But a trust button by definition is about veracity. Do I trust this? Do I believe it's true? So what we found is that if you add these buttons, a few things happen. One is people actually use them to discern true from false information more so than any other buttons like retweet and like. And number two, once that happens, users get used to this idea that, "I'm in a new platform where I can get these carrots in the form of trust if I put in true information. I can also get sticks in the form of distrust if I put in misinformation."
So they start sharing more true than false information, and they also learn slowly, slowly what's true and false from other people's feedback. So we have to think about how can we change the incentive structure of the social media platform to enhance accurate information out there and reduce misinformation.
KATY: That's really interesting. Do you have any advice for our listeners about how they can improve their everyday decisions to try to avoid making mistakes related to the illusory truth effect now that they're aware of it?
TALI: Yeah. I want to say first of all that it's very, very hard. And this is one of the reasons that I'm saying it has to be more of a platform policy. The best way is to check the source. And then if something feels a little bit suspicious, check it out before, for example, you retweet. If it's not important, it's not important. But if you're going to act based on the information, it's good to take a few seconds to just Google it and see: Do you get hits from reputable sources?
KATY: That's great. No, I love that. Basically, what you're saying is once we start to understand that there are biases that can contort how well we assess the truthfulness of information, that should make us more skeptical of any information that we might instinctively want to believe. And so if we can just be a bit more skeptical generally, that's going to be protective, not just against the illusory truth effect, but probably against a whole range of biases in the way we judge what's accurate.
TALI: I think there's a key word that you said, which is "things that we want to believe." I think that's kind of the problem that we're seeing now, because nowadays people are so aware that there's misinformation online that it's so easy for us, with anything we don't want to believe, to say, "No, that's not true." And that's a whole other problem as well, because with anything that we feel, "Oh, I don't want that to be true, it doesn't fit with my ideology, it doesn't fit with my group," it's so easy to say, "Well, you never know. I never know if this video is true. I never know if this photograph is true." And that's another problem. In the last few years, people have focused a lot on believing things that are not true, and I think there's also the problem of not believing things that are true.
KATY: Such a good point. Maybe that's a really nice place to stop: with another problem for another episode. Thank you for the really interesting research you've done and for telling us about the illusory truth effect, which of course was first studied, as you said, in the 1970s but is more relevant than ever in the 2020s. I really appreciate you taking the time to talk with us and the work you've done on this topic.
TALI: Thanks for having me. Thanks for the questions.
KATY: Tali Sharot is a professor of cognitive neuroscience at University College London and an affiliated professor in MIT's Department of Brain and Cognitive Sciences. She's also the author of several books, including a terrific recently released book with past Choiceology guest and behavioral science superstar Cass Sunstein, titled Look Again: The Power of Noticing What Was Always There. You can find links to her books and her research in the show notes and at schwab.com/podcast.
One realm in which often-repeated maxims can take on the illusion of absolute truth is stock trading. Think "The trend is your friend" or "Don't catch a falling knife." Of course, no rule of thumb works all the time, and a lot of these sayings even contradict one another. On a recent episode of the Financial Decoder podcast titled "Should You Trust Popular Trading Proverbs?" Mark Riepe and his guest, Nate Peterson, unpack several trading cliches to understand the kernels of truth that they offer and how the reality of executing trading strategies is much more nuanced. Check it out at schwab.com/financialdecoder or wherever you get your podcasts.
If you're choosing between two stocks, and you decide you'd rather invest in the first company because you've heard good things about it so often, you might be making a mistake. Some companies are more often discussed simply because they're consumer-facing or more newsworthy, but they aren't necessarily better investments than their less-discussed peers. The thing is, you might start to believe they're unusually good investments thanks to the illusory truth effect, but that would be a mistake.
The same risk applies to often-repeated financial advice. Maybe you've heard this one: To ensure a comfortable retirement, just stop buying coffee out and make it at home instead, then set aside those savings. The trouble is, if you took that advice and thought that your financial future was secure, you'd be up a creek. In short, the illusory truth effect teaches us that when we hear things over and over again, we accept them more readily, regardless of whether they're accurate.
So the next time you make an important decision on the basis of any piece of information, it's wise to remember how easily we can all be swayed into believing something false is true and do a little extra fact-checking. Maybe you're considering buying a house in a neighborhood because you've heard it has great schools. It's probably worth researching the school district rather than assuming the information you've heard from a few friends is fully accurate.
Or maybe you're thinking of trying a new health product that you've heard is terrific. Doing a little research on whether it's actually evidence-based couldn't hurt. Because our minds can struggle to sort out truths from falsehoods, fact-checking your assumptions when making important decisions often has big payoffs.
You've been listening to Choiceology, an original podcast from Charles Schwab. If you've enjoyed the show, we'd be really grateful if you'd leave us a review on Apple Podcasts, a rating on Spotify, or feedback wherever you listen. You can also follow us for free in your favorite podcasting app. And if you want more of the kinds of insights we bring you on Choiceology about how to improve your decisions, you can order my book, How to Change, or sign up for my monthly newsletter, Milkman Delivers, on Substack.
Next time you'll hear about why October surprises can be so pivotal in presidential campaigns and how a doctor's experience with their last patient can bias the treatment you receive. I'm Dr. Katy Milkman. Talk to you soon.
SPEAKER 4: For important disclosures, see the show notes or visit schwab.com/podcast.