Data for the Win: With Guests Michael Kist & Cade Massey

February 4, 2019
Even where analytical models and algorithms outperform human judgment, it's still tempting to just go with your gut.

Listen on Apple Podcasts, Google Podcasts, or Spotify, or copy the feed to your RSS reader.

Transcript

After you listen

When it comes to financial decisions, trusting your intuition could lead to mistakes or missed opportunities.

Netflix recommendations, Amazon suggestions, Google searches, airline ticket prices, your social media feed. All of these things are driven by algorithms—computer models that crunch massive amounts of data to generate useful results. These types of online algorithms are commonplace and so, generally speaking, we're used to them.

But what about the algorithms behind self-driving cars or airplane autopilots? What about algorithms used to predict crimes or to diagnose medical conditions? These are domains in which it often feels uncomfortable to let a computer model make what could be life or death decisions.

In this episode of Choiceology with Katy Milkman, we're exploring the places where algorithms and computer models bump up against resistance from their human users.

  • Seeing as it's Super Bowl season, it seemed like a good time to revisit last year's contest as a case study in decision making. The 2018 Super Bowl champion Philadelphia Eagles played incredibly well against the formidable New England Patriots. The game could have gone either way, but the Eagles had a secret weapon that gave them an advantage. We speak with Michael Kist from Bleeding Green Nation on the Eagles' integration of computer models for decision making both on and off the field. You'll hear the story of how those models were temporarily abandoned and the team struggled before re-embracing them.
  • Next, we explore the way self-driving cars make split-second decisions on the road, with results that can make their human passengers squirm. We test whether or not giving people a small amount of control over how a self-driving car behaves gives those people a bit more confidence about the technology.
  • Then Katy speaks with her Wharton School colleague Cade Massey, who explains some of the fascinating ways that algorithms have improved decision making and looks at some of the scenarios where algorithms face an uphill battle for acceptance. Cade Massey is a partner in Massey-Peabody Analytics.
  • Finally, Katy recaps the ways that people designing—or simply using—algorithms can work to overcome our human tendency toward machine mistrust.

Choiceology is an original podcast from Charles Schwab.

If you enjoy the show, please leave a rating or review on Apple Podcasts.

Learn more about behavioral finance. 

Common Trading Mistakes to Avoid

For new market traders, review these common trading mistakes so you can avoid emotional blunders with your investments and take advantage of psychological edges.

Mary Anne's Story: How Can You Leave a Legacy and Honor a Loved One?

How did one woman honor her twin sister through a charitable gift?

Should You Start Giving Money to Your Heirs Now or Leave a Bequest?

Giving away assets in one's lifetime might make sense tax-wise, but the issue is deeply emotional and personal for most people.

All expressions of opinion are subject to change without notice in reaction to shifting market conditions.

The comments, views, and opinions expressed in the presentation are those of the speakers and do not necessarily represent the views of Charles Schwab.

Data contained herein from third-party providers is obtained from what are considered reliable sources. However, its accuracy, completeness or reliability cannot be guaranteed.

Apple Podcasts and the Apple logo are trademarks of Apple Inc., registered in the U.S. and other countries.

Google Podcasts and the Google Podcasts logo are trademarks of Google LLC.

Spotify and the Spotify logo are registered trademarks of Spotify AB.

0219-8M84