How would you feel if, during your trip to Agra, India, someone offered to sell you the Taj Mahal (one of the greatest wonders of the world)? Reading this, you might question the offer in all its absurd glory, but not everyone did. One of India's greatest con artists, Natwarlal, aka Mithilesh Kumar Srivastava, not only sold the Taj Mahal to unsuspecting foreign tourists but also made money selling a few other historical monuments, and not just once, but multiple times. No one likes to be conned, fooled, or taken advantage of. However, everyone has fallen prey to someone else's deceit at least once, because we are wired with a truth bias, and scammers use our own cognitive beliefs, habits, and assumptions against us.
On this episode, research neuroscientists, co-authors, and collaborators Dr. Christopher Chabris and Dr. Daniel Simons discuss their most recent book, “Nobody's Fool: Why We Get Taken In and What We Can Do About It,” and explain what classic and current research in cognitive psychology and the social sciences says about our vulnerability to deception and fraud. Focus, critical thinking, discernment, and questioning ourselves critically are some of the effective ways of managing our truth bias and activating our executive function to protect our future selves.
About Dr. Christopher Chabris
Christopher Chabris is a Professor at Geisinger, a Pennsylvania healthcare system, where he co-directs the Behavioral Insights Team. He previously taught at Union College and Harvard University, and is a Fellow of the Association for Psychological Science. Chris received his Ph.D. in psychology and A.B. in computer science from Harvard. His research focuses on decision-making, attention, intelligence, and behavior genetics. His work has been published in leading journals including Science, Nature, PNAS, and Perception. Chris is also a chess master and co-author of the bestseller The Invisible Gorilla (published in 21 languages) and the forthcoming book on deception and fraud, Nobody’s Fool.
About Dr. Daniel Simons
Dr. Daniel Simons is a professor of psychology at the University of Illinois where he heads the Visual Cognition Laboratory and has courtesy appointments in the Charles H. Sandage Department of Advertising and the Gies College of Business. Dan received his B.A. from Carleton College and his Ph.D. from Cornell University. His research explores the limits of awareness and memory, the reasons why we often are unaware of those limits, and the implications of such limits for our personal and professional lives. For more information, visit dansimons.com.
About Host, Sucheta Kamath
Sucheta Kamath is an award-winning speech-language pathologist, a TEDx speaker, a celebrated community leader, and the founder and CEO of ExQ®. As an EdTech entrepreneur, Sucheta has designed ExQ's personalized digital learning curriculum/tool that empowers middle and high school students to develop self-awareness and strategic thinking skills through the mastery of Executive Function and social-emotional competence.
Sucheta Kamath: Welcome back to Full PreFrontal, exposing the mysteries of executive function. I'm your host, Sucheta Kamath. And I believe that by tying the findings from neuroscience, psychology, and education into everyday transformations, a lot can happen for our personal and collective growth. This podcast is, of course, fueled by three goals. One is to explain what executive function is, how it's crucial for our personal development, interpersonal relationships, and self-sufficiency, but also its implications for moral development. Second is to help motivate our current self to investigate our blind spots and find ways to grow and change, particularly by consulting experts like our guests, who have spent their lifetimes bringing advances that may open our eyes to possibilities. And lastly, to really help people create a playbook for personal success, because the framing of executive function is that these are malleable skills, and they grow and develop exponentially with exposure, experience, practice, and expert feedback. Today's topic is about gullibility, at least that's how I would put it. And I would say no one likes to be conned, fooled, or taken advantage of; however, many of us fall prey to someone else's attempt to con or fool us so that they can take advantage of us. In preparation for this podcast, I made a list, and I'm publicly willing to admit this, of situations where I was taken advantage of, and I could easily list 10 situations. I'll begin with one. This was my first time in the US, in 1993, and I saw a catalog; I had never seen a catalog where you could order things. And these were Christmas ornaments. Now you have to mind my context: I came from India, had never seen a Christmas tree, never had a Christmas tree, did not know what ornaments were. And the ornaments took up a full page, beautiful, handcrafted, and the description was eloquent.
And so I spent $40 that I did not have, which was a lot considering $500 was my rent and my scholarship. The ornaments came, and I had been imagining they would go on a real tree. But the ornaments came in one tiny square box, a box that fits in the palm of my hand. They were miniatures. Nowhere had they written the word miniature. Maybe it was implicit, and I was a fool to not understand this language; maybe I was a true foreigner. But I was devastated. And of course, I had no idea how to return things or understand anything about how business is done. But that experience did not teach me anything, is what I'm trying to say. So with that, I would love to welcome two prolific, incredibly celebrated neuroscientists and psychologists who have been rocking the field and bringing information to us. It's with great pleasure and honor that I would like to introduce our first guest, Dr. Daniel Simons, who is a professor of psychology at the University of Illinois, where he heads the Visual Cognition Laboratory and has courtesy appointments in the Charles H. Sandage Department of Advertising and the Gies College of Business. Dan received his BA from Carleton College and his PhD from Cornell University. His research explores the limits of awareness and memory, the reasons why we often are unaware of those limits, and the implications of such limits for our personal and professional lives. And second, a returning guest who was on the podcast in 2019; it's a great pleasure to have you back. Dr. Christopher Chabris is a professor at Geisinger, a Pennsylvania healthcare system, where he co-directs the Behavioral Insights Team. He previously taught at Union College and Harvard University and is a Fellow of the Association for Psychological Science. Chris received his PhD in psychology and an AB in computer science from Harvard.
His research focuses on decision-making, attention, intelligence, and behavior genetics. His work has been published in leading journals including Science, Nature, PNAS, and Perception. I hope I'm getting all these pronunciations right, because I'm terrible at it, so forgive me. Chris is also a chess master and a co-author of the bestseller The Invisible Gorilla, which he co-wrote with Dan. And we are having them both together because, as we speak, they are launching their new book, Nobody's Fool. The full name of the book is Nobody's Fool: Why We Get Taken In and What We Can Do About It. So welcome to the podcast. How are you today?
Daniel Simons: Good. Thanks for having us on.
Christopher Chabris: Yeah, great. Great. Great to be back.
Sucheta Kamath: Wonderful. So yes, since your Invisible Gorilla book, which was a shocker to my system, thanks for making me aware of my blind spots. To begin with, I was wondering, since the title of your book is Nobody's Fool, can you talk a little bit about the origin of this phrase, Nobody's Fool?
Daniel Simons: You want to go, Chris? Sure.
Christopher Chabris: Well, the title actually was, I believe, suggested by the publisher; we had been working with a bunch of different ideas for titles. And the phrase nobody's fool refers to, you know, a person who can't be tricked, who can't be conned, who can't be deceived. So unfortunately, we can't promise that at the end of reading our book, you will never be conned or tricked or deceived again, but at least you'll be better at avoiding those kinds of things. I think it's a good phrase to sort of sum up the idea: you want to aim to be nobody's fool, but it's like a lifelong process of, you know, getting better at recognizing the signs and understanding how deception works, and so on, in order to get there. So it's an aspirational, you know, title, for our readers and for ourselves, too.
Sucheta Kamath: I love that. Dan, do you want to add something?
Daniel Simons: No, I think that pretty much covers it. You know, there are obviously other famous books with the same title. In the publishing industry, titles aren't copyrighted. So, you know, there's a famous book by Richard Russo by that name, which had a very different theme to it. And there are much earlier works titled Nobody's Fool, including a few movies. And as we talk about in the book, making reference to things that are familiar to people can sometimes make people find them more likeable.
Sucheta Kamath: That's great. So to begin with, I think the premise is why we get taken in and what we can do about it. So can we pause and talk a little bit about our trusting nature? I feel like trust, as part of being a human being, is an interesting moral quality. Isn't this something that aligns us to become more socially appropriate or socially adept, or forces us, or invites us, to collaborate with the world? So why do we trust, and then why do some people break that trust?
Daniel Simons: Well, I mean, one of the central themes in our book is that almost all forms of deception depend on our assumption that other people are being truthful, right, what we call a truth bias. And if we didn't have that, we just couldn't function in the world. We couldn't interact with other people. We couldn't work as a society, because we have to rely on the fact that other people are generally being truthful, right, that we should trust what they have to say. And the problem comes in that sometimes we assume that people are telling the truth when they aren't, right. But the vast majority of the time, people are typically being honest with us. They're telling us what they think, what they mean, what they believe, what they're going to do, and we should trust them, right? So we can't go through life constantly second-guessing everything everybody says and does; it would be unproductive, it would be counterproductive. Whereas if we rely on people to be truthful most of the time, then most of the time, we're going to be fine. And the key thing that we talk about throughout the book is recognizing when we should maybe ask a few more questions, when we should maybe be a little bit more critical. And we try to identify what are the signs that you might be in a situation where that should happen, right, where you should be doubting the truth of what you're hearing and remaining uncertain a little bit longer.
Sucheta Kamath: So one interesting thing about that is, I think that this truth bias, or trusting others as a default mental state, allows us to actually propel our mutual life goals; otherwise we would be full of cynicism and doubt, and paranoid in a way, right? And that's no way of conducting business. And I loved one of the lines in your summary, where you had said that, you know, if we keep second-guessing everybody, we will kind of shrink our influence; everything about the way we work through life will shrink. So then why is it that people deceive or lie? Because, I mean, it's so easy to, you know, benefit from somebody, right? So do we know enough about why people lie?
Christopher Chabris: Well, I have to say that our book is not really about why con artists and cheaters and liars do what they do, right; we sort of assume that those people are out there, that, you know, they're trying to trick us somehow, they're trying to deceive us. And we really explore why it is that we fall for it. However, I think it is relevant to say that one of the reasons why they do it is because they can get away with it. And one of the reasons they can get away with it is, you know, the design of our cognitive system, starting with the truth bias, as Dan discussed, but then also, you know, some of the other cognitive habits we talk about in the book, and some of the other sort of features of the informational environment that hook us, that draw us in and make us pay attention, and so on. You know, it turns out to be sort of, I don't want to say shockingly easy, but, you know, somewhat easy to take advantage of those things and create, you know, a pathway in to taking advantage of someone. We don't really get into questions of, you know, sociopathy, mental illness, the morality of cheating, you know, very much, except to note that there are some signs that it's on the rise in recent years. And that could be because technology makes it easier; there could be lots of reasons for that. We mostly try to focus on sort of how it works, and therefore, based on that understanding, what we can do about it.
Daniel Simons: And I'll add to that: the other thing that we do sometimes comment on is the fact that people who do try to rip other people off tend to do it repeatedly. So if you look at their past histories, people who are committing fraud and convicted of fraud often had prior convictions for fraud, and people who, say, commit scientific misconduct, if you look back in their past, they probably did it before, right. So in that sense, you can see some consistency there. But as Chris said, we don't really focus in on the motivations of the people who are trying to cheat us; it's much more about what is the set of vulnerabilities that we have that makes it easy for them.
Sucheta Kamath: And I love that. So let me begin with this: do you have a favorite con or scam that you came across while working on this? I mean, right now, Netflix and every platform is full of con artists and their stories, and they're jaw-dropping and enticing. And you can drool over it, because they're so juicy too.
Christopher Chabris: Well, let's see. I mean, there are so many of them, so I'm going to pick a very commonplace one that everybody has talked about, which is the Theranos story, with Elizabeth Holmes and her business partner, Sunny Balwani, who were both, you know, convicted of fraud in federal court and sentenced to jail; she's actually in jail now. But this all goes back, you know, 15 years or so, to when Elizabeth Holmes founded this sort of biotech company that was, according to her vision, going to create a device which would revolutionize the blood testing and diagnostic testing industry by enabling you to just, you know, prick your finger, take out a tiny little amount of blood, something that barely even hurts, no needles in the arm and all that stuff, and stick it in a little machine, maybe the size of a toaster or, you know, a small microwave. And, you know, minutes later, get diagnostic test results out that would tell you whether you have HIV infection, whether you're pregnant, you know, what your cholesterol is, all manner of different tests; thousands of them, at one point, I think they claimed this device would be able to do. And, you know, one thing that happened was they never built the device successfully; they never succeeded in building a device which lived up to these expectations. But that happens all the time; you know, technology sometimes doesn't work out. The problem was that they claimed to have done it already. And they claimed to have deployed it. And they claimed to be, you know, offering that service to customers, and to investors, and to their board members, and so on.
So the fraud was sort of in saying that these devices existed and they could do all these amazing things, when in fact, what was happening in real life is they would sort of take a blood sample, you know, with a finger prick, maybe add some water to it, or some other solution, to sort of, you know, make it more liquid, and take it to some standard gigantic machine in a back room that took hours to deliver the results. So there was actually sort of an aspect of theatrical deception to it, right? They would have this machine there and sort of pretend that it was doing things, and it had a special mode where it would make all the noises without doing anything, just to sort of convince people that it worked. But in reality, the whole operation was going on, you know, almost literally behind the curtain. I think that fraud is interesting, you know, for a lot of reasons, but one of which is that, as we analyzed it, we found that, you know, Elizabeth Holmes took advantage of many of the different things that we talk about in the book, many of the different habits that we have, and the weaknesses that we have.
Sucheta Kamath: You know, I have read everything possible, and seen every documentary and heard every podcast on Elizabeth Holmes myself. And one thing that struck me, as you were saying too, is that she had no background in science. So she actually did not know how much blood you need to conduct any test. And that didn't deter her. To me, I wouldn't even come up with that fraud, because I feel so handicapped that I need knowledge, and she had no qualms about it; that was very impressive. So that's a great place for us to start. I love the way you have divided the book into two parts, habits and hooks. And you talk about the cognitive shortcuts and thinking patterns that are designed to optimize our functioning, but that also can become a handicap for us. So in this Theranos case, can you maybe relate some of the cognitive shortcuts that we engage in as a way to optimize our processing to what Elizabeth did? I don't know if she was that savvy, to know how to deceive, but once she got entangled, she continued to deceive really well.
Daniel Simons: Chris, why don't you go ahead and continue with that?
Christopher Chabris: Sure. So yeah, I think your question contains a very good point by itself, which is, we don't say that all of these con artists and scammers, you know, understand cognitive psychology really well and have a list of the things they're going to try to exploit, and so on. It's more like guild knowledge or trade knowledge: you try a bunch of stuff over time, and you see what works. And our approach was to kind of try to decode that and say, okay, let's come up with a framework that explains what they're doing and why it works. We're not saying they know all these things in advance, but they sort of, you know, have the experience. It's like magicians: they can be great magicians without ever taking a visual perception course in college, right? They learn how to do it, you know, from the trade and from practice and all that. So I don't know how she learned how to do these things. But one thing that I would say they definitely took advantage of is they stocked their board of directors with a lot of familiar figures. It was a very unusual board of directors for a biotech medical technology company. There were no other CEOs of comparable companies on their board of directors; there were no venture capitalists who normally invest in that industry on their board of directors. Instead, Henry Kissinger was on their board of directors, and George Shultz.
Sucheta Kamath: Novices
Christopher Chabris: Yeah, well, novices in biotech and blood testing, but, you know, geniuses in foreign relations and politics and so on, right. But recognizable names, sort of people, you would think, like, you know, everybody would recognize their names; you'd think, these are great Americans, these are, you know, people with good judgment, and so on. So this was exploiting, inadvertently perhaps, the hook that we call familiarity: things that are familiar to us tend to seem more legitimate, acceptable, you know, credible, right? So if you have a bunch of well-known names on your board, people might invest, especially considering the kinds of investors they were going after. Again, not biotech-savvy investors; they were going after investors like, you know, publishing industry magnates, and, you know, the offices of rich families that had money to invest, and so on, not necessarily experts in the sector. We talked separately to a hedge fund manager, you know, who told us something like: the more retired generals you see on a board of directors, the more you should want to short the company's stock. That is, you know, retired generals on a board of directors are sort of a sign that what they're trying to do is impress you, you know, with prestige and respect and, you know, qualities other than expertise in this industry, in this business, and in this scientific technology. I think that was one of the clever things that they wound up doing.
Sucheta Kamath: Excellent insight, Chris. I was wondering, Dan, what do you think about this particular hook being used without any expertise in how the hooks and habits can be exploited?
Daniel Simons: Yeah, well, I mean, I think people do kind of know how to take advantage of the things that they find persuasive, right? Most of the people who are, you know, like Holmes, are really good salespeople, right. And salespeople have a lot of experience at persuading people, and, you know, there's a long history of persuasion techniques that they just kind of fall into; whether they know them and have studied them, probably not, but they fall into them because they're good at what they do. Another habit that Elizabeth Holmes and Theranos took advantage of is focus, right? So when they did those demonstrations for, you know, industry executives or possible donors, they had them focusing on just what they were seeing. They're seeing them take a blood sample; they're seeing them stick it into a machine that's making beeping noises and flashing lights; then they go on a tour of the facility or go to lunch. And while they're doing that, they're not being shown all of the other information, right? People assume that what they're seeing is all there is, and they don't think to ask, hey, is that the machine right there that's actually doing this analysis? And if they'd asked that, they'd realize, no, it's actually not. But people don't think to ask about what they're not being shown, right. So that habit of focusing in on something and not thinking about the information that you might not have is a pervasive and mostly really effective habit, right? It's really efficient to focus on what you've got and not worry too much about the things you're missing. But when there's somebody who's set up a fraud or a scam like this, they take advantage of that.
Sucheta Kamath: You know, talking about focus: just yesterday, I was at an event, and it was a little surprise party, a celebratory party. My husband and I walked in; this was a lunch, and we didn't know what the surprise was. It turned out my girlfriend's daughter, who got married a year and a half ago, is pregnant, and they were celebrating that; it was great. And so they began to tell me the story. The young woman who's pregnant, her brother was coming from Florida. So she and her husband went to the airport to pick him up. And they had prepared a special sign. On it, you know, they had written, let's call him maybe Ravi, "Welcome to Atlanta, Ravi." And next to Ravi, it said "Ravi Uncle." So both husband and wife were standing at the airport with the sign, and he's like, "Hey, oh, I'm so excited that you came." He didn't think much about it, because instead of an Uber, it's like, my sister came. They got into the car, and the husband and wife were a little puzzled because he had no reaction. So they again flashed the sign and said, "Hey," and he's like, "Yeah, thanks." And then finally, they got so sick of it, and they said, "Read the sign." And then he read the sign, and he said, "Oh, Uncle, am I going to be an uncle?" Then it dawned on him. He had glazed over the sign because he thought it was the typical "welcome home" sign, you know. And so that's a great illustration of focus. Like, we think we are very focused; he chose to see whatever he wanted to see, which is his name and a welcome sign. He was not even contextually questioning, like, why would my sister, who picks me up every time, come with a sign, you know what I mean? So that's the blind spot in attention that you talk a lot about.
So I'm wondering if you could take a moment to maybe share some studies that have really captured this nuance: that we not only are taken in by the information that's present in our visual field or processing field, but we also kind of get locked into not asking any questions about what's missing.
Daniel Simons: Let me take that on a tangent for just a second, because that's a really interesting case. One thing it reveals is a theory-of-mind issue, right? Because the people who have the sign are thinking, oh, whoa, this is so obvious; he couldn't possibly miss this, it's on the sign. And a lot of fraud works that way, right? We see somebody who fell for a fraud, and we think, oh, yeah, that's so obvious; I never would have fallen for that. When in reality, it's only obvious when you know it's there, when you know to look for it, when you know what's already happened. But in the moment, you don't necessarily think about those things. You're focused on what's right in front of you. And that's part of the danger here, right? The reason we tend to think, oh, only gullible people get scammed, is that when we hear about these stories, as you were saying, they're all over Netflix and documentaries, and there are all of these great podcasts and stories about fraud and scams. And when we're watching from the outside, after it's happened, it all seems kind of obvious: like, oh, yeah, how did they fall for that? It was just right there in front of them. They just, you know, missed it. But in the moment, it's much harder, because we don't really think about all of the information that's right there. So it leads to this mistaken belief that, you know, people who fall for frauds are gullible, when in reality, we can all fall for them if we're the ones who are targeted in that moment. I think that's a really critical example of why this sort of focus matters. We're just not thinking about all of the information in the moment; we really are focused on what's right in front of us. And some of our own studies have kind of looked at this sort of metacognitive belief.
So The Invisible Gorilla study, right, is one of those cases where, you know, people are counting passes, and they're focusing on that intently, and a lot of people don't notice a person in a gorilla suit walking through the scene. And when you ask them about it, they don't believe you. But the critical thing to think about is: what if we never asked them? Say we showed them the video, they don't see the gorilla, they don't comment on it, and we never ask them about the gorilla. They'll continue to go through life assuming, of course, that they would see a person in a gorilla suit if it walked right in front of the camera. Because we're aware of all of those times we've noticed things; we're not aware of all the times we didn't notice something. So unless it gets called to your attention, you don't think about what you missed.
Sucheta Kamath: Yeah, and the other experiment that you have done, where there's a staged fight going on, and you're running on a campus following somebody, and you miss the fight. Like, how would you miss the fight? And so much of our detection, our law, our practices are dependent on people making good judgments based on being very aware and alert and attentive. And we're barely getting through life. I mean, I don't know how we are showing up at work, or on Zoom right now, on this Riverside platform.
Christopher Chabris: One reason we did that study, where we actually had people run past a staged fight and see whether they noticed it, was because there was a jury trial where a police officer was convicted, essentially, you know, for lying, when he said he didn't notice some other officers beating up a suspect. And so we were trying to sort of replicate those conditions as much as we could in a laboratory experiment, but also sort of, you know, push the envelope of how big an event we can show in a scientific experiment that people could fail to notice. The original experiments on this phenomenon had, like, little dots flashing in the corners of screens, and so on; then we have, you know, people walking through scenes in gorilla costumes, and so on; but three people fighting, you know, well, simulating fighting, of course; in the experiment we didn't have anyone actually get injured. It really shows, I think, that, you know, there's a surprising gap between the amount of stuff we can miss and what we believe we couldn't miss, right. And that sort of gets back to Dan's comment about the retrospective, you know, what people think in retrospect. You don't realize that at the time, people made a decision, they had lots of options, there were lots of things they were focusing on and not noticing. But then when you see how it all ended, right, you can only see one pathway back, you know, to what actually happened. And you think, oh, well, that's ridiculous; how could they possibly have missed that? It's kind of like a form of outcome bias, where the way the story ends sort of makes everything that happened in it, you know, more prominent, right? You no longer think about the things that could have been done, or weren't done, or weren't considered, and so on.
Daniel Simons: That's something that magicians use all the time, right: they set you up to think about one possible mechanism for their magical effects. And once you've locked on to that, you have no chance of figuring out what the actual method was, right? They've walked you into one interpretation, one sequence of events. And it's really hard in real time to think about all of the different paths that they could be using right now to get to the same end state; we just don't do that. We take what they've claimed to have shown us as the path, right. And it's core to a lot of magic effects, right, to misdirect people by getting them to think about one explanation when it's not the right explanation.
Sucheta Kamath: And I also see that: my husband and I are the greatest fans of magic, and he's far more into it than I am. And it's so interesting, because we kind of make the whole magic-watching a meta process, which is so annoying, because you're not only not enjoying the magic, but you're also kind of not figuring it out. So it's the worst experience. And then, one time we were in Boston, and we went to see David Copperfield, and we were so certain we were going to figure this out. And he was on the stage; I don't know if you've seen this particular act, where there's like a wind blowing, and there's a windmill, and there's a curtain, and then he's standing right next to us, like literally next to us. My husband and I were whispering about where he could be, and he's standing right next to us. So it is very clear to me that I walk into setups really well, is what I've concluded.
Daniel Simons: I've heard some magicians say that skeptical, critical thinkers, academic types, are among the easiest to fool. Unless you've had magical training, you're going to lock in on one possible method you think you've figured out, and once you've done that, you're ignoring everything else. That means you have no chance of finding out how they did it, because by the time you figure out what you think they've done, it's already done. It's too late.
Sucheta Kamath: Right? Yeah, I'm feeling very sad right now about myself.
Christopher Chabris: It's really hard. I'm going to be a little bit defensive on this one and say that I've gone to a couple of mentalism shows, and I think I have been able to figure out some of what's going on. But if I reflect on it, well, I've read a book on mentalism, and I've been talking to people about mentalism for years, and so on. I think that maybe widens my field of vision a little bit. So I can think: when the mentalists pretend to have figured out obscure facts about people just by looking at them, what they probably really did was look those people up online beforehand and know they'd be in those seats, or somehow get them to volunteer. I don't know if that's right in any particular case, but there certainly have been cases where it turned out that mentalists were using social media before the show to find out things about people in the audience. So we know that's one of the tricks they use. But to go in without knowing anything about magic, without having studied it, and then figure you're going to figure it out? Dan knows more about this than I do, but I think they're ready for that one. The people who think they're just going to figure it out sitting there in the audience? That's their bread and butter.
Daniel Simons: Well, even professional magicians can fool each other, because they know a lot of different ways of producing the same effect, but they don't necessarily know which one you're using and how you've set it up. Shows like Penn and Teller: Fool Us are entirely based on that idea. Penn and Teller know, for the most part, every method people might be using, so they should have good guesses, but sometimes they still can't figure out which one was used because of how it was set up. So even as a professional magician, which we are not, you can enjoy a magic show even if you think you understand how it's done, because the odds are good that, if the performers are good, they're going to fool you.
Sucheta Kamath: And the mechanism of being taken in is so psychologically complicated. You have a great quote: all of us are capable of being fooled, probably in more ways than we realize, and more often than we are willing to admit. So to me, the other built-in problem is: how do you tell somebody you were taken in? It's so embarrassing. The people who fool you not only take your resources, they also strip you of your dignity. I don't think you can admit that in public.
Daniel Simons: And fraud is probably massively underreported because of that. Especially things like the Nigerian email scam: people who fall for it realize in hindsight how much they missed, and they're embarrassed. So a lot of fraud probably goes unreported because of that embarrassment at having fallen for something.
Christopher Chabris: I was just going to add, there are a lot of poignant stories about victims of financial scams, for example Bernie Madoff's. Many people were extremely embarrassed. It ruined their lives in two ways: one, they lost all their money, or at least it took them a long time to get some of it back out of the bankruptcy process after Madoff's Ponzi scheme went bust. But then they also had to live with the knowledge that they had been victimized in this way, and with the knowledge that other people knew they had been victimized, because it was a public process; you had to go public, in a way, in order to get your money back. I think it's true that there's an enormous stigma around this. I'm not saying everybody should be happy about getting fooled, but there are a lot of scammers out there and a lot of ways to get fooled. You should read our book, and also realize that there are still scams that might be waiting for you even so. Hopefully you'll have a better shot at not getting taken in, but there are so many different possibilities.
Sucheta Kamath: This is, sorry, a gross comparison, but there's research on sociopaths and psychopaths, and, I don't know if this is accurate, supposedly close to 200 serial killers are out there. Your book reminded me that there are at least that many serial sociopaths taking advantage of people and engaged in active fraud as we are recording this. It's kind of discouraging, but I love your compassionate bent on it, because you're saying: have some self-compassion, this is not your fault. Sometimes you can be vigilant and sometimes you're not, and that's okay. I'm saying that very broadly. So let me ask you the next question: how many more reminders do we need? How many fabulous films and books and stories and reports do we need to really know that scammers are at work? They continue to do a fabulous job, and we continue to be victimized. Why haven't we learned to avoid them?
Daniel Simons: I think one of the key things is that a lot of those stories are really engaging stories. Most of the ones we hear about, the ones that become documentaries or podcasts, are the grander scams: things done at scale or requiring a lot of sophistication. And I think we don't learn from them because they're great storytelling, and we view them from the outside, watching other people get scammed. They're not focused on the individuals who were victimized: what their thought processes were, what their default assumptions were, leading up to their being scammed. Instead we get the idea of this really clever con artist who manages to fool everybody. And it's great storytelling: you've got victims, you've got this amazing, brilliant person tricking everybody, you've got the law trying to catch up to them. It's great narrative. There have been stories like that going back as long as scams have existed, which is pretty much as long as people have. But we don't seem to learn from them, I think, because they don't really focus on what we need to pay attention to in order to avoid getting scammed. They continue to be great stories because they're great stories, but that's not necessarily what gives us the lessons we need in order to recognize scams. That said, if you watch a whole bunch of these, you do start to notice some similarities across them, and you might start to recognize things you shouldn't do. For example, you may have listened to podcasts or watched movies about call-center scams, where they call people up and say: hey, your immigration status is in danger, you need to send us cash right away, go to Walgreens, get a cash card, and read us the number. Those sorts of high-pressure scams.
If you listen to that sort of thing enough, you'll realize that no official organization will ever ask you to go buy a prepaid cash card and read off the number; it's always a scam. So you can learn what not to do in a specific case by listening to these, but you don't necessarily generalize beyond that to novel variants. And that's where I think it helps to really focus on the cognition that underlies being victimized.
Sucheta Kamath: I'll give you a quick story. Somebody who works for me texted me out of the blue on a Tuesday afternoon: I'm at Walmart, how many cards do you need? Should it be $50 or $100? And I'm like, whoa. I picked up the phone, called, and said, what are you doing? Why are you at Walmart? She said, didn't you text me that you need gift cards? I said no. And I told her, from now on, pick up the phone and call me if you get a text from me that sounds aberrant; we don't give cards to anybody in our business. Anyway, I think that captures this everyday gullibility: somebody says your boss needs gift cards, and suddenly you're at Walmart, ready to buy.
Daniel Simons: It's a really common scam. And there's the sort of positive version of it, hey, we need gift cards for people, so go buy them for me, versus the horrific ones: you're going to jail unless you send us money immediately. But they all have the same structure. And how was she to know it was a scam?
Sucheta Kamath: Yeah. So what are the cognitive processes at work? The person who worked for me, for example, got captivated. It sounds like a failure of focus, multitasking, and just not investigating, right?
Christopher Chabris: Yeah. One thing that makes the truth bias a dominant mode of thinking is not having much time or attention, or being distracted. One theory of truth bias holds that we accept whatever we're told, and it takes extra effort, more thinking, and a little bit of time to relabel it as questionable, or false, or of unknown provenance, or anything other than true. That's one thing. Then, obviously, familiarity gets used: scammers will often try to tap into a relationship that already exists, which is what happened here. Some scammer somewhere was basically exploiting the relationship between you and your colleague. A lot of times it's family relationships that are used in some of the worst scams. In the case of Bernie Madoff, he was a familiar figure; he was trusted; he was involved with lots of Jewish organizations and philanthropies. There was a lot of familiarity involved there. It's the same things over and over again in these scams. And I suppose there's one more thing we should consider, which is really hard to notice sometimes: you don't know how many people they tried the scam on before they got to your employee. It's not as if it worked the first time they tried it. They might have sent out 100 or more text messages like that until they finally found someone who started playing along, and then they'll try to milk that one for as many gift cards as they possibly can over time. That's how these scams often work: they get one person on the hook,
and then they try to get as much money as they can out of them, over and over again.
Daniel Simons: Yeah, my mother had exactly the same scam happen to her a week ago. A person in her building supposedly sent her an email saying, hey, I'm sick, would you be willing to go get me a gift card that I can give to my kid? And that happened to be somebody she knew in the building. Probably that person's email was hacked, and the scammer sent the message to every single person in their email directory and just happened to catch my mom, who knew them. If you send it to everybody in someone's email directory, you're going to find people they know. So they might have sent out a thousand emails just to hook anyone. My mom, fortunately, thought it sounded a little weird and called the person, though not before trying to reply to the email and getting no response. It's easy to fall for these sorts of things.
Christopher Chabris: She had kindly already read our book, and gave many helpful edits. So she was armed with the necessary defenses.
Daniel Simons: A little skeptical going in, yeah.
Sucheta Kamath: Another cognitive shortcut, a habit you talk about, is prediction. Can you speak a little bit about how that works, and how it makes us victims of our own life experiences?
Daniel Simons: Sure. You can think of prediction as working from our base rate of experiences: what we expect to happen is likely to be similar to what has happened in the past. So we anticipate what we're going to run into. That's the basic idea: you make predictions about what's going to happen, and you maybe have a favored outcome. But we don't tend to question whether the prediction was good in the first place. When somebody hands us a result that looks really appealing, we don't think: okay, was this something I predicted, and if it was, maybe I should be more skeptical. Because what a scammer is going to do is give you exactly what you are hoping for, exactly what you desire. And if it's exactly what you desire, you should be questioning it. We give an example of journalists who were fed a document that perfectly matched what they hoped to see. This goes back to the George W. Bush era, when CBS was reporting on documents about what Bush had or had not done during his military service. There was lots of speculation about questionable behavior and about not fulfilling his responsibilities, those sorts of things. And somebody gave them a document that perfectly fit the narrative they hoped for. So they ran with it, and it turned out to have been a fake. It was exactly what they predicted. When somebody comes to you with exactly what you were predicting, or exactly what you were hoping for, that's probably when you should be most suspicious, not least suspicious.
Sucheta Kamath: It brings to mind that you're really talking about higher-order thinking. The habit of questioning is something you can develop and learn as a critical thinker, and I feel not many people are critical thinkers. I'm sorry, and I don't mean that in a mean way. But one of the ways we educate people brings them to a place where they rely on information being correct and truthful; that's how you gain knowledge. And if you question, you look like an outsider.
Daniel Simons: This is something you should be questioning for yourself. Everybody makes this mistake, even scientists who are trained to evaluate things critically. Think about it: if you look at your data and the result looks really weird, you're going to double-check and make sure everything was coded correctly. But what if the results came out exactly as you predicted? Would you be as likely to double-check? Would you find the coding mistakes you'd made if the result matched what you were hoping for? In general, you wouldn't. You wouldn't scrutinize something you agreed with as carefully as something you disagreed with. That's just a general tendency, and we do it most of the time: our predictions mostly match up, and we want to find things that look like what we want to find; we want to confirm our beliefs. So it's in those cases where somebody might be taking advantage of us, or where we might be fooling ourselves, that you really want to get into a mindset of thinking more critically. That's hard, but I think the key is noticing when you're in one of those situations. As somebody who does a lot of data analysis, I know I'm going to check much more carefully if something doesn't look right. Most of the time, that's a valid reason to check. But ideally, we should have something built into our process so that we check all the time, and not just when it looks weird.
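[Editor's note: Dan's point about building checks into the analysis process, rather than checking only when a result looks surprising, can be sketched in a few lines of code. This is purely an illustration; the function name, checks, and thresholds below are invented for the example, not taken from the book or the speakers.]

```python
# Illustrative sketch: run the same sanity checks on every analysis,
# unconditionally, instead of only when a result looks surprising.
# All names and thresholds here are invented for this example.

def sanity_check(values, expected_min, expected_max):
    """Return a list of problems found in `values`, or [] if none."""
    problems = []
    if not values:
        problems.append("no data")
        return problems
    if any(v is None for v in values):
        problems.append("missing entries")
    clean = [v for v in values if v is not None]
    if clean and (min(clean) < expected_min or max(clean) > expected_max):
        problems.append("values outside the expected range")
    if len(clean) > 1 and len(set(clean)) == 1:
        problems.append("all values identical (possible coding error)")
    return problems

# The key habit: call this before interpreting any result, whether or
# not the outcome matches what you predicted or hoped for.
scores = [72, 68, 75, 71, 70]
issues = sanity_check(scores, expected_min=0, expected_max=100)
assert issues == []  # checks pass; only now interpret the result
```

The design point mirrors the conversation: because the check runs every time, it catches coding errors even when the numbers happen to confirm your prediction, which is exactly when you'd otherwise skip the double-check.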
Sucheta Kamath: In my executive function training work I use a concept called spot checking: you pick five random spots in your own work where, if something aberrant existed, it would stick out. And if you find no error, you can actually question why there is none, because aberration is the norm, at least the way I see it. It's funny you mention this, because last week Evie, our producer, who's on this call, and I were in a meeting looking at a graph, and the data had shifted from one level to the next to a third, with a huge jump. I kept saying in the meeting, I have never seen this kind of jump; what incredible progress; but I've never seen it. And then slowly Evie came forward and said, I think I juxtaposed the data; it's not supposed to be like that. I was so enamored by the progress that if I hadn't kept saying that again and again, until it registered, we would have gone along reporting something erroneous, because we were so mesmerized.
Christopher Chabris: Exactly. This happens all the time. I've been in meetings, and I'm sure we've all been in many meetings, where a pleasant-looking graph is shown and almost nobody questions it. In fact, you feel bad, a bit antisocial, if you're the one person who says: wait a minute, what could possibly explain that, aside from "we're doing great, much greater than we thought"? It can go the other direction, too. If something starts going badly, people might then start to think more critically: wait a minute, is that data real? Was it analyzed correctly? But it's the satisfaction of expectations, the prediction, the outcome we hope to see, that disarms the critical thinking. So I would put a little gloss on what you said earlier: it's not that there are some critical thinkers and everybody else is not a critical thinker. I think everybody can think critically; the question is whether they apply it to a random sampling of situations, or only when their own hopes and expectations are not met. There are some very nice scientific experiments showing, for example, that people on the political right will question data suggesting that gun control reduces crime, but won't question data suggesting that it increases crime. The same goes for climate change and many other examples. And it reverses: it's not that people on the left are always thinking critically; it flips depending on whether the data supports your preferred outcome.
The trick is to become a critical thinker who's not governed as much by that. You have the potential within you to think critically. People who are anti-vaccine will very critically pull apart pro-vaccine studies. They might be wrong, but they are applying critical thinking. It's not that they can never think critically; they do it when they don't like the outcome.
Sucheta Kamath: I have a theory about this. I've had Tim Pychyl on, who studies procrastination, and he says all time management is pain management. If you think about it, the issue we're talking about is really that the truth hurts, as they say. When you see something unfavorable, it leads to discomfort, and we basically want to avoid the discomfort. Take Bernie Madoff: there was a wonderful documentary, which I watched again in preparation for this, and in it a man in Europe who was investing money said, I had never come across anybody who instructed their clients never to talk about the investment opportunity. He said that was anti-investment: if you're the person making tons of money, you want tons of money coming your way. He said that alone gave him pause, and when he raised it, everybody would pooh-pooh him. So I think the discomfort is the issue: the data shows amazing progress, great, yay, but what if it's not true? That pain is what we need to learn to manage. And it sounds like there's also a little bit of greed associated with the joy that we're not coming to terms with, maybe?
Daniel Simons: Greed, or a kind of reinforcement of our beliefs. People like having what they think confirmed, and if the things you're seeing are confirming it, you don't think to question them. That's the real challenge here. There are interesting cases: in all of these scams, there are people who didn't fall for them, and the interesting question is, what was different about them? What were they doing differently from the people who did fall for them? The example you just gave is somebody saying: wait a second, what investment manager running a giant fund wouldn't want to advertise it and try to attract more investment? In the same way, people might ask: why aren't you giving me all the details of these trades? Or: why is there no variation in your results? There are lots of cases where people didn't fall for it. My favorite recent one is Taylor Swift, reportedly the only celebrity who, when asked to endorse a cryptocurrency, said: wait, isn't this just an unregulated security? Nobody else was asking that. So kudos to her for recognizing that it didn't make sense.
Christopher Chabris: Yeah, one of the habits we talk about in the book, which Dan is referring to, is what we call efficiency: we like to make decisions quickly, we don't like to think too much, we're busy. So slowing down and asking questions, sometimes out loud, but even just to yourself, or with the data in front of you, is a really critical step. Taylor Swift asked the right question and didn't get caught up in one of these crypto scams, which, by the way, she could have been on the hook for: people who have endorsed cryptocurrency scams have themselves been sued or prosecuted for illegally promoting securities. You can't just go out and promote bogus investments. Some people literally did ask Bernie Madoff questions, and when he wouldn't answer them, they didn't invest. Others maybe just asked the question more generally. Sometimes, just by surfacing a question, you can realize that something's wrong.
Daniel Simons: Just asking yourself, is that really true, is often enough to get you thinking about whether it makes sense. If you see something shared on social media and it seems perfectly in keeping with what you hoped to see, just asking yourself, is that really true, and thinking about it for a minute, will sometimes make you realize that it can't be right.
Sucheta Kamath: So you have a lot of tips and suggestions. As we think about the future, I wrote down some big ideas that spoke to me, and I wanted to review them; I'm sure you have ideas of your own that you want the listeners and readers to take away. One thing you talked about, which we touched upon, was the blunder check: am I making a simple mistake? Can you tell us how we can apply that to our everyday life?
Christopher Chabris: The analogy comes from chess, and probably other games where...
Sucheta Kamath: I saw that coming, Chris.
Christopher Chabris: Well, exactly. Often, when you're playing a serious chess game in a tournament, it's a three- or four-hour event, and you do a lot of thinking on each move; maybe you think for four minutes per move, something like that. You can get so caught up in what you were thinking that you might have forgotten to consider some possibility for your opponent. So many coaches tell children especially, but really any chess player: before you actually touch a piece and move it, before you make a real decision that affects the future, consider whether there are any obvious flaws in what you're doing that you haven't noticed. We think that applies more widely to lots of decisions. It's not always obvious how to find those flaws; you need to know something about what you're doing. One way people often formalize this process is with checklists. The checklists that pilots go through before they fly a plane are, in a sense, a way of preventing blunders: all those switches are supposed to be in particular positions, and so on, and missing any of them would be like making a blunder in a chess game, overlooking something obvious. So checklists can of course help, but you need to know what you're doing in order to have the right checklist. It's always good to keep in mind that, despite all your efforts, you might have missed something simple.
Sucheta Kamath: Can I say something about that? Chess, by default, is a very critical-thinking game. It also requires a lot of working memory to do something called hypothesizing: if I do this, then this will happen; but what if I do that instead? It requires you to hold all that open-endedness in your working memory, and many people struggle with chess; they cannot play the game, and that's no knock on them. But the habit you're talking about is beautifully cultivated through chess playing, as a very deliberate effort to improve critical-thinking skills. I do think some people just don't have that experience, exposure, or practice with that kind of thought process. Do you agree, or do you think there are substitutes for it?
Christopher Chabris: Well, that's where I got the idea, but there are plenty of other games besides chess you can get the same idea from. People probably don't blunder check enough in whatever game they're playing, so it doesn't have to be chess; I agree with you. Chess just especially focuses you on the idea we talked about earlier: there are multiple paths into the future, and you need to look at them. If you just look at what happened at the end of the game, it can seem obvious why somebody lost, but there were so many choices they had to make, and so many different ways their opponent could have played. That's a good way of thinking in general, but I don't think we need to handle all that complexity in order to reduce our chances of getting scammed a little bit. It is kind of an executive function thing, an inhibition thing: stop before deciding, and check a little bit.
Daniel Simons: An important point here is that, as you were saying, chess requires a lot of executive control and working-memory processes. But you still need to do a blunder check even if you're really, really good at chess; you're still likely to miss something obvious. So it's not something just for people who have trouble with those processes. This applies to everybody: think about the cases where you might have missed something obvious, and what obvious thing you need to look for right then.
Sucheta Kamath: I love that. I was going to mention two more before we end. The second one was the premortem; I thought that was also a wonderful strategy. Can you walk us through it? How do we do that?
Daniel Simons: The idea is, before you actually do something, to think about how things might go off the rails: if this were going to go bad, how might it go bad? What might go wrong? It's a little like thinking like a scammer: if somebody were going to scam me, what would they do? How would they take advantage of me? How would they try to trick me? Just thinking about that a little in advance can often help you head off cases where you might be fooled. It's a really important thing to do for any big decision you're making. If this were going to go terribly wrong, catastrophize a little, keep it under control, but catastrophize a bit: what's the worst possible thing that could happen here? How bad is that? And what could I do to make sure it doesn't happen? Anticipating it makes you think it through. So, say you get that gift card email. You think: okay, should I go buy gift cards? What's the worst thing that could happen? The worst thing is that it's a scam, a fake. In which case, what's the solution? I can head off that worst catastrophe by simply calling the person who asked me to buy these things before doing it. That's not a big-stakes decision, but it could be. Chris, you want to add?
Christopher Chabris: There's an even worse case of that, which is some form of identity theft, where you wind up in a situation where you give them more information, and then even worse things happen. Buying a few hundred dollars of gift cards is a lot of money, but still a limited impact, whereas identity theft can have open-ended impacts, depending on how soon you catch it and so on. The premortem idea, by the way, comes from Gary Klein, and Danny Kahneman has also written about it; I don't want to pretend that we invented it. But I think it's very underutilized; it really should be used more. One way to do it is to actually imagine yourself in the future: you did all this stuff, and you've lost all your money. How could that have happened? That might tell you some of the things you shouldn't be doing.
Daniel Simons: Yeah, how could I get from where I am to the bad outcome, and what are the steps that would have led to it? Yeah.
Sucheta Kamath: And again, to me, that sounds like it requires a little bit of emotional courage to withstand the possibility of unfavorable, unsavory outcomes, but simply engaging in that hypothetical analysis and synthesis can go a long way. So, the last one, which I thought was a wonderful way to end this conversation (and I'm sure there are so many amazing ideas, so I'm sorry to condense them into just a few): don't sweat the small stuff. That was your advice about being receptive to tiny scams. It's like getting a paper cut, is how I saw it.
Daniel Simons: Yeah, you're not going to be able to prevent every scam; you're not going to notice many of them. But let's say you get ripped off at the grocery store because the price on the shelf is slightly different from what rings up on the receipt. If you can afford that loss, it's probably not worth your time to check every single price against every single value on the receipt to make sure it's perfect. Maybe the errors will be systematic, maybe they'll add up and benefit the store as opposed to you, or maybe they're random. But those sorts of errors probably aren't huge, so it's probably not worth most people's time to worry about them. Whereas if you have the money and the interest to buy fine art, it's probably worth all of your time to really thoroughly investigate what you're buying, because the returns to a scammer who gets you to buy a faked painting are gigantic, and the costs to you are gigantic. So you want to think about what's the cost here versus what's the opportunity. If the risk to me is minimal, do I really benefit from being skeptical all the time? If the risk to you is huge, yeah, you probably do.
Sucheta Kamath: It's a nice way to maintain some sanity, or some sense of humanity in ourselves. We cannot be superhumans, preventing everything.
Daniel Simons: And you wouldn't want to be, right? What a horrible way to go through life, constantly second-guessing everybody you interact with. Most of the time we're not getting scammed; most of the time, people are honest brokers. So you want to treat the world that way. It's just that when you're at big risk, you want to pay attention to that.
Sucheta Kamath: Well, thank you for being so wonderfully honest about these difficult topics, and for a really amazing book. Listeners, this launched just yesterday, so get your copy; we'll be adding a link in our show notes. So before I let you both go, I have two questions. One, have you ever been scammed, and are you willing to share a personal story? And two, what is your favorite recommendation, in addition to your two books?
Christopher Chabris: There are some examples in the book of ways that we were scammed. There's one story we tell, and maybe this will cover both of us, because the same thing happened to me and Dan. A guy emailed us separately, totally separately, not together or anything like that, wanting to engage in a research project with us. And there were two different research projects. Some discussions were had and some emails were exchanged, but in neither case did it actually go anywhere.
Daniel Simons: I talked on the phone with him a couple of times, I think.
Sucheta Kamath: That was surprising to me, that he contacted both of you.
Christopher Chabris: Yeah. Well, he wasn't some random person from nowhere. He was a guy who was somewhat in our field; he was involved in cognitive science, in cognitive psychology.
Daniel Simons: Generally. He had inserted himself into the field; I'm not sure he actually had done anything in it.
Christopher Chabris: But he wasn't just someone trying to prey on academics or whatever. Well, okay, it turned out he was a little bit of a fraudster. We never actually did any projects with him; I talked to him on the phone too, as I recall. But years and years later, we found out that he had actually been involved in many legal cases, mostly in California, I believe, where people were suing him for thousands of dollars here and there because of deals they had made that he didn't come through on, and so on. So it seemed like he was the kind of guy who was a hustler, a bit of a scammer, in addition to the other stuff. At one point I regretted that I was never able to do that project, because it would have been an interesting project. But now I'm happy that I never talked to the guy again; it might have wound up with me having sent him money for something and then having to sue him to try to get it back, or something like that.
Daniel Simons: Yeah, it was kind of a "phew, that was close." Yeah.
Sucheta Kamath: It was interesting that he had done it multiple times, to your earlier point. It's not a one-time show; it's a repeat. Yep.
Daniel Simons: Right. We learned about his whole history as a scammer much later, actually, after he had died, under somewhat weird circumstances.
Christopher Chabris: While writing the book, we went back and looked all this up. So it was sort of a revelation to us, not that long ago.
Sucheta Kamath: Oh, my goodness. Well, those were very interesting stories. Thank you. So as we end, we always ask our guests, who are prolific researchers, writers, and authors: do you have a book that you found influential or interesting that you think our audience might enjoy?
Daniel Simons: So, influential and interesting. For me, the book I found most influential in my career was Ulric Neisser's Cognition and Reality. I don't know if it's even still in print, but it was from the late 1970s. It was the book he wrote after he had made a transition from traditional cognitive psychology, having written one of the first cognitive psychology textbooks in the sixties, to a more ecological approach to memory and perception research in the early seventies. And this was his rethinking of how cognition works. For me, it was a hugely influential book, and it's still one I think about all the time. His work, probably more than anybody else's, has had a huge influence on my career. So yeah, that would be the one I would mention as an academic book.
Sucheta Kamath: Thank you. That's great. How about you, Chris?
Christopher Chabris: I'm going to go much more recent than the seventies. And actually, maybe I'll give you two, if that's okay. One of them is one of my favorite books in this behavioral science domain. It's called Everything Is Obvious: Once You Know the Answer, and it's by Duncan Watts, who's a sociologist and computational social scientist at Penn. It's about all of the illusions of cause and effect, which we've talked about to some extent in this discussion; it's a theme that runs through a lot of our writing. But I think Duncan has written a wonderful book about it, with lots of interesting examples and ideas. The other one is a very recent book by Annie Duke called Quit. It's about the virtues of giving up, at least in the right circumstances, and about the various illusions and biases that push us towards keeping on doing what we're doing, as opposed to stopping and maybe doing something else. I think it's a wonderful example of applying ideas from behavioral science in a clever way to what at first seems like a negative behavior, and showing how it can often be a positive one.
Sucheta Kamath: Amazing. Well, thank you for these recommendations. All right, that's all the time we have today. Thank you again, Dan and Chris, for being my guests today. As you can see, these are very important conversations we're having with knowledgeable, incredibly qualified, and passionate experts, whose unique perspectives, not only on their areas of expertise but also on their relationship to and impact on executive function, inform us so we can lead better lives. That's my hope. And of course, if you love what you're hearing, please share this episode with your friends and colleagues, and please get this book ASAP and share it as well. You will only be lying to yourself if you do not admit that you have been a victim of fraud. I'm just kidding. Lastly, I look forward to seeing you all once again, right here on the next Full PreFrontal podcast. Thank you again, Dan, and thank you, Chris, for joining me today.
Daniel Simons: Thanks. It was great chatting.