Unconscious Influence by Stephen Duneier
Over the years, I must have given well over a hundred talks touching on the subject of cognitive bias in some way. The residual effect is that anytime something comes up in the news where someone appears to have fallen prey to it, but doesn’t realize it, an article gets forwarded to me with the caption, “thought you’d enjoy this one.” What makes this topic so fascinating is that tiny little section of the previous sentence: “but doesn’t realize it.” In a conversation with a subscriber last week, he began by saying, “I understand most of what you write about. For me, it’s intuitive.”
This belief that “it doesn’t apply to me” is a fundamental aspect of the phenomenon. It’s right there in the very definition of cognitive bias: an unconscious influence that produces systematic errors in judgment.
Here’s an example of cognitive bias at work. On October 25th, a conference organizer sent me this link along with the following commentary: “gotta love his quote! => ‘The model predicted a Trump win in February and nothing has changed since then. Whatever happens in the real world doesn’t affect the model,’ he said.” (Emphasis is the sender’s.)
At the time, a Clinton win was all but guaranteed. To the person who sent it to me, it seemed ludicrous that the prognosticator mentioned in the article wasn’t updating his model as new information was being gathered. He shared it with me because he felt it served as a prime example of someone exhibiting flawed decision making. It was not.
The question is, for what reason should the model have been updated? This same model had been covered by the press ad nauseam when Trump first won the nomination. At that time, it was crystal clear that the model had accurately predicted the winner in all but one presidential election going back to 1912, and it had done so without gathering additional data after the primaries. If there were a flaw in ignoring post-primary data, it should have been pointed out back then. It wasn’t. However, because his model had predicted a Trump victory, and the latest polls had swung so far in the opposite direction, his model suddenly appeared deficient, and his defense of it was categorized as absurd.
The reality is, changing his model to reflect the latest polls or anything else, simply because someone, even everyone, deems it worthy of inclusion, would be a mistake. However, this is what is done all the time in this industry. We develop a process and then adjust, amend and contort it in the moment to accommodate the latest unemployment data, the most recent central banker quote selected by news sources, and whatever else happens to make its way into our line of sight. Without even recognizing it, we undermine the process, converting it into an inconsistent mess, all while rationalizing our irrational behavior. That is what happened to the conference organizer. He was swayed by the latest polls and mocked the prognosticator who had done his research and developed a model backed by data-driven evidence. Without even realizing it, he was exhibiting cognitive bias and making a mistake; most importantly, he had no idea it was happening.
My response at the time was, “I have some real sympathy for his argument. His model is based on certain factors, what he defines as ‘signal’. Everything after the primaries is effectively ‘noise’, according to his model. There is no reason for him to make adjustments, regardless of how wrong it may feel in the moment. The historical returns on the model, which is designed to ignore this updated data, are very good. If the model doesn't make sense now, then it didn't make sense when everyone was first quoting it. If they didn't call BS on it then, they shouldn't be calling BS on it now. Those who are, are the ones exhibiting cognitive bias.”
Last week, students in my Decision Analysis class were reading about how our brains will often replace a question that is difficult to answer with one that is much simpler, without us even realizing it (see Prosecutor’s Fallacy or Are You Asking the Right Questions for examples). During one of my lectures, I presented the proper methodology for assessing the value of a weather app to a particular user of the app. Two students challenged my conclusion, so I offered extra credit if they could prove their argument mathematically. Although their proofs were wrong, their responses brought a smile to my face. Here were two of the brightest students in the class, who clearly understand the math and the concepts that we cover, yet they had just exhibited the exact cognitive bias they were reading about at that very moment. They had employed heuristics which triggered a bias they were well aware of, but they had no idea it was happening. That is the nature of cognitive bias. It doesn’t matter if you’ve studied it. It doesn’t matter that you understand the correct way to calculate the math or conduct an analysis. It doesn’t matter if it is intuitive to you. What matters is whether you correctly apply what you understand to be the right approach at the moment a decision is required.
On LinkedIn, I recently posted the Monty Hall Problem that I had shared with subscribers last year. The post reads: “The host begins by unveiling 3 numbered doors & explains that behind one of the doors is a brand new car, while the other 2 each contain a goat. The contestant selects a door and gets to keep whatever is behind it. After the initial selection is made, the host opens 1 of the remaining doors to reveal 1 of the goats. The contestant may then change their selection. Q: Should the contestant switch to the other unopened door? A: Although most people intuitively and confidently answer, ‘it doesn’t make a difference’, the only rational answer is, YES!”
Not surprisingly, it drew a rather heated response. “This ‘analysis’ is total BS. The host knows where the two goats and the car are. No matter which door the contestant picks, the host will reveal a goat behind an unpicked door. At this point -- which is when the contestant makes the switch/no switch decision -- one unopened door has a car and one a goat. There is an equal probability of the car being behind either door and the contestant has a 50% chance of winning by staying with his original pick and a 50% chance of winning by changing. There is no advantage (or disadvantage) to changing picks. This isn't foolish intuition, it's REAL decision science. You should refund your fees if you are charging clients for the nonsense presented here.”
The response came from a Harvard MBA with more than 30 years of experience in finance. In other words, a smart, highly educated, experienced and successful person had not only made a mistake, he was absolutely certain his logic was correct. His intuition was so powerful that not only did he not hesitate to correct me, he chose to chastise me publicly as well.
I shared his response with my students because it serves as powerful evidence of the “unconscious” aspect of cognitive bias. We just don’t know it’s happening to us. My students seemed more excited to read my response, though. “Oh I bet you ripped him to shreds, didn’t you?”, one student grunted out loud. I did not. The reason is, every one of us makes mistakes like this all the time, and we are equally unaware and equally confident in our intuition. (Recall one of my most humbling moments here.)
“:) You're not the first to vehemently oppose the solution. When the problem and solution were first published (back in 1975 and not by me) it created quite the stir among some of the best minds in mathematics. In the end, those who initially argued your conclusion came around when they realized the flaw in their argument. I understand your frustration. It's the nature of dealing with cognitive bias. It's almost impossible to see it in ourselves, particularly when our intuition is so incredibly convinced it is correct, which is clearly the case for you here. And understandably so.”
To which he responded: “I looked into this further and I am wrong. It was NOT my intuition; it was an error in my analysis. What I missed is that there is POSSIBLE new information in the door selected by the host. When the host selects a goat door to reveal, it may be the case that he must select that door because the car is behind the other door. Switching improves your chances of being right because you leverage this possible new information. Not to shoot the messenger, but I think if you explained the problem the way I just did it would persuade more non-STEM types that you are correct!”
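One way to spell out the “possible new information” he is describing is a quick enumeration, which I offer here only as a worked illustration. Say the contestant picks door 1. With probability 1/3 the car is behind door 1; the host may open either of the other doors, and switching loses. With probability 1/3 the car is behind door 2; the host is forced to open door 3, and switching wins. With probability 1/3 the car is behind door 3; the host is forced to open door 2, and switching wins. Staying wins 1/3 of the time and switching wins 2/3, precisely because in two of the three cases the host’s choice was forced, and therefore informative.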
I give him credit for taking a moment to look into it further (even if it did come after his initial comment). However, he missed the point. It is not my job to properly frame the information for him. In the real world, YOU are responsible for taking in data, commentary and every other form of information, and then processing that information in a way that delivers an accurate representation of the world. Unfortunately, that information is often framed in a manner that is purposely designed to trigger cognitive bias. It is actually meant to trigger an emotional response or to lower your defenses. In the above case, I provided all of the information necessary to properly solve the problem. To blame me for his error is like blaming the newscaster or the analyst for our inability to properly assess the macro environment.
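For readers who would rather test the claim than argue it, the game is also easy to simulate. What follows is a minimal sketch of my own (the function name and parameters are illustrative, not from any published model) that plays the game many times under both strategies, for any number of doors:

import random

def play(num_doors, switch, trials=100_000):
    """Return the empirical win rate for one strategy (stay or switch)."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(num_doors)    # door hiding the car
        pick = random.randrange(num_doors)   # contestant's initial choice
        # The host opens every door except the pick and one other,
        # always revealing goats. If the pick hides the car, the door
        # left closed is a random goat door; otherwise it must be the car.
        if pick == car:
            other = random.choice([d for d in range(num_doors) if d != pick])
        else:
            other = car
        final = other if switch else pick
        wins += (final == car)
    return wins / trials

for n in (3, 100):
    print(f"{n} doors: stay={play(n, False):.3f}, switch={play(n, True):.3f}")

With 3 doors, staying wins about a third of the time and switching about two thirds. Run it with 100 doors and the gap becomes impossible to argue with: staying wins roughly 1% of the time and switching roughly 99%, which is exactly the intuition the next commenter offers.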
Another person chimed in as well: “It can be made clear even to the intuition if you increase the number of doors (and hence the information gained from the host’s choice). So if there are 100 doors and you pick door 23, then the host opens all other doors except 23 and 96, it should be clear intuitively that it is more likely to be behind door 96 than 23.”
He too may be correct, but it’s important to understand that what appeals to one person’s intuition doesn’t necessarily appeal to everyone’s. The better we understand how we intuitively approach problems based on our history and education, and the more we challenge that intuition to see beyond what automatically comes to mind, the further we can expand the “box” from within which we make all decisions. My goal in sharing the problem was not to teach anyone how to solve this one problem, but to show that we are all vulnerable to decision-making mistakes, even smart, educated people, and even on relatively simple problems. To improve our decision-making skills, this must not only be understood conceptually, but experienced first hand. Only then can a decision maker experience a leap forward in the evolution of their decision making. In other words, only after you realize that no matter how many books you read on cognitive bias or how many biases and heuristics you can name off the top of your head, you will still be vulnerable to these types of mistakes, can real progress be achieved.
About the Author
For nearly thirty years, Stephen Duneier has applied cognitive science to investment and business management. The result has been the turnaround of numerous institutional trading businesses, career-best returns for experienced portfolio managers who have adopted his methods, the development of a $1.25 billion hedge fund, and 20.3% average annualized returns as a global macro portfolio manager.
Mr. Duneier teaches graduate courses on Decision Analysis and Behavioral Investing in the College of Engineering at the University of California. His book, AlphaBrain, is due to be published in early 2017 (Wiley & Sons).
Through Bija Advisors' coaching, workshops and publications, he helps the world's most successful and experienced investment managers improve performance by applying proven, proprietary decision-making methods to their own processes.
Stephen Duneier was formerly Global Head of Currency Option Trading at Bank of America, Managing Director in charge of Emerging Markets at AIG International and founding partner of the award-winning hedge funds Grant Capital Partners and Bija Capital Management. As a speaker, Stephen has delivered informative and inspirational talks to audiences around the world for more than 20 years on topics including global macroeconomic themes, how cognitive science can improve performance and the keys to living a more deliberate life. Each is delivered via highly entertaining stories that inevitably lead to further conversation, and ultimately, better results.
His artwork has been featured in international publications and on television programs around the world, is represented by the renowned gallery Sullivan Goss, and has earned him more than 50,000 followers across social media. As Commissioner of the League of Professional Educators, Duneier is using cognitive science to alter the landscape of American K-12 education. He received his master's degree in finance and economics from New York University's Stern School of Business.
Bija Advisors LLC
In publishing research, Bija Advisors LLC is not soliciting any action based upon it. Bija Advisors LLC’s publications contain material based upon publicly available information, obtained from sources that we consider reliable. However, Bija Advisors LLC does not represent that it is accurate and it should not be relied on as such. Opinions expressed are current opinions as of the date appearing on Bija Advisors LLC’s publications only. All forecasts and statements about the future, even if presented as fact, should be treated as judgments, and neither Bija Advisors LLC nor its partners can be held responsible for any failure of those judgments to prove accurate. It should be assumed that, from time to time, Bija Advisors LLC and its partners will hold investments in securities and other positions, in equity, bond, currency and commodities markets, from which they will benefit if the forecasts and judgments about the future presented in this document do prove to be accurate. Bija Advisors LLC is not liable for any loss or damage resulting from the use of its product.
Performance: Enhanced
Learn how Bija's proven, proprietary approach to decision making helps the world's top institutional investors generate better results.