Subscribe now and watch my free trend following VIDEO.

Ep. 193: Gerd Gigerenzer Interview with Michael Covel on Trend Following Radio

Gerd Gigerenzer

Subscribe to Trend Following Radio on iTunes

My guest today is Gerd Gigerenzer, the director of the Max Planck Institute for Human Development in Berlin and a former professor of psychology at the University of Chicago. Gerd is also the director of the Harding Center for Risk Literacy (read: David Harding, head of the trend following firm Winton Capital).

The topic is heuristics.

In this episode of Trend Following Radio we discuss:

  • Uncertainty
  • Comparing decisions to baseball (gaze heuristic)
  • Complex problems and simple solutions
  • Using price action as a decision making cue
  • Unconscious heuristics
  • The art of knowing what one doesn’t have to know
  • The less is more effect
  • The miracle on the Hudson River a few years ago as a case in point illustrating heuristics
  • The idea of an adaptive toolbox
  • The element of surprise in Gigerenzer’s work
  • The distinction between risk and uncertainty
  • Intuition vs. rationality

In this episode of Trend Following Radio:

  • Why uncertainty and risk are not the same thing
  • How we use heuristics to make decisions
  • Why complex problems don’t always require complex solutions
  • Why heuristics and conscious reasoning are both important
  • “Less is more” – the art of knowing what you don’t need to know
  • How these methods are applicable in investing, management, law, and many other areas
  • What defensive decision making is and why you need to know about it

Jump in!

MICHAEL: Today on the show, I have Gerd Gigerenzer. Gerd is the director at the Max Planck Institute for Human Development in Berlin, and he’s a former professor of psychology at the University of Chicago. Gerd is also the director of the Harding Center for Risk Literacy. That would be the David Harding Center for Risk Literacy.

Our conversation today is about heuristics, and for those of you that trade, for those of you that invest, and for those of you that just want to navigate risk and uncertainty in your life, my conversation with Gerd is intriguing. Frankly, I find it fascinating. I find the topics that he goes into fascinating. His work is the foundation, the philosophical foundation, of trend following success, even if he did not set out for that to be the case. I hope you enjoy this conversation.

Hi Gerd, this is Mike Covel. How are you?

GERD: Yeah, I’m fine, so far. There’s no snow yet.

MICHAEL: Where are you today?

GERD: I’m in Berlin, at the Institute. And you?

MICHAEL: I am right outside Washington, D.C. right now.

GERD: Okay.

MICHAEL: Let me jump right in, Professor. I think your work and what you’re doing – in my world, in my trading world, much of your work is the foundational spine, so to speak, on good trading and the good philosophy behind good trading. I think where I’m going with that is, for example, the big question: how do we all make inferences about the world that we live in with limited time and knowledge? I mean, that gets right at where you start in with your work, doesn’t it?

GERD: Yes, totally right. But it’s not what most economists are talking about, and where the uncertainty is confused with risk, with known risks, and where one tries to model people’s behavior as if they could calculate all the risks. What I’m doing is to try to develop an alternative, or to find tools that actually can deal with uncertainty, and heuristics are some of these tools.

MICHAEL: One of the heuristics that I know you’ve used quite a bit in explaining this to all different types of audiences is the gaze heuristic, and specifically in sports, and specifically I’ve heard you use it in baseball. As a guy who once played baseball for a long time – and I was a baseball catcher – why don’t you explain that, though, from your perspective, and why that’s such a great way to lead into this subject?

GERD: The question is how does an outfielder catch a ball? If you look into explanations of that, you will find that many people think if the problem is complex, we need a complex solution. What would it be? Obviously, that the outfielder somehow calculates the trajectory of the ball and runs to the point where it’s going to go. You can find people like Richard Dawkins, in his Selfish Gene, writing exactly that. He puts the “as if” in there. This is an economics term. So behave as if one would be able to compute and know everything.

I’m interested in how actual people make decisions, here outfielders, and a number of experiments show that they don’t compute trajectories and don’t run to the point that has been computed, but rely on a simple heuristic that actually can do the job in the limited time, and also in a situation where one does not have the information to estimate all the parameters you would need to determine the right trajectory, such as the initial distance, the initial velocity, the air resistance, the direction of the wind, spin and so on.

One of the simplest heuristics, among a number of heuristics that outfielders use, is the gaze heuristic. It works if the ball is already high up in the air, and it’s very simple. It consists of three building blocks. First, fixate your eye on the ball; second, start running; and third, adjust your running speed so that the angle of gaze remains constant. And then you will end up at the point where the ball ends up.

So here is a very different philosophy. There is a complex problem, and what’s needed is a simple solution, and when we look hard enough, we often can find it.
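A minimal sketch of those three building blocks as a tracking rule, in Python; the launch speed, fielder position and idealized tracking are made-up illustrative choices, not anything from the interview. What it demonstrates is Gigerenzer’s point: a fielder who simply keeps the angle of gaze constant ends up where the ball lands, without ever computing a trajectory.

```python
def gaze_heuristic_demo(launch_speed=(18.0, 22.0), fielder_start=90.0,
                        g=9.81, dt=0.01):
    """Sketch of the gaze heuristic (all parameters hypothetical).

    Building blocks, per the interview:
      1. fixate your eye on the ball,
      2. start running,
      3. adjust your running speed so the angle of gaze stays constant.
    No trajectory, wind, or spin is ever estimated. Keeping
    ball_height / (fielder_x - ball_x) constant guarantees that when the
    ball's height reaches zero, fielder_x equals ball_x: he is under it.
    """
    bx, by = 0.0, 0.0
    vx, vy = launch_speed
    fielder_x = fielder_start

    # Let the ball get "high up in the air" (past its apex) before locking
    # in the gaze angle, as the heuristic requires.
    while vy > 0.0:
        bx, by, vy = bx + vx * dt, by + vy * dt, vy - g * dt
    slope = by / (fielder_x - bx)            # tangent of the fixed gaze angle

    # Building block 3: stand wherever keeps the gaze angle at its locked value.
    while by > 0.0:
        bx, by, vy = bx + vx * dt, by + vy * dt, vy - g * dt
        fielder_x = bx + max(by, 0.0) / slope

    return abs(fielder_x - bx)               # distance to the landing point

print(f"miss distance: {gaze_heuristic_demo():.2f} m")   # ~0.00: ball caught
```

The idealized fielder here moves however fast the constraint demands; a real outfielder only approximates this by speeding up and slowing down, which is exactly why the rule is so cheap to execute.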

MICHAEL: In my world, and in some of the books that I’ve written on trading, one of the simple heuristics that people use is they say “How do I make a good investing decision? There’s so much information out there, there’s so many different variables; how do I make a good investing decision?” And many traders have figured out, why don’t we just focus on one piece of information?

For example, the price itself. So literally, if the price is going up, I want to be long with that instrument; if the price is going down, I want to be short with that instrument. So a lot of traders have really come at – whether they did it knowing about your work, or your work is the great foundation for why they’ve been successful – but this simple heuristic of using price action as a decision-making cue has worked extremely well.

GERD: Yeah. In trading, we have the same two philosophies: it’s a complex and difficult problem, and many are looking for complex solutions. You notice that it ranges from the traditional finance model, from Markowitz optimization, to all kinds of computer-based and highly sophisticated calculation procedures. The other alternative is realizing this is not a problem of known risk. Trading is not trading in a casino. You trade in an uncertain world, and in an uncertain world, the optimization models will not necessarily work.

So we need something that’s robust, something simple, and the heuristic you described is a member of a class of heuristics called one-reason decision making. You try to figure out what’s the single most important reason, and then you ignore all the rest. It looks as if that would be irrational, but only if you believe that everything could be calculated. Many studies in social psychology and behavioral economics have tried to show that people just rely on one reason and ignore the rest, and they concluded that this is irrational.

But these studies overlook the important distinction between a world of risk and a world of uncertainty. In a world of risk, ignoring relevant information is irrational, or at least a sign that the problem is not so important. But not in a world of uncertainty – here, as can be shown mathematically, good decisions require ignoring part of the information, and if you try to make a complete pro/con list, then you will likely fail.
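To make one-reason decision making concrete, here is a hypothetical sketch of the price-only rule Covel describes: the single cue is price itself, and fundamentals, forecasts and news are deliberately ignored. It is illustrative only, not anyone’s actual trading system, and the lookback window is an assumption.

```python
def price_action_signal(prices, lookback=50):
    """One-reason heuristic: the only cue is price change over a lookback
    window. Returns +1 (be long) if price has risen, -1 (be short) if it
    has fallen, and 0 if flat or if there is not enough history."""
    if len(prices) <= lookback:
        return 0                  # not enough history: no position
    change = prices[-1] - prices[-1 - lookback]
    return 1 if change > 0 else (-1 if change < 0 else 0)

# usage: a steadily rising price series produces a long signal
print(price_action_signal(list(range(100))))   # -> 1
```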

MICHAEL: As you bring up the word “fail,” I can think of an example that I’ve seen in your work where there was not much time for decision-making, and the pilot had to use various heuristics to make the decisions that saved many lives – that would’ve been the miracle on the Hudson River a few years ago. Why don’t you describe, through your work lens, why that’s such a great example to teach with?

GERD: The heuristics are often used unconsciously, like the gaze heuristic. If you interview a baseball outfielder, you will find out that most of them cannot say how they do what they do so well. So there’s more in our mind than we can describe. The same heuristics can be used deliberately, and the miracle on the Hudson River is a case in point. As you will recall, the plane took off from LaGuardia Airport, and within a few minutes, something totally unexpected happened: a flock of Canadian geese collided with the plane.

Now, the modern engines are built in a way to be able to digest birds, but not Canadian geese; they are too fat. The unlikely event happened that they flew into both engines, and it got very quiet in the plane, and the pilots turned around and had to make an important decision: will we make it to LaGuardia Airport, or will we hit the ground before it? That’s a decision about life and death. How did they do this?

One might conclude it’s a complex problem, so we need to compute the trajectory of this plane. But they didn’t. They used the same heuristic as the baseball outfielder uses, the gaze heuristic, now in a different situation. It means you look through the windshield, in the cockpit, and fixate the tower. And if the tower goes up in your windshield, you will not make it; you will hit the ground before. So here is an example of the same heuristic that many people use unconsciously now being used consciously, and it is more accurate than calculations, and in any case, much faster. So the pilots had time to do other important things.

It’s also a lesson that one can use the study of the heuristics people intuitively use in order to inform experts how to make better decisions. And it’s also an illustration that the common opposition between heuristics and conscious reasoning is wrong. If you look in a famous book by Daniel Kahneman, you will hear a message about two systems. In one of them are the heuristics, and they are unconscious and error-prone. It’s not true. Every heuristic we have studied is used both unconsciously and consciously.

MICHAEL: It’s terribly fascinating. But your work – and I don’t think you have any problem with the controversy of this – you’re coming at these subjects of risk and uncertainty in a way that the establishment is not comfortable with. Many careers have been built off going a different direction than where you’re going. I relate very strongly to your direction, hence the reason I wanted to have you on my program.

GERD: That’s true. My work has caused many controversies. But that’s nothing bad. The science is there to discuss, to debate, to change, to improve. And the distinction between risk and uncertainty that is very fundamental to my research is not one that I came up with. You can find it in the work of the economist Knight in the 1920s; you can find it later. But it has not been taken seriously.

Moreover, most of the intellectual effort has been put into building ways and methods that reduce uncertainty to known risks. Then we can use the probability measurements that we have. Probability is a wonderful instrument, and I rely on it, but it has its place. It’s not the only tool in our toolbox. And thinking that probability theory or Bayesian theory or any other tool is the only tool that can solve all problems would be like mistaking a hammer for an entire toolbox, and then thinking that everything in the world is a nail. It isn’t. There are screws, and you need different tools, like screwdrivers.

So the entire idea of an adaptive toolbox, of different strategies that the mind uses, is, I know, something that not many people are – or at least some people are not comfortable with, but it’s the same way our body is built. It has not one super organ, but it has many, and there’s a reason for that, because they work better.

MICHAEL: Let me slide into gut feelings. Because if I was to ask you, are you intuitive or rational, I think I know where you’re going to go, but how do you answer that when someone says “Gerd, are you an intuitive man or a rational man?”

GERD: It’s not an “or.” That’s an error. It’s not an opposition, although you can read this again and again. We need both. We need our brains, we need our guts. More precisely, we need deliberate thinking, but also sometimes we need to trust our intuition. The only question is when. It’s not a question of whether intuition is superior to deliberate thinking, or deliberate thinking is superior to intuition, as many of my dear colleagues believe. No, that’s not the point.

You can show that good expertise is almost impossible without good intuitions. A composer needs intuition to compose. He or she cannot calculate the piece. A chicken sexer needs intuition – do you know what chicken sexing is? Chicken sexing is the art of finding out whether a one-day-old chicken is male or female. If you ask a chicken sexer how they do what they do, they cannot tell. It’s intuitive. But nevertheless, they can do this. And then there are other problems where it’s better to calculate, to do explicit pro and con lists.

And the unlucky attitude in much of social science is to put the one against the other and look down at one of these. Intuition is based, according to my own research, often on simple heuristics. Why? Because intuition mostly has to do with real world problems that are characterized by uncertainty, not by known risk. If you play roulette in the casino, you can calculate how much you will lose in the long run. You don’t need any intuition. But if you want to find out whom to trust, whom to marry, what job to take, what to do with the rest of your life, you can’t calculate that.

Only parts can be calculated. For the rest, we rely on what we usually call experience. But it’s an experience that is not in language, that we cannot express, and this is why many people are suspicious of it.

MICHAEL: One of the things that I think is really interesting about your work is that you are tackling, at least in a way that people can understand and can find useful when it happens, the element of surprise. And of course, you can’t necessarily prepare for a surprise; it’s a surprise. But why don’t you talk about surprise in your work and how you feel about it, what you’ve learned, and what’s useful for the audience to think about.

GERD: The moment one tries to reduce all forms of uncertainty to known risk, surprise is out of the question, because nothing can happen. Nothing new, at least. For instance, the financial crisis we recently had is an example of how the standard models for estimating risk overlooked every crisis and prevented none, and we were surprised that things happened that should not have happened so easily. So dealing with uncertainty, and devising and testing tools for uncertainty, is also dealing with surprise. Also, the world needs flexible methods and flexible heuristics in order to adapt to new situations that are quite unexpected.

MICHAEL: When I was preparing for our conversation today, I was thinking one of the great lines that I’ve seen in your work is “the art of knowing what one doesn’t have to know.” Now, that’s slightly going back and talking about some of the issues we talked about in the beginning of this conversation, but the idea of knowing what one doesn’t have to know, the “less is more,” it’s a fairly – I think if most people think about it, it’s simple, it’s intuitive, this makes sense. It’s maybe common sense.

But as a society, we seem in many, many elements of our society – not just my world, for example, trading, but many elements of society – to have got to this point where complexity and more and more data have become overwhelming. I think people – for example, we have all these devices. We have the cell phones, we have the computers. The information overload never stops. And I really think a large part of the population thinks that all this extra information is helping, but if you just stop for a second and pause, you’ve kind of got to ask yourself, “how does this help? It’s just distracting me.”

GERD: Yeah. This is part of the belief that we also talked about, that complex problems need complex solutions. It’s also a version of big data to hope that you can find your needle in the haystack just by having more. Not by knowing anything, but just having more computation. In an uncertain world, this is not correct, because you need to have sufficiently simple solutions in order to make better decisions.

Here is a very simple illustration of this, what we call the “less is more” effect. A “less is more” effect is a situation where, beyond a certain amount of knowledge, your performance deteriorates, at least from some point on. Dan Goldstein and I have studied how people answer simple trivia questions, such as “Which city has more inhabitants, Milwaukee or San Diego?” No, Milwaukee and Detroit, to make it simple. About 60% of the Americans that we tested got the right answer, Detroit.

If you ask Germans, what we found is that 90% of the Germans got the answer right, not because they knew more, but because they knew less. Most of the Germans had not even heard of Milwaukee, only of Detroit, and they relied on a simple heuristic that we call the recognition heuristic. You’ve heard of Detroit, not of Milwaukee, so it’s probably Detroit. The Americans could not rely on this heuristic; they have heard of both, and they need to rely on the facts, on recall.

Here is a situation where less is more, where we can show and prove that a sufficient degree of ignorance can actually help. The art of knowing what you don’t need to know can be a conscious version of these types of heuristics, which are usually used unconsciously, and realizing that in an uncertain world, the attempt to calculate everything is an illusion of certainty, and it will probably lead to failure.

Good decision-making in an uncertain world needs to find a balance between ignoring and knowing something, and this balance can be mathematically described by the so-called bias-variance dilemma, which I’m not going to go into here, and which shows us that when we need to make estimations, we should not try to fit every data point, but we have to ignore something and be a little bit biased in order to make better decisions.

So in our work, bias has a positive meaning, and not the meaning that it has in much of the work in social psychology, where every deviation from a so-called normative model is called a bias, and people are blamed for it. Intuition is probably more intelligent than this kind of argument by some social psychologists.
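For reference, the bias-variance trade-off Gigerenzer alludes to has a standard textbook form for squared error (stated here from general statistics, not from the interview): with data y = f(x) + ε, noise variance σ², and an estimator f̂ fit to a sample,

$$
\mathbb{E}\big[(y - \hat{f}(x))^2\big]
= \underbrace{\big(\mathbb{E}[\hat{f}(x)] - f(x)\big)^2}_{\text{bias}^2}
+ \underbrace{\mathbb{E}\big[(\hat{f}(x) - \mathbb{E}[\hat{f}(x)])^2\big]}_{\text{variance}}
+ \underbrace{\sigma^2}_{\text{irreducible noise}}.
$$

A simple, slightly “biased” rule can beat a flexible one overall because its variance term is much smaller, which is the mathematical sense in which ignoring some of the information helps under uncertainty.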

MICHAEL: Gerd, you brought up something here a second ago which I thought could lead to another example. I have two quick questions, and then we’ll wrap up. But the idea of recognition, and I think in your work – I can’t remember if it’s an experiment that you did or something that you covered, but the idea of people selecting stocks for investing purposes and the recognition. Why don’t you go into that example? I think that was really interesting.

GERD: That’s actually a study I did with some of my colleagues a long time ago. You’ll find it in the book Simple Heuristics That Make Us Smart. We wanted to test how good a very simple heuristic is that can only be used by semi-ignorant people when picking stocks. In order to use this recognition heuristic, you need semi-ignorant people, so we went into downtown Chicago, asking pedestrians which of a long list of stocks they recognized by name, nothing more. Then we did the same thing in downtown Munich.

Then we had pedestrians and we had business students, and then we built portfolios of, say, the 10 most recognized stocks, and as a control portfolio, the least recognized. Then we waited half a year; that was the criterion. Then we looked at how much money they made compared to randomly picked stocks, to a certain number of well-known blue chip firms, to some experts, and all kinds of criteria.

The study showed that the recognition heuristic portfolios, based on the semi-ignorance of pedestrians, made the most money. We replicated the study a few times in contests that have been defined by stock journals and others, and found similar results. Now, that will not always replicate, but I would venture that this simple heuristic does at least as well as well-known professional stock pickers, the Dow Jones, or other indexes.

That’s an illustration that in a highly complex and uncertain world, you can actually do quite well by simple heuristics – in this case, interestingly, heuristics that need semi-ignorant people. Not totally ignorant, because they need to recognize about half of the names – so it should not be read that the less you know, the better. No. A “less is more” effect is defined in a different way. There’s usually some range where more knowledge helps, but then there will be a tipping point, where more knowledge will lead you astray.
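Here is a minimal sketch of the recognition-based portfolio rule as described above, with made-up tickers and survey counts; it illustrates the idea and is not the original study’s code or data.

```python
def recognition_portfolio(recognition_counts, n=10):
    """Build an equally weighted portfolio of the n most-recognized names.
    The only input is how many semi-ignorant people recognized each company;
    earnings, forecasts, and price history are all ignored."""
    most_recognized = sorted(recognition_counts, key=recognition_counts.get,
                             reverse=True)[:n]
    weight = 1.0 / len(most_recognized)
    return {ticker: weight for ticker in most_recognized}

# usage with hypothetical pedestrian survey counts
survey = {"AAA": 91, "BBB": 78, "CCC": 12, "DDD": 5, "EEE": 64}
print(recognition_portfolio(survey, n=3))   # AAA, BBB, EEE at 1/3 each
```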

MICHAEL: I think there’s something – it’s almost a pejorative in some ways, from my perspective, when I hear the word “simple” associated with your work, because for you to come to this position and draw some of the inferences and conclusions in your work with your associates, you’ve had to explore all of the other theories and whatnot that have come before, and then come out on the other side with a viable perspective that makes sense.

I want to lead to a last example, a last question: a few years ago, I had the chance to spend an afternoon with Harry Markowitz, whose name is obviously quite familiar to most people in the finance world from their college textbooks. But why don’t you go ahead just briefly and maybe describe for the audience Markowitz’s findings, and then maybe add some critique from your perspective on perhaps ways that he might’ve got it wrong?

GERD: First, Markowitz didn’t get it wrong. He developed an optimization model called the mean-variance portfolio that works if the assumptions of his model are in place. But the claim that the Markowitz model would work in the real world of finance is a different one. If someone makes this claim, then it’s likely that the person gets it wrong.

Interestingly, when Harry Markowitz made his own investments for his retirement, did he use his Nobel Prize-winning optimization method, as we might think? No, he did not. He relied on a simple heuristic: divide your money equally. So if you have two options, 50/50. If you have three, a third, a third, a third, and so on – 1/N. There is not much computation involved.

The question now is, how good is 1/N, a simple heuristic with no computation and no free parameters, relative to the Markowitz optimization model? There are a number of studies. What most of the studies show is that in the real world of investment, 1/N typically leads to better returns than Markowitz optimization – typically, not always. And the real question is, can we identify the situations where the simple heuristic does better, and the situations where the Markowitz model would do better?

To the best of our knowledge today, these are the situations: first, the higher the uncertainty, the better for the heuristic. Second, the larger the number N of assets that you have, the better for the heuristic. If the number is small, that is where the Markowitz model profits, because it doesn’t have to estimate so many parameters. And finally, third, the larger the sample size, the better for the Markowitz model.

So then you can ask, given the behavior of the stock market, and with N = 50 assets, for instance, how many years of stock data would you need before the Markowitz model finally does better? There are some simulations out there that have tried to answer this question, and the answer is roughly 500 years. So that means in the year 2500, we can start to distrust our intuitions, like 1/N, and do the computations, provided the same stocks are still around in the stock market in the first place.
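And the 1/N heuristic itself, with a simple rebalancing step back to equal weights; the example holdings and the idea of periodic rebalancing are illustrative assumptions, not details from the interview. Note there is nothing to estimate from data, which is exactly why the rule is robust when uncertainty is high and samples are small.

```python
def one_over_n_weights(assets):
    """1/N: divide money equally over the N assets; nothing is estimated."""
    return {a: 1.0 / len(assets) for a in assets}

def rebalancing_trades(holdings_value):
    """Trades (positive = buy, negative = sell) that restore equal weights."""
    target = sum(holdings_value.values()) / len(holdings_value)
    return {a: target - v for a, v in holdings_value.items()}

print(one_over_n_weights(["stocks", "bonds", "real estate"]))   # 1/3 each
print(rebalancing_trades({"A": 120.0, "B": 90.0, "C": 90.0}))   # sell A, buy B and C
```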

MICHAEL: Gerd, once you got into this subject area, looking at risk, looking at uncertainty, doing the experiments, doing the research, this has just become a lifelong project for you. I mean, you are 100% passionate. You just love this, don’t you? I love it; I think it’s awesome. But I mean, you’ve got to just love it.

GERD: I do. And also, I learn so much about the world. For instance, this basic research about how to deal with uncertainty is relevant for so many different fields: it not only includes finance, but it includes management. It also includes the law. I have taught about 50 or so American federal judges in decision-making. It includes healthcare. I have personally taught about 1,000 German doctors how to understand risks. So I probably learn as much about how the world is functioning as these experts learn from my own work.

And there are situations where you need to teach people how to understand known risks, so risk communication, which doctors still don’t learn in their medical education. And then there are other situations where you need to teach them how to use heuristics, and also the difference between risk and uncertainty. And at the end, also teach them there is not just one method out there, but we have to deal with the world; we have to have a toolbox, which we call the adaptive toolbox, in order to make better decisions.

It’s true; it has always been much fun in my life, and I continue to learn new things about different areas in my environment, including the psychological factors of decision-making, such as defensive decision-making. If you go to your doctor’s and you believe that your doctor gives you the best advice, you may be the lucky outlier.

But most doctors in the U.S. practice defensive decision-making; that is, they suggest to you something that’s the second or third best for you, but which protects them from being sued by you. So that’s called defensive decision-making. You treat your clients differently from, say, your relatives. In one study, 93% of all American doctors said, “Yes, I do defensive decision-making. That is, I do not advise the best thing for my client.”

MICHAEL: That’s frightening, isn’t it?

GERD: That’s frightening, but it’s important for everyone to know. And you cannot blame your doctor for doing that, because you are the one who is suing. But you have to understand the system. And there are some simple heuristics, then, that may help you.

For instance, when my mother went blind in one eye, there was an experimental therapy, so I called up the person who had done most of these experimental therapies, explained the case, and asked him, “What do you think? What would you recommend for my mother?” He said, “Just try the therapy.” Then I realized I had asked the wrong question, because he was in a defensive position. I could sue him.

So I asked him again, I said, “Look, I have only one mother. If it would be yours, what would you do?” He said, “I wouldn’t do anything.” His mother wouldn’t sue him, so he’s cautious. This kind of heuristic, don’t ask your doctor what you should do, but ask him or her…

MICHAEL: There’s a classic chapter in Stephen Jay Gould’s book, Full House, where he talks about surviving cancer, and how he put aside doctor advice and looked at the bell curve, so to speak, for his own situation. I think that’s what you’re saying here: take the power into your own hands and don’t just trust.

GERD: Yeah, and also realize that the doctor is in a defensive position, so ask the doctor not what he would recommend, but what he would himself do, or what he would recommend to his own mother, in this case.

So there are a number of psychological factors that are also very important in order to understand decision-making and risk. There will be a book that’s coming out in April next year, so April 2014, called Risk Savvy, where much of this, what I’m just saying, is contained and elaborated. So if you’ve nothing better to do, then just get a copy and have a nice evening.

MICHAEL: Listen, I know people might be saying “Mike, you’ve got a professor of risk and uncertainty on the show,” but I’m going to go find it: I hear there is an interesting video of you on YouTube somewhere, and I believe it’s a commercial, and I believe music is involved. So I’m going to go find it. This is true, right?

GERD: Yes, this is from an earlier career. You asked about my life; my life has not been just about studying risk and uncertainty. I had another career. I was a musician in the entertainment business, and the video you are referring to was the first TV spot for the VW Rabbit, as it was called at the time. And it’s actually an American spot. I had a band at that time, and we won the contest for the TV spot, and I’m at the steering wheel with the banjo, in case you don’t recognize me. That’s a long time ago.

MICHAEL: Well, I’m going to go find it. Hey Gerd, the best place for people to reach out and check into your work, if they want to reach out to you – I believe it’s the Max Planck Institute in Berlin. Is that the best place for people to go?

GERD: On the internet, you mean? Or the website, yes?

MICHAEL: Yeah.

GERD: But also it depends on what you are doing. The Risk Savvy book is my third trade book. There’s another one called Gut Feelings, and before that one called Calculated Risks. But there are also many academic books, which you can easily find just by typing in my name, “Gigerenzer,” and then you will find lots of things. Lots of interesting things to read, to think about, and to challenge you.

What I hope, in the end, is to make the theme of decision-making and risk something that is less abstract and less remote from what real experts and people in daily life actually face than classic decision-making theory, which is about calculating probabilities and utilities. In most cases, we cannot calculate the probabilities, nor do we know the utilities, so we need something else. I hope there is a viable alternative that brings the academic science into more contact with the professions that really have to deal with uncertainty.

MICHAEL: I appreciate your time today, and hopefully when that new book comes out in the spring, I can have you on and we can discuss it.

GERD: Okay, we’ll do this.

MICHAEL: Thank you very much. Have a good day today, Gerd.

GERD: Yeah, it was a great pleasure to talk to you, Mike. Bye bye.

MICHAEL: Thank you. Take care.

Rate and Review Trend Following Radio on iTunes

Join over 12,000 others on the Trend Following mailing list.

Have a question or comment about this episode? Post it below.