It’s useful to forget (some) things.

Over on Twitter, the excellent Tom Chivers (find his book here) is curious about how the human brain works: “Is there a specific name for the cognitive bias that makes us remember weather forecasts the few times they’re wrong… but not all the times when it’s right?”

Good question, and one I embarrassingly can’t think of a satisfactory answer to. This is not my area of behavioural science (I tend to do more typical econ stuff), but I can offer a couple of half-thought-out suggestions.

The first is that we react more strongly to negative information. It’s easier to remember negative shocks, and there’s probably quite a good evolutionary basis for that; if there are several pretty good options in front of you, it’s less important that you choose the best than it is that you don’t choose the really bad one (1).

The second is that weather forecasts are usually correct. Memory is less like a video recorder (everything goes in and stays until it gets taped over) than a notebook: we decide what to remember, and we actively overwrite and forget things. This is useful; we don't need to recall every detail of an event so much as its shape, because the shape is what lets us fit new events against existing information. Part of this process is discarding things which fit well against what we already know; we retain outliers and novel experiences because they attract our attention while we process new information.

So when the weather forecast is correct, as it usually is, and we planned on it being correct, there's not much new there to remember, and there's no negative experience to learn from. When we went out and bought sausages for a barbecue that couldn't go ahead because it was chucking it down, that sticks.

A related process of streamlining the storage of information might also explain an oddity I've observed in my own behaviour. If you've ever sat exams in two related subjects, you may have noticed the odd sensation that you aren't drawing connections between them. While the two may cover some of the same topics, the information is effectively siloed. This is probably an example of functional fixedness: when we think of ways to use an object, we tend to limit our attention to the things we've seen it used for before.

This process seems to be something that happens to conceptual information as well; unless we’re very used to applying a concept generally, there’s a risk that we will store it as field-specific information and won’t have it come to mind when we consider a problem in a different context.

It also sometimes seems to me that apparent hypocrisy in someone's values or beliefs can arise through a similar process: problem one was encountered in context A, problem two in context B, and the reasonable application of whichever processes came to mind first produced answers which don't quite match. The problem is not dishonesty, but poorly integrated thinking.

Update 13/06/2020: Amusingly, one of the concepts I wished to note as an example of field-specific recall was the Gell-Mann amnesia effect. Unfortunately, I forgot it.

Image courtesy of duncan c, used under a Creative Commons licence

  1. It often seems to me that most cognitive biases are things which were adaptive on the savanna but are less so in modern societies. Default bias (the tendency to want to keep things as they are) makes perfect sense if you're hanging around on a nice patch of grassland: you're alive and it's tolerable, while change might be bad. Similarly, the gambler's fallacy (believing a repeated run makes a further repetition less likely) is probably quite a good heuristic in a world where events are generated by natural processes with a long reset time; the toy simulation below illustrates the point.
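
As a toy illustration: the sketch below compares a fair-coin world, where the fallacy fails, with a made-up "resetting" world in which the longer a run goes on, the likelier it is to break. The hazard model and its parameters (base, decay, floor) are invented purely for the demonstration, not drawn from any real data.

```python
import random

def iid_world(n, p=0.5):
    """A fair-coin world: each event is independent of the last."""
    return [random.random() < p for _ in range(n)]

def resetting_world(n, base=0.7, decay=0.15, floor=0.05):
    """A made-up 'natural process with a long reset time': the longer
    the current run, the likelier it is to break (a storm blows itself
    out, a berry patch gets picked clean). Parameters are arbitrary."""
    events, run, state = [], 0, random.random() < 0.5
    for _ in range(n):
        events.append(state)
        run += 1
        if random.random() > max(floor, base - decay * run):
            state, run = not state, 0  # the run breaks; start a fresh one
    return events

def p_repeat_after_run(events, run_len=3):
    """Empirical P(next event matches | the last run_len events matched)."""
    hits = total = 0
    for i in range(run_len, len(events)):
        window = events[i - run_len:i]
        if all(e == window[0] for e in window):
            total += 1
            hits += events[i] == window[0]
    return hits / total

random.seed(0)
print(p_repeat_after_run(iid_world(200_000)))        # ~0.5: betting on "due for a change" gains nothing
print(p_repeat_after_run(resetting_world(200_000)))  # well below 0.5: the fallacy is a decent heuristic here
```

The exact numbers don't matter; the point is just that in a world where runs exhaust themselves, "it's been sunny for days, it's due to rain" is a reasonable bet, while against genuinely independent events it's a bias.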