
The fallibility of human memory

I don't often talk about eyewitness testimony, but that's not for lack of research. It's a big field, with a lot of work being done. When I say I don't follow it because I regard the main finding as a done deal (eyewitness testimony is unreliable), that's not meant to denigrate the research. There is, clearly, a great deal of value in working out the exact parameters of human failures, and in working out how we can improve eyewitness testimony. I just arbitrarily decided to ignore this area of research until they'd sorted it all out! (I can't follow everything; I'm swamped as it is!)

Nevertheless, I do want to remark on a recent report in The Scientist, to the effect that a New Jersey court has decreed that all juries must be informed of the unreliability of eyewitness testimony. I want to raise a hearty cheer. I regard it as practically criminal that eyewitness testimony is given the weight it is. I think everyone should be taught, from a young age, that memory is deeply unreliable. And, in particular, that the certainty you hold in any specific memory, and the vividness it has, are far weaker proofs of its accuracy than we tend to believe.

You may think a belief in the fallibility of memory would create an unpleasant state of uncertainty, but I believe it would bring about a useful decline in many individuals' dogmatic certainty, and encourage more empathy with other, fallible human beings.

You may ask how my emphasis on the fallibility of human memory squares with my frequent comments on the danger of believing that you have a bad memory, or that your memory will inevitably get worse as you age. But believing in human fallibility is very different from believing you personally have a bad or deteriorating memory. You need to find a balance between these beliefs, and part of achieving that lies in understanding how memory works, and which aspects are more reliable and which less so. I hope my site helps you with that!

Normal is a label too

We all like simple solutions. However much we may believe we are ‘above’ black-and-white dichotomies, and that of course we understand every situation is complex, we nevertheless have a brain that can hold only a very few things in mind at once. So it's unsurprising that we are drawn to solutions that can be summed up simply, that can fit comfortably within the limitations of working memory.

Here’s something I read about in Scientific American the other day: Huntington’s disease — which is a terrible disease that eats away at your brain, causing both physical and cognitive disabilities that continue to deteriorate until the sufferer dies an untimely death — is linked to an excess of a brain chemical (the neurotransmitter glutamate) that is in fact vital for learning and memory. Intriguingly, a recent study has found that those with the genetic mutation for this disease, but who were as yet asymptomatic, were significantly quicker to learn than those without the mutation. Indeed, those with the greatest number of copies of the mutation were the fastest to learn.

This may not simply be a matter of disease progression: an earlier study found that Huntington’s patients did better than healthy controls on one cognitive task (detecting whether a tone was long or short). It may be, the researchers suggested, that it is simplistic to talk of a decline in cognitive function in Huntington’s; rather, some functions might be enhanced while others are impaired.

Huntington’s Disease is hardly alone in this. We often talk about ‘normal’ memory aging, and there’s no denying the concept of normal is a useful one — in certain contexts. But not, perhaps, in all those contexts in which it is used.

Psychology, as I’ve mentioned before, has historically been a search for what is ‘normal’ in human behavior. What is not normal is deemed ‘abnormal’ — a nice black-and-white dichotomy. Of course this is simplistic, but it gives us a place to stand. Now that psychology is a mature discipline, it can look around and explore the variability in human behavior. However, only very recently have we begun to realize that the search for normal is merely a starting point in the question of what it is to be human, and that it has become a straitjacket.

As an example, let’s look briefly at something discussed in a provocative article about autism that appeared in the journal Nature. The writer of the article, Dr. Laurent Mottron, leads a research team that includes several autistic individuals. As a consequence, he has grown to appreciate the strengths that such individuals can bring to research.

The main thrust of his argument is that autism is not simply a “suite of negative characteristics”. There are advantages in some autistic characteristics. But because autism is defined as a ‘disorder’, researchers and clinicians systematically interpret behaviors in terms of impairment and abnormalities. More useful would be to examine each behavior on its own merits, and consider whether the behavior might be adaptive in certain contexts.

Mottron says that although the rate of intellectual disability among autistics is routinely estimated at about 75%, only 10% of autistics have an accompanying neurological disease that affects intelligence, and if researchers used only those tests that require no verbal explanation, the level of intellectual disability would be seen to be much lower. An interesting comparison: “In measuring the intelligence of a person with a hearing impairment, we wouldn't hesitate to eliminate components of the test that can't be explained using sign language; why shouldn't we do the same for autistics?”

Mottron’s research team has coined a telling word: normocentrism, meaning the preconception that if you do or are something, it is normal, but if autistics do or have it, it is abnormal. I think this term could be usefully applied more widely. Similarly, the rise of the concept of ‘neurodiversity’ in the autistic community (whereby a ‘normal’ person is ‘neurotypical’ and someone with an autism spectrum disorder is ‘neurodiverse’) could also be applied more widely. Rather than distinguishing between two types, we should see human diversity as a spectrum, where ‘neurotypical’ covers a wide middle range, and other ‘disorders’, such as autism, dyslexia, and ADHD, similarly occupy ranges along the spectrum.

Because this is the point, this is what research has been revealing over the past few years: there is no such ‘thing’ (as in a single thing) as autism, as dyslexia, as ADHD, as Alzheimer’s. They all have multiple variants — variable characteristics, variable causes — because they reflect subtly different underlying differences in the brain.

Which means we shouldn’t assume that because something has a label (“Alzheimer’s”), there is only one path (and, relatedly, one set of risk factors). For example, we ‘know’ that high blood pressure is bad, and certainly it’s an important risk factor for cardio- and cerebro-vascular disorders (including Alzheimer’s). And yet, according to a recent study, this is not the complete story. For the very elderly (say, 85+), high blood pressure may be a sign of better health. This isn’t just because risk factors are worked out on the basis of group studies while you are an individual (there is always individual variation). It’s also because almost everything has trade-offs. Like the Huntington’s disease gene variant that improves learning. Like the neurodiverse who have exceptional visual skills.

Similarly, just because someone has put a label on you (“dyslexic”), you shouldn’t assume that means that everything you know about dyslexia applies to you. Nor should you assume that there are no positives about your particular brain.

In the same way, you shouldn’t assume that being a ‘genius’, or having a ‘photographic memory’, is all positive. Everything is a trade-off (don’t mistake me, I’m not suggesting that there is something positive about Alzheimer’s! but it may be that humans are vulnerable to Alzheimer’s because of our superior brains, and because we live so long).

The message is, don’t simply fall prey to a label. Think about it. If you or someone you care for has been labeled, focus on the individual manifestations, not the label. The label is a guide — treat it as one. But never forget that each manifestation will have its own particular characteristics, some of which may be positive.

And 'normal' is a label, too. Here's an issue that has only recently been recognized in the cognitive research community: our idea of what is 'normal' is largely based on one cultural group. Most cognitive research has been undertaken on American undergraduate students (according to a recent analysis, 96% of research subjects in a sample of hundreds of psychology studies came from Western industrialized countries, and 68% came specifically from the U.S. — of these, 67% were psychology students). In recent years, it has become evident that WEIRD people (those from Western, Educated, Industrialized, Rich, and Democratic societies) respond significantly differently across a whole range of domains compared to non-Western groups — even on something as seemingly basic as a visual illusion. (See Scientific American for a nice discussion of this.)

As I said at the beginning, our preference for simple solutions and simple labels is rooted in our limited working memory capacity. The only real way around this is to build up your knowledge piece by piece, so that the items in working memory are merely the tips of richly elaborated items held in long-term memory. That isn't quick or easy, so there'll be many areas in which you don't want to gather such elaborated knowledge. In the absence of being able to stretch the limits of working memory, it helps to at least be aware of what is limiting your thinking.

In other words, as with memory itself, you need to think about your own goals and interests, and choose the areas you want to pursue. Becoming expert (or at least, a little bit expert!) in some areas shows you how different your thinking is in those areas; you will then be able to properly appreciate the limitations of your thinking in other areas. That’s not a bad thing! As with memory failures of other kinds, it’s a big step just to be aware of your fallibilities. Better that than to be fooled (as some experts are) into thinking that expert thinking in one area means you think equally clearly in other areas.

We are all fallible. We all forget. We all have false memories and believe in them. We all sometimes fall victim to labels. The trick is to realize our fallibility, and choose the occasions and domains in which to overcome it.


Mottron, L. (2011). Changing perceptions: The power of autism. Nature, 479(7371), 33–35.