

Why learning is harder as we get older

Children learn. It’s what they do. And they build themselves over the years from wide-eyed baby to a person who walks and talks and can maybe fix your computer, so it’s no wonder that we have this idea that learning comes so much more easily to them than it does to us. But is it true?

There are two particular areas where children are said to excel: learning language, and learning skills.

Years ago I reported on a 2003 study that challenged the widespread view that young children learn language more easily than anyone older, in regard to vocabulary. Now a new study suggests that the idea doesn’t apply to grammar-learning either.

In the study, 24 Israeli students aged 8, 12, or 21, were given ten daily lessons in a made-up language. A rule in the language — not made explicit to the students — was that verbs were spelled and pronounced differently depending on whether they referred to an animate or inanimate object. In the lessons, the students were asked to listen to a list of correct noun-verb pairs, and then say the correct verb when given further nouns. Two months later, the students were tested on what they remembered.

The young adults were significantly faster at learning and more accurate than the other groups. Moreover, the 8-year-olds never succeeded in transferring the rule to new examples (even when they were given additional training, with the rule made more obvious), while most 12-year-olds and adults scored over 90%, with the adults doing best. It’s also noteworthy (given popular belief) that children's pronunciation was inferior to that of older subjects.

The findings point to the importance of explicit learning, as well as indicating that language skills are not reduced post-puberty, as has been suggested. So why does it seem more difficult for most adults to learn a new language? The problem may lie with interference from the native (or indeed any other) language.

I’ll get back to that. Let’s move on to the related question of procedural memory, or skill learning.

Here’s a study in which we learn something truly fascinating about interference. In the study, 74 young people (aged 9, 12, and 17) were trained on a finger-tapping task, then tested on the two following days. Some of the participants were further tested six weeks later. In a second experiment, 54 similarly-aged people had the same training, but were also given an additional training session two hours later, during which the motor sequence to be learned was the reverse of the one practiced in the initial session. They were then tested, 24 hours later, on the first sequence.

In the first experiment, all age-groups improved steadily during training, in both speed and accuracy, and showed jumps in performance when tested 24 hours later (such jumps are typical in procedural learning and are referred to as ‘off-line gains’; they are assumed to reflect memory consolidation).

These jumps were maintained or improved at 48 hours, and six weeks. The gains were the same for each age-group, but there was a clear difference between the groups in terms of their starting point, with the older ones performing noticeably better initially. Because the effect of practice was the same for all, the performance difference between each age-group was the same at each point in time.

It is worth emphasizing that performance six weeks after training was the same as, and sometimes better than, performance at 48 hours, despite the lack of practice over that time.

So these results challenge the view that children have an advantage over adults in terms of learning skills, and also demonstrate that children improve “off-line” as adults do, indicating that they too have an effective consolidation phase in motor memory.

But the second experiment is the really interesting bit. You would expect, if you learned one sequence and then learned the reverse, that this would interfere badly with your memory for the first sequence. And so it did, for the 17-year-olds. But not for the 9- and 12-year-olds, who both showed a performance gain at 24 hours, as seen in the first experiment.

Moreover, the better the 17-year-olds became at the reverse sequence, the worse their performance on the initial sequence at the 24-hour test (as you’d expect) — but for the 12-year-olds, the better they were on the reverse sequence, the better they did on the first sequence at the 24-hour test.

What does this mean? Why didn’t interference occur in the pre-pubertal children?

It appears that the consolidation occurring in children is different in some way from that occurring in adults.

There are several possibilities. It may be that the consolidation process becomes, post-puberty, more selective. In the situation where there are several different experiences, priority is given to the more recent. It may also be that consolidation simply occurs faster in children.

One mechanism of change may occur through sleep. The structure of sleep changes during puberty, and we don’t yet know whether consolidation occurs during sleep in children as it does in adults. Another is competition for neural resources (transcription- and protein-synthesis-related factors) during consolidation. It has been suggested that this “competitive maintenance” only fully matures at puberty.

On the other hand, it may have to do with the effects of experience. Interference only occurs when tasks overlap at some point. If children are representing the movement sequences in a more specific, less abstract, way than adults, the sequences may be less likely to use the same neurons (e.g. adults are learning a rule; children are learning two different ways of moving particular fingers). Accordingly, training on the reverse sequence provides additional training in the art of moving these fingers in this way, but doesn’t interfere because the pattern is not the same.

Interference is the bugbear of learning. Interference may be the key to why learning gets harder the older we get, despite the advantages age brings. So let’s explore this a little more.

Here’s a small study in which 14 young adults (average age 20) and 12 older adults (average age 58; range 55-70) learned a motor sequence task requiring them to press the appropriate button when they saw a blue dot appear in one of four positions on the screen. The training included several learnable sequences interspersed with random trials. Participants, however, were not informed of this. There were three blocks of trials during the first session (separated by a 1-2 minute rest), and a fourth block on the second session, 24 hours later.

As expected, younger adults were notably faster in their responses than the older group. Less expected was the fact that the older group showed markedly greater improvement on the learnable sequences than the younger group. However, on the second session, while the younger adults showed the expected off-line gain in performance, indicative of consolidation, the older adults performed at the same level as they had early in the first session.

It should be noted that the average reaction time of the older group at the end of training matched that of the younger group at the start, demonstrating that, while we may slow down with age, we can counter that with training. The fact that the older adults were noticeably better at learning the sequences may reflect the increases in activation seen in motor regions in normal aging, possibly compensating for decreased activation and atrophy in the hippocampus.

But what’s interesting in this context is this lack of off-line gain.

The same thing was seen in another study comparing younger and older adults, which found that, while the older adults showed improvement in general skill on an implicit sequence-learning task after 12 hours, this improvement had disappeared at 24 hours. Nor was it seen at one week.

So why aren’t these memories being consolidated in the older adults?

(This is not to say that all benefit of the earlier training was lost — the improvement over the second session indicates that some memory was retained. So it may be — and is consistent with what we know about the effects of training in older adults — that more, and perhaps longer, training sessions are needed before older adults can properly consolidate new learning.)

Is this because we become slower to consolidate with age? This harks back to the idea that children suffer less interference because they can consolidate memories more swiftly.

Or perhaps it has to do with the greater interference attendant on the brains of older adults being more richly connected. A computer model mimicked the decline in language learning as a function of the growth in connectivity in the neural network. This computational model suggests that once the growth of connectivity in the parts of the brain responsible for procedural memory slows, learning suffers increasingly from first-language interference.

It may be, of course, that both processes are going on. Greater interference, and slower consolidation.

It may also be that the adult brain becomes more selective in the making of long-term skill memory.

It may also be that these (and other) changes in the adult brain lead to more interaction between information-sets that are further apart (see my recent news item on preventing interference). Thus, if you learn something at ten in the morning, and something else at twelve, your brain can, and will, try to relate the two (which can be good or bad). A child’s brain can’t stretch to encompass that; children would need to be explicitly reminded of the first lesson.

I suspect that all these factors are important, and point to ways in which we should approach learning/teaching differently for pre-pubertal children, young adults, and older adults.

In the case of older adults, it is clear that we need to provide the optimal conditions for consolidation.

I have talked repeatedly about the value of spaced training, distributed training, interleaved training. So it’s interesting to note that studies have found that consolidation of motor memories occurs differently depending on whether training occurs in blocks (each sequence mastered before learning another one) or on a random schedule involving all sequences.

Off-line learning is better when motor skills are learned under a random practice schedule. While blocked practice produces better immediate learning, random practice produces better delayed learning. It appears that a random schedule generates activity across a broad network involving premotor, parietal, sensorimotor and subcortical regions, while learning under the blocked schedule is limited to a more confined area (specifically one particular part of the motor cortex).
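The distinction between blocked and random schedules is easy to make concrete. Here’s a minimal sketch (the function name and the toy sequences are my own hypothetical choices, not taken from any of the studies cited) of how the two schedule types differ:

```python
import random

def make_schedule(sequences, reps, mode, seed=0):
    """Return a practice order over the given motor sequences.

    mode="blocked": all repetitions of one sequence before moving to the next.
    mode="random":  repetitions of all sequences shuffled together
                    (an interleaved schedule, as discussed above).
    """
    order = [s for s in sequences for _ in range(reps)]
    if mode == "random":
        # Seeded for reproducibility; a real experiment would randomize freely.
        random.Random(seed).shuffle(order)
    return order

# Example: three finger-tapping sequences, five repetitions each
blocked = make_schedule(["ABC", "BCA", "CAB"], 5, "blocked")
interleaved = make_schedule(["ABC", "BCA", "CAB"], 5, "random")
```

Both schedules contain exactly the same practice material and the same number of repetitions; only the ordering differs. That ordering, per the studies above, is what makes the difference between better immediate learning (blocked) and better delayed learning (random).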

This suggests that interleaved practice is even more important for older adults. Although it slows down initial learning (which, remember, was better for older adults compared to younger, so there’s leeway there!), spreading the load across a broader neural network is especially important for those who have some atrophy or impairment in specific regions (as often occurs with age).

Judicious resting during learning may also be of greater benefit for older adults. Consolidation occurs most famously during sleep (and let’s not forget how sleep changes in old age), and also occurs, to a lesser extent, while awake, within a few hours of training. But there is also evidence that a boost in skill learning can occur after rests that last only a few minutes (or even seconds). This phenomenon is distinguished from consolidation (it’s called ‘reminiscence’), because the gains in performance don’t usually endure. However, while in some circumstances it may simply reflect recovery from mental or physical fatigue, in others it may have a more lasting effect.

Evidence for this has come from learning in music. A particularly interesting study involved non-musicians learning a five-key sequence on a digital piano. It found that even 5-minute rests during learning could be beneficial, but only if they occurred at the right time.

In the study, the participants repeated the sequence as fast and accurately as they could during twelve 30-second blocks interspersed with 30-second pauses. A third of the participants had a 5-minute rest between the third and fourth blocks, while another third had the rest between the ninth and tenth blocks, and the remaining third had no rest at all. Everyone was re-tested the next day, around 12 hours after training.

Participants showed large improvements during training after either 5-minute rest. However, it was only those given a rest early in the training who continued to improve throughout. That is, even though the late-rest group’s performance jumped at block 10, it fell on blocks 11 and 12, while the performance of the early-rest group continued to climb after their jump (at block 4). The early-rest group also showed the greatest off-line gain: their performance ‘jumped’ more than that of the other two groups when tested on the following day.

In other words, consolidation was affected by the timing of the rest.

Among the late-rest and no-rest groups, improvement during blocks 4-9 was not as rapid as it had been during the first three blocks. This is a typical pattern during motor learning. It may be, then, that resting early allows processes triggered by repetition to develop fully, rather than becoming attenuated through too much repetition. Thus resting early in practice may allow the faster rate of learning to continue for longer. This in turn results in greater repetition before practice ends, leading to a more stabilized (short-term consolidated) memory, and thus greater overnight (long-term) consolidation.

On the other hand, the short-lasting gain achieved by the late-rest group didn’t affect later learning, but did predict the extent to which performance improved after sleep.

Other improvements to learning may come from reducing interference, and taking cognizance of greater selectivity. In the realm of language learning, for example, it’s argued that successful long-term learning in adults is increasingly dependent on explicit learning, declarative knowledge, and its automatization. It may be that, for adults learning a second language, greater importance should be placed on explicit comparison with the native language.

It also seems likely that immersion in the new language is more important for adult learners. The problem is that every time you return to your native language, you’re encouraging interference (something to which, as we have seen, children may be far less susceptible).

In sum, as we get older, interference becomes more of an issue. To counter this, we need to be more thoughtful about planning our learning.

 

For more about the recently reported research into the difference between children's and adults' language learning, see

http://www.newscientist.com/article/mg21128224.000-age-no-excuse-for-failing-to-learn-a-new-language.html

http://blogs.edweek.org/edweek/inside-school-research/2011/07/study_older_students_may_learn.html

References

Brown, R. M., & Robertson, E. M. (2007). Off-Line Processing: Reciprocal Interactions between Declarative and Procedural Memories. The Journal of Neuroscience, 27(39), 10468-10475.

Brown, R. M., Robertson, E. M., & Press, D. Z. (2009). Sequence Skill Acquisition and Off-Line Learning in Normal Aging. PLoS ONE, 4(8), e6683.

Cash, C. D. (2009). Effects of Early and Late Rest Intervals on Performance and Overnight Consolidation of a Keyboard Sequence. Journal of Research in Music Education, 57(3), 252-266.

DeKeyser, R., Monner, D., Hwang, S-O., Morini, G., & Vatz, K. (2011). Qualitative differences in second language memory as a function of late learning. Presented at the International Congress for the Study of Child Language in Montreal, Canada.

Dorfberger, S., Adi-Japha, E., & Karni, A. (2007). Reduced Susceptibility to Interference in the Consolidation of Motor Memory before Adolescence. PLoS ONE, 2(2), e240.

Ferman, S., & Karni, A. (2010). No Childhood Advantage in the Acquisition of Skill in Using an Artificial Language Rule. PLoS ONE, 5(10), e13648.

Ferman, S., & Karni, A. (2011). Adults outperform children in acquiring a language skill: Evidence from learning an artificial morphological rule in different conditions. Presented at the International Congress for the Study of Child Language in Montreal, Canada.

Karni, A. (2011). A critical look at ‘critical periods’ in skill acquisition: from motor sequences to language skills. Presented at the International Congress for the Study of Child Language in Montreal, Canada.

Nemeth, D., & Janacsek, K. (2010). The Dynamics of Implicit Skill Consolidation in Young and Elderly Adults. The Journals of Gerontology Series B: Psychological Sciences and Social Sciences, 66B, 15-22.

Robertson, E. M., Press, D. Z., & Pascual-Leone, A. (2005). Off-Line Learning and the Primary Motor Cortex. The Journal of Neuroscience, 25(27), 6372-6378.

Stambaugh, L. A. (2011). When Repetition Isn’t the Best Practice Strategy: Effects of Blocked and Random Practice Schedules. Journal of Research in Music Education, 58(4), 368-383.

Steele, C. J., & Penhune, V. B. (2010). Specific Increases within Global Decreases: A Functional Magnetic Resonance Imaging Investigation of Five Days of Motor Sequence Learning. The Journal of Neuroscience, 30(24), 8332-8341.

Wymbs, N. F., & Grafton, S. T. (2009). Neural Substrates of Practice Structure That Support Future Off-Line Learning. Journal of Neurophysiology, 102(4), 2462-2476.

Decision-making, working memory, and age

In October I reported on a study that found older adults did better than younger adults on a decision-making task that reflected real-world situations more closely than most tasks used in such studies. It was concluded that, while (as previous research has shown) younger adults may do better on simple decision-making tasks, older adults have the edge when it comes to more complex scenarios. Unsurprisingly, this is where experience tells.

Last year I reported on another study, showing that poorer decisions by older adults reflected specific attributes, rather than age per se. Specifically, processing speed and memory are behind individual differences in decision-making performance. Both of these processes, of course, often get worse with age.

What these two studies suggest is that your ability to make good decisions depends a lot on whether

  • you have sufficient time to process the information you need,
  • your working memory is up to the job of processing all the necessary information, and
  • your long-term memory is able to provide any information you need from your own experience.

One particular problem for older adults, for example, that I have discussed on many occasions, is source memory — knowing the context in which you acquired the information. This can have serious consequences for decision-making: something or someone may be remembered positively when it shouldn’t be, because the original negative context has been forgotten.

But the trick to dealing with memory problems is to find compensation strategies that play to your strengths. One thing that improves with age is emotion regulation. As we get older, most of us get better at controlling our emotions, and using them in ways that make us happier. Moreover, it appears that working memory for emotional information (in contrast to other types of information) is unaffected by age. Given new research suggesting that decision-making is not simply a product of analytic reasoning processes, but also involves an affective/experiential process that may operate in parallel and be of equal importance, the question arises: would older adults be better relying on emotion (their ‘gut’) for decisions?

In Scientific American I ran across a study looking into this question. 60 younger (aged 18-30) and 60 older adults (65-85) were presented with health care choices that required them to hold in mind and consider multiple pieces of information. The choices were among pairs of health-care plans, physicians, treatments, and homecare aides. Working memory load increased across trials from one to four attributes per option. On each trial, one option had a higher proportion of positive to negative attributes. Each attribute had a positive and negative variant (e.g., “dental care is fully covered” vs “dental care is not covered”).

In the emotion-focus condition participants were asked to focus on their emotional reactions to the options and report their feelings about the options before making a choice. In the information-focus condition, participants were told to focus instead on the specific attributes and report the details about the options. There were no such instructions in the control condition.

As expected, working memory load had a significant effect on performance, but what’s interesting is the different effects in the various conditions. In the control condition, for both age groups, there was a dramatic decrease in performance when the cognitive load increased from 2 items to 4, but no difference between those in which the load was 4, 6, or 8 items. In the information-focus condition, the younger group showed a linear (but not steep) decrease in decision-making performance with each increase in load, except at the last — there was no difference between 6 and 8 items. The older group showed a dramatic drop when load was increased from 2 to 4, no difference between 4 and 6, and a slight drop when items increased to 8. In the emotion-focus condition, both groups showed the same pattern they had shown in the information-focus condition, except that, for the younger group, there was a dramatic drop when items increased to 8.

So that’s one point: that the effect of cognitive load is modified by instructional condition, and varies by age.

The other point, of course, concerns how level of performance varies. Interestingly, in the control condition, the two age groups performed at a similar level. In the information-focus condition, the slight superiority of the younger group when the load was lightest expanded significantly as soon as the number of items increased to four, and was greatest at the highest load. In the emotion-focus condition, however, the very slight superiority of the younger group at two items did not increase as the load increased, and indeed reversed when the load increased to eight.

Here’s what I think are the most interesting results of this study:

There was no significant difference in performance between the age groups when no instruction was given.

Younger adults were better off being given some instruction, but when the cognitive load was not too great (2, 4, or 6 items), there was no difference for them between focusing on emotions or details. The difference — and it was a significant one — came when the load was highest. At this point, they did much better when they concentrated on the details and applied their reasoning abilities.

Older adults, on the other hand, were always better off focusing on their feelings, especially when the load was highest.

Performance on a digit-symbol coding task (a measure of processing speed) correlated significantly with performance in the information-focus condition for both age groups. When processing speed was taken into account, the difference between the age groups in that condition disappeared. In other words, younger adults' superior performance in the information-focus condition was entirely due to their higher processing speed. However, age differences in the emotion-focus condition were unaffected.

Younger adults performed significantly better in the information-focus condition compared to the control condition, indicating that specific instructions are helpful. However, there was no significant difference between the emotion-focus condition and the control for the older adults, suggesting perhaps that such processing is their ‘default’ approach.

The findings add weight to the idea that there is a separate working memory system for emotion-based information.

It should be noted that, somewhat unusually, the information was presented to participants sequentially rather than simultaneously. It may well be that these results do not apply to the situation in which you have all the necessary information presented to you in a document and can consider it at your leisure. On the other hand, in the real world we often amass information over time, or acquire it by listening rather than seeing it all nicely arrayed in front of us.

The findings suggest that the current emphasis on providing patients with all available information in order to make an “informed choice” may be misplaced. Many older patients may be better served by a greater emphasis on emotional information, rather than being encouraged to focus on myriad details.

But I'd like to see this experiment replicated using a simultaneous presentation. It may be that these findings should principally be taken as support for always seeking written documentation to back up spoken advice, or, if you're gathering information over time and from multiple sources, making sure you have written notes for each instance. Personally, I dislike making any decisions based solely on information given in conversation, and this is a reluctance I have found increasing steadily with age (and I'm not that old yet!).

References

Mikels, J.A., Löckenhoff, C.E., Maglio, S.J., Carstensen, L.L., Goldstein, M.K. & Garber, A. 2010. Following your heart or your head: Focusing on emotions versus information differentially influences the decisions of younger and older adults. Journal of Experimental Psychology: Applied, 16(1), 87-95.

Walking speed, cognitive impairment, and what to do about it

I have previously reported on how gait and balance problems have been associated with white matter lesions, and walking speed and grip strength have been associated with dementia and stroke risk. Another recent study, involving 93 older adults (70+) has added to this evidence, with the finding that those with non-amnestic MCI were much more likely to be slow walkers.

The study involved 54 seniors with no cognitive impairment, 31 with non-amnestic MCI and eight with amnestic MCI. Passive infrared sensors fixed in series on the ceilings of participants’ homes enabled their walking speed to be monitored unobtrusively over a three-year period.

Those with non-amnestic MCI were nine times more likely to be slow walkers than moderate or fast walkers, and more likely to show greater variability in walking speed.

Unfortunately, I have not been able to read the full paper (which is why I’m not reporting this in news), so I can’t tell you any more details. I assume that the main reason for the failure to find a significant difference in the amnestic MCI group was because that group was so small, but I don’t know.

Nevertheless, the study does add to the growing evidence of an association between gait and balance problems and risk of cognitive impairment and dementia, which is why I was interested to read a recent paper on entraining walking using a metronomic beat.

The paper spoke about the use of sensory cues in neurological rehabilitation. Specifically, auditory cues have been shown to help various gait characteristics of patients with Parkinson's disease and stroke. In patients with Parkinson’s, visual cues also improved stride length, while auditory cues improved cadence.

So here’s the question: if you are having gait and/or balance problems, will improving them also reduce your risk of developing cognitive problems? Or are the physical problems merely a consequence of physical deterioration in the brain that also leads to cognitive problems?

I’ve raised the same question before in relation to sensory deterioration. My answer then is the same answer I give now: you shouldn’t ignore these physical problems as something that is simply inevitable with age and/or poor health. As with sensory impairment, there are two ways in which restricted physical movement might impact your cognition.

One is the physical damage in the brain I have spoken of. Whether or not you can reverse some of this damage (or at least counteract it by developing some other area of the brain) by improving gait, balance, or grip strength, is a question as yet unanswered. But it is possible, and for that reason should be tried.

The other way is through the effect of restricted physical movement on your activities, and your state of mind. Research suggests that restricting your environment is a risk factor in developing cognitive impairment. Similarly, social engagement and cognitively-stimulating activities are both important for preventing cognitive decline, and while physical frailty doesn’t necessarily limit these, it does make it much more likely that they will be restricted.

State of mind is associated with attitude, and I have spoken before (often!) about the effect of this on cognition. If you believe that life is ‘over’ for you, that you are sliding rapidly down the hill and there is nothing you can do about it, then your belief will make that true. Physical frailty is, understandably, going to make that belief more likely. Contrariwise, if you succeed in reducing your frailty, in being able once again to do some tasks that you thought you would never be able to do again, then you are much more likely to take action in fighting cognitive decline.

So, it’s worth tackling walking problems — and worth making your best efforts to ensure that they don’t happen, by keeping fit and active. The use of sensory cues to help gait problems probably requires some specialist assistance. Another approach is to practice tai chi, which is generally recommended as an activity for improving balance.

Stretching your mind

I recently reported on a finding that older adults whose life-space narrowed to their immediate home were significantly more likely to have a faster rate of global cognitive decline or develop mild cognitive impairment or Alzheimer’s.

Now there are some obvious correlates of being house-bound vs feeling able to travel out of town (such as physical disability), but this relationship between cognitive decline and confined life-space remained after such factors were taken into account. The association is thought to be related to social and mental stimulation.

But I think this association also points to something more specific: the importance of distance, and difference. Different ways of thinking; different contexts. Information (in the broadest sense of the word) that stretches your mind, that gets you out of the grooves of your familiar thoughts.

Last year I reported on a study looking at creativity in problem-solving. That study found that multicultural experiences help you become more creative in solving problems. In particular, creativity was best helped by being reminded of what you’d learned about the underlying meaning or function of behaviors in the multicultural context. In other words, what was important was truly trying to understand behavior that’s very different from your own.

While travelling undoubtedly helps, you don’t need to go to a distant place to learn about different cultures. You can read about them; you can watch movies; you can listen to other people talk about what they know. And if you have those experiences, you can then think about them at any time.

A vital tool in tackling cognitive decline in old age (including the more extreme events of mild cognitive impairment and dementia) is cognitive reserve. Cognitive reserve means that your brain can take more damage before the effects become noticeable. Many people who showed no signs of dementia in life have died with advanced Alzheimer’s pathology in their brains!

Cognitive reserve is most often associated with education, but it is also associated with occupation, bilingualism, and perhaps even music. What it comes down to is this: the more redundancy in your brain, the wider and denser the networks, the more able your brain will be to find new paths for old actions, if the old paths are damaged.

The finding that life-space can affect cognitive decline is also a reminder that we are minds in bodies. I have reported on a number of examples of what is called embodied cognition (the benefits of gesture for memory are one example of this). It’s a good general principle to bear in mind — if you fake enjoyment, you may well come to feel it; if you look at the distant hills or over the sea, your mind may think distant thoughts; if you write out your worries, the weight of them on your mind may well lighten.

I made reference to bilingualism. There have been several studies now that point to the long-term benefits of bilingualism for fighting cognitive decline and dementia. But if you are monolingual, don’t despair. You may never achieve the fluency with another language that you would have if you’d learned it earlier in life, but it’s never too late to gain some benefit! If you feel that learning a new language is beyond you, then you’re thinking of it in the wrong way.

Learning a language is not an either-or task; you don’t have to achieve near-native fluency for there to be a point. If there’s a language you’ve always yearned to know, or a culture you’ve always been interested in, dabble. There are so many resources on the Web nowadays; there has never been a better time to learn a language! You could dabble in a language because you’re interested in a culture, or you could enhance your language learning by learning a little about an associated culture.

And don’t forget that music and math are languages too. It may be too late to become a cello virtuoso, but it’s never too late to learn a musical instrument for your own pleasure. Or if that’s not to your taste, take a music appreciation class, and enrich your understanding of the language of music.

Similarly with math: there’s a thriving little world of “math for fun” out there. Go beyond Sudoku to the world of math puzzles and games and quirky facts.

Perhaps even dance should be included in this. I have heard dance described as a language, and there has been some suggestion that dancing seems to be a physical pursuit of particular cognitive benefit for older adults.

This is not simply about ‘stimulation’. It’s about making new and flexible networks. Remember my recent report on learning speed and flexible networks? The fastest learners were those whose brains showed more flexibility during learning, with different areas of the brain being linked with different regions at different times. The key to that, I suggest, is learning and thinking about things that require your brain to forge many new paths, with speed and distance being positive attributes that you should seek out (music and dance for speed, perhaps; languages and travel for distance).

Interestingly, research into brain development has found that, as a child grows to adulthood, the brain switches from an organization of local networks based on physical proximity to one of long-distance networks based on functionality. It would be interesting to know whether seniors with cognitive impairment show a shrinking of these networks. Research has shown that the aging brain does tend to show reduced functional connectivity in certain high-level networks, and that this connectivity can be improved with regular aerobic exercise, leading to cognitive improvement.

Don’t disdain the benefits of simply daydreaming in your armchair! Daydreaming has been found to activate areas of the brain associated with complex problem-solving, and it’s been speculated that mind wandering evokes a unique mental state that allows otherwise opposing networks to work in cooperation. Daydreaming about a more distant place has also been found to impair memory for recently learned words more than if the daydreaming concerned a closer place — a context effect that demonstrates that you can create distance effects in the privacy of your own mind, without having to venture to distant lands.

I’m not saying that such daydreaming has all the benefits of actually going forth and meeting people, seeing new sights. Watching someone practice helps you learn a skill, but it’s not as good as practicing yourself. But the point is, whatever your circumstances, there is plenty you can do to stretch your mind. Why not find yourself a travel book, and get started!

Total Cognitive Burden

Because it holds some personal resonance for me, my recent round-up of genetic news called to mind food allergies. Now food allergies can be tricky beasts to diagnose, and the reason is, they’re interactive. Maybe you can eat a food one day and everything’s fine; another day, you break out in hives. This is not simply a matter of the amount you have eaten; the situation is more complex than that. It’s a function of what we might call total allergic load — all the things you might be sensitive to (some of which you may not realize, because on their own, in the quantities you normally consume, they’re little or no problem). And then there are other factors which make you more sensitive, such as time of month (for women), and time of day. Perhaps, in light of the recent findings about the effects of environmental temperature on multiple sclerosis, temperature is another of those factors. And so on.

Now, I am not a medical doctor, nor a neuroscientist. I’m a cognitive psychologist who has spent the last 20 years reading and writing about memory. But I have taken a very broad interest in memory and cognition, and the picture I see developing is that age-related cognitive decline, mild cognitive impairment, late-onset Alzheimer’s, and early-onset Alzheimer’s, represent places on a continuum. The situation does not seem as simple as saying that these all have the same cause, because it now seems evident that there are multiple causes of dementia and cognitive impairment. I think we should start talking about Total Cognitive Burden.

Total Cognitive Burden would include genetics, lifestyle and environmental factors, childhood experience, and prenatal factors.

First, genetics.

It is estimated that around a quarter of Alzheimer’s cases are familial, that is, they are directly linked to the possession of specific gene mutations. For the other 75%, genes are likely to be a factor but so are lifestyle and environmental factors. Having said that, the most recent findings suggest that the distinction between familial and sporadic is somewhat fuzzy, so perhaps it would be fairer to say we term it familial when genetics are the principal cause, and sporadic when lifestyle and environmental factors are at least as important.

While three genes have been clearly linked to early-onset Alzheimer’s, only one gene is an established factor in late-onset Alzheimer’s — the so-called Alzheimer’s gene, the e4 allele on the APOE gene (at 19q13.2). It’s estimated that 40-65% of Alzheimer’s patients have at least one copy of this allele, and those with two copies have up to 20 times the risk of developing Alzheimer’s. Nevertheless, it is perfectly possible to have this allele, even two copies of it, and not develop the disease. It is also quite possible — and indeed a third of Alzheimer’s patients have managed it — to develop Alzheimer’s in the absence of this risky gene variant.

A recent review selected 15 genes for which there is sufficient evidence to associate them with Alzheimer’s: APOE, CLU, PICALM, EXOC3L2, BIN1, CR1, SORL1, TNK1, IL8, LDLR, CST3, CHRNB2, SORCS1, TNF, and CCR2. Most of these are directly implicated in cholesterol metabolism, intracellular transport of beta-amyloid precursor, and autophagy of damaged organelles, and indirectly in inflammatory response.

For example, five of these genes (APOE; LDLR; SORL1; CLU; TNF) are implicated in lipid metabolism (four in cholesterol metabolism). This is consistent with evidence that high cholesterol levels in midlife are a risk factor for developing Alzheimer’s. Cholesterol plays a key role in regulating amyloid-beta and its development into toxic oligomers.

Five genes (PICALM; SORL1; APOE; BIN1; LDLR) appear to be involved in the intracellular transport of APP, directly influencing whether the precursor proteins develop properly.

Seven genes (TNF; IL8; CR1; CLU; CCR2; PICALM; CHRNB2) were found to interfere with the immune system, increasing inflammation in the brain.

If you’re interested you can read more about each of these genes in that review, but the point I want to make is that genes can’t be considered alone. They interact with each other, and they interact with other factors (for example, there is some evidence that SORL1 is a risk factor for women only; and if you have always kept your cholesterol levels low, through diet and/or drugs, having genes that poorly manage cholesterol will not be so much of an issue). It seems reasonable to assume that the particular nature of an individual’s pathway to Alzheimer’s will be determined by the precise collection of variants on several genes; this will also help determine how soon and how fast the Alzheimer’s develops.

[I say ‘Alzheimer’s’, but Alzheimer’s is not, of course, the only path to dementia, and vascular dementia in particular is closely associated. Moreover, my focus on Alzheimer’s isn’t meant to limit the discussion. When I talk about the pathway to dementia, I am thinking about all these points on the continuum: age-related cognitive decline, mild cognitive impairment, senile dementia, and early dementia.]

It also seems plausible to suggest that the precise collection of relevant genes will determine not only which drug and neurological treatments might be most effective, but also which lifestyle and environmental factors are most important in preventing the development of the disease.

I have reported often on lifestyle factors that affect cognitive decline and dementia — factors such as diet, exercise, intellectual and social engagement — factors that may mediate risk through their effects on cardiovascular health, diabetes, inflammation, and cognitive reserve. We are only beginning to understand how childhood and prenatal environment might also have effects on cognitive health many decades later — for example, through their effects on head size and brain development.

You cannot do anything about your genes, but genes are not destiny. You cannot, now, do anything about your prenatal environment or your early years (but you may be able to do something about your children’s or your grandchildren’s). But you can, perhaps, be aware of whether you have vulnerabilities in these areas — vulnerabilities which will add to your Total Cognitive Burden. More easily, you can assess your lifestyle — over the course of your life — in these terms. Here are the sorts of questions you might ask yourself:

Do you have any health issues, such as diabetes, cardiovascular disease, multiple sclerosis, or positive HIV status?

Do you have a sleep disorder?

Have you, at any point in your life, been exposed to toxic elements (such as lead or severe air pollution) for a significant length of time?

Did you experience a lot of stress in childhood? Stress might come from a dangerous living environment (such as a violent neighborhood), warring parents, a dysfunctional parent, or a personally traumatic event (to take some examples).

Did you do a lot of drugs, or indulge in binge drinking, in college?

Have you spent many years eating an unhealthy diet — one heavy in fats and sugars?

Do you drink heavily?

Do you have ongoing stress in your life, or have you experienced significant amounts of stress at some period during middle age?

Do you rarely engage in exercise?

Do you spend most evenings blobbed out in front of the TV?

Do you experience little in the way of mental stimulation from your occupation or hobbies?

These questions are just the ones that came most readily to mind, but they give you, I hope, some idea of the range of factors that might go to make up your TCB. The next step from there is to see which factors you can do something about. While you can’t do anything about your past, the good news is that, at any age, some benefit accrues from engaging in preventative strategies (such as improving your sleep, reducing your stress, eating healthily, exercising regularly, and engaging in mentally and socially stimulating activities). How much benefit will depend on how much effort you put into these strategies, on which and how many TCB factors are weighing on you, and on how far along the path you are. But it’s never too late to do something.

On the up-side, you might be relieved by such an exercise, realizing that your risk of dementia is smaller than you feared! If so, you might use this knowledge to motivate you to aspire to an excellent old age — with no cognitive decline. We tend to assume that declining faculties are an inevitable consequence of getting older, but this doesn’t have to be true. Some ‘super-agers’ have shown us that it is possible to grow very old and still perform as well as those decades younger. If your TCB is low, why don’t you make it even lower, and aspire to be one of those!

Diabetes - its role in cognitive impairment & dementia

There was an alarming article recently in the Guardian newspaper. It said that in the UK, diabetes is now nearly four times as common as all forms of cancer combined. Some 3.6 million people in the UK are thought to have type 2 diabetes (2.8 million are diagnosed, but a large number are thought to be undiagnosed), and nearly twice as many people are at high risk of developing it. The bit that really stunned me? Diabetes costs the health service roughly 10% of its entire budget. In North America, one in five men over 50 has diabetes. In some parts of the world, as much as a quarter of the population is said to have diabetes, or even a third (Nauru)! Type 2 diabetes is six times more common in people of South Asian descent, and three times more common in people of African and African-Caribbean origin.

Why am I talking about diabetes in a blog dedicated to memory and learning? Because diabetes, if left untreated, has a number of complications, several of which impinge on brain function.

For example, over half of those with type 2 diabetes will die of cardiovascular disease, and vascular risk factors not only increase your chances of heart problems and stroke (diabetes doubles your risk of stroke), but also of cognitive impairment and dementia.

Type 2 diabetes is associated with obesity, which can bring about high blood pressure and sleep apnea, both of which are cognitive risk factors.

Both diabetes and hypertension increase the chances of white-matter lesions in the brain (this was evident even in obese adolescents with diabetes), and the degree of white-matter lesions in the brain is related to the severity of age-related cognitive decline and an increased risk of Alzheimer’s.

Mild cognitive impairment is more likely to develop into Alzheimer’s if vascular risk factors such as high blood pressure, diabetes, cerebrovascular disease and high cholesterol are present, especially if untreated. Indeed it has been suggested that Alzheimer’s memory loss could be due to a third form of diabetes. And Down syndrome, Alzheimer's, diabetes, and cardiovascular disease, have been shown to share a common disease mechanism.

So diabetes is part of a suite of factors that act on the heart and the brain.

But treatment of such risk factors (e.g. by using high blood pressure medicines, insulin, cholesterol-lowering drugs and diet control, giving up smoking or drinking) significantly reduces the risk of developing Alzheimer’s. Bariatric surgery has been found to improve cognition in obese patients. And several factors have been shown to make a significant difference as to whether a diabetic develops cognitive problems.

Older diabetics are more likely to develop cognitive problems if they:

  • have higher (though still normal) blood pressure,
  • have gait and balance problems,
  • report themselves to be in bad health regardless of actual problems (this may be related to stress and anxiety),
  • have higher levels of the stress hormone cortisol,
  • don’t manage their condition (poor glucose control),
  • have depression,
  • eat high-fat meals.

Glucose control / insulin sensitivity may be a crucial factor even for non-diabetics. A study involving non-diabetic middle-aged and elderly people found that those with impaired glucose tolerance (a pre-diabetic condition) had a smaller hippocampus and scored worse on tests for recent memory. And some evidence suggests that a link found between midlife obesity and increased risk of cognitive impairment and dementia in old age may have to do with poorer insulin sensitivity.

Exercise and dietary changes are of course the main lifestyle factors that can turn such glucose impairment around, and do wonders for diabetes too. In fact, a recent small study found that an extreme low-calorie diet (don’t try this without medical help!) normalized pre-breakfast blood sugar levels and pancreas activity within a week, and may even have permanently cured some diabetics after a couple of months.

Diabetes appears to affect two cognitive domains in particular: executive functioning and speed of processing.

You can read all the research reports on diabetes that I’ve made over the years in my new topic collection.

Neglect your senses at your cognitive peril!

Impaired vision is common in old age and even more so in Alzheimer’s disease, and this results not only from damage in the association areas of the brain but also from problems in lower-level areas. A major factor in whether visual impairment impacts everyday function is contrast sensitivity.

Reduced contrast sensitivity not only slows down your perceiving and encoding, it also interacts with higher-order processing, such as decision-making. These effects may be behind the established interactions between age, perceptual ability, and cognitive ability. Such interactions are not restricted to sight — they’ve been reported for several senses.

In fact, it’s been suggested that much of what we regard as ‘normal’ cognitive decline in aging is simply a consequence of having senses that don’t work as well as they used to.

The effects in Alzheimer’s disease are, I think, particularly interesting, because we tend to regard any cognitive impairment here as inevitable and a product of pathological brain damage we can’t do anything much about. But what if some of the cognitive impairment could be removed, simply by improving the perceptual input?

That’s what some recent studies have shown, and I think it’s noteworthy not only because of what it means for those with Alzheimer’s and mild cognitive impairment, but also because of the implications for any normally aging person.

So let’s look at some of this research.

Let’s start with the connection between visual and cognitive impairment.

Analysis of data from the Health and Retirement Study and Medicare files, involving 625 older adults, found that those with very good or excellent vision at baseline had a 63% reduced risk of developing dementia over a mean follow-up period of 8.5 years. Those with poorer vision who didn’t visit an ophthalmologist had a 9.5-fold increased risk of Alzheimer’s disease and a 5-fold increased risk of mild cognitive impairment. Poorer vision without a previous eye procedure increased the risk of Alzheimer’s 5-fold. For Americans aged 90 years or older, 78% who kept their cognitive skills had received at least one previous eye procedure compared with 52% of those with Alzheimer’s disease.

In other words, if you leave poor vision untreated, you greatly increase your risk of cognitive impairment and dementia.

Similarly, cognitive testing of nearly 3000 older adults with age-related macular degeneration found that cognitive function declined with increased macular abnormalities and reduced visual acuity. This remained true after factors such as age, education, smoking status, diabetes, hypertension, and depression, were accounted for.

And a study comparing the performance of 135 patients with probable Alzheimer’s and 97 matched normal controls on a test of perceptual organization ability (Hooper Visual Organization Test) found that the VOT was sensitive to severity of dementia in the Alzheimer’s patients.

So let’s move on to what we can do about it. Treatment for impaired vision is of course one necessary aspect, but there is also the matter of trying to improve the perceptual environment. Let’s look at this research in a bit more detail.

A 2007 study compared the performance of 35 older adults with probable Alzheimer’s, 35 healthy older adults, and 58 young adults. They were all screened to exclude those with visual disorders, such as cataracts, glaucoma, or macular degeneration. There were significant visual acuity differences between all 3 groups (median scores: 20/16 for young adults; 20/25 for healthy older adults; 20/32 for Alzheimer’s patients).

Contrast sensitivity was also significantly different between the groups, although this was moderated by spatial frequency (normal contrast sensitivity varies according to spatial frequency, so this is not unexpected). Also unsurprisingly, the young adults outperformed both older groups at every spatial frequency except the lowest, where their performance was matched by that of healthy older adults. Similarly, healthy older adults outperformed Alzheimer’s patients at every frequency bar one — the highest frequency.

For Alzheimer’s patients, there was a significant correlation between contrast sensitivity and their cognitive (MMSE) score (except at the lowest frequency of course).

Participants carried out a number of cognitive/perceptual tasks: letter identification; word reading; unfamiliar-face matching; picture naming; pattern completion. Stimuli varied in their perceptual strength (contrast with background).

Letter reading: there were no significant differences between groups in terms of accuracy, but stimulus strength affected reaction time for all participants, and this was different for the groups. In particular, older adults benefited most from having the greatest contrast, with the Alzheimer’s group benefiting more than the healthy older group. Moreover, Alzheimer’s patients seeing the letters at medium strength were not significantly different from healthy older adults seeing the letters at low strength.

Word reading: here there were significant differences between all groups in accuracy as well as reaction time. There was also a significant effect of stimulus strength, which again interacted with group. While young adults’ accuracy wasn’t affected by stimulus strength, that of both older groups was. Again, there were no differences between the Alzheimer’s group and healthy older adults when the former group was at high stimulus strength and the latter at medium, or at medium vs low. That was true for both accuracy and reaction time.

Picture naming: By and large all groups, even the Alzheimer’s one, found this task easy. Nevertheless, there were effects of stimulus strength, and once again, the performance of the Alzheimer’s group when the stimuli were at medium strength matched that of healthy older adults with low strength stimuli.

Raven’s Matrices and Benton Faces: Here the differences between all groups could not in general be ameliorated by manipulating stimulus strength. The exception was with the Benton Faces, where Alzheimer’s patients seeing the medium strength stimuli matched the performance of healthy older adults seeing low strength stimuli.

In summary, then, for letter reading (reaction time), word reading (identification accuracy and reaction time), picture naming, and face discrimination, manipulating stimulus strength in terms of contrast was sufficient to bring the performance of individuals with Alzheimer’s to a level equal to that of their healthy age-matched counterparts.

It may be that the failure of this manipulation to affect performance on the Raven’s Matrices reflects the greater complexity of these stimuli or the greater demands of the task. However, the success of the manipulation in the case of the Benton Faces — a similar task with stimuli of apparently similar complexity — contradicts this. It may be that the stimulus manipulation simply requires some more appropriate tweaking to be effective.

It might be thought that these effects are a simple product of making stimuli easier to see, but the findings are a little more complex than I’ve rendered them. The precise effect of the manipulation varied depending on the type of stimuli. For example, in some cases there was no difference between low and medium stimuli, in others no difference between medium and high; in some, the low contrast stimuli were the most difficult, in others the low and medium strength stimuli were equally difficult, and on one occasion high strength stimuli were more difficult than medium.

The finding that Alzheimer’s individuals can perform as well as healthy older adults on letter and word reading tasks when the contrast is raised suggests that the reading difficulties that are common in Alzheimer’s are not solely due to cognitive impairment, but are partly perceptual. Similarly, naming errors may not be solely due to semantic processing problems, but also to perceptual problems.

Alzheimer’s individuals have been shown to do better recognizing stimuli the closer the representation is to the real-world object. Perhaps it is this that underlies the effect of stimulus strength — the representation of the stimulus when presented at a lower strength is too weak for the compromised Alzheimer’s visual system.

All this is not to say that there are not very real semantic and cognitive problems! But they are not the sole issue.

I said before that for Alzheimer’s patients there was a significant correlation between contrast sensitivity and their MMSE score. This is consistent with several studies, which have found that dementia severity is correlated with contrast sensitivity at some spatial frequencies. This, together with these experimental findings, suggests that contrast sensitivity is in itself an important variable in cognitive performance, and that contrast sensitivity and dementia severity share a common substrate.

It’s also important to note that the manipulations of contrast were standard across the group. It may well be that individualized manipulations would have even greater benefits.

Another recent study comparing the performance of healthy older and younger adults and individuals with Alzheimer's disease and Parkinson's disease on the digit cancellation test (a visual search task used in the diagnosis of Alzheimer’s), found that increased contrast brought the healthy older adults and those with Parkinson’s up to the level of the younger adults, and significantly benefited Alzheimer’s individuals — without, however, overcoming all their impairment.

There were two healthy older adult control groups: one age-matched to the Alzheimer’s group, and one age-matched to the Parkinson’s group. The former were some 10.5 years older than the latter. Interestingly, the younger control group (average age 64) performed at the same level as the young adults (average age 20), while the older control group performed significantly worse. As expected, both the Parkinson’s group and the Alzheimer’s group performed worse than their age-matched controls.

However, when contrast was individually tailored at the level at which the person correctly identified a digit appearing for 35.5 ms 80% of the time, there were no significant performance differences between any of the three control groups or the Parkinson’s group. Only the Alzheimer’s group still showed impaired performance.

The idea of this “critical contrast” comparison was to produce stimuli that would be equally challenging for all participants. It was not about finding the optimal level for each individual (and indeed, young controls and the younger old controls both performed better at higher contrast levels). The findings indicate that poorer performance by older adults and those with Parkinson’s is due largely to their weaker contrast sensitivity, but those with Alzheimer’s are also hampered by their impaired ability to conduct a visual search.

The same researchers demonstrated this in a real-world setting, using Bingo cards. Bingo is a popular activity in nursing homes, senior centers and assisted-living facilities, and has both social and cognitive benefits.

When the cards were varied in terms of contrast, size, and visual complexity, all groups benefited from increased stimulus size and decreased complexity. Those with mild Alzheimer’s were able to perform at levels comparable to their healthy peers, although those with more severe dementia gained little benefit.

Contrast boosting has also been shown to work in everyday environments: people with dementia can navigate more safely around their homes when objects in them have more contrast (e.g. a black sofa in a white room), and eat more if they use a white plate and tableware on a dark tablecloth, or are served food that contrasts with the color of the plate.

There’s a third possible approach that might also be employed to some benefit, although this is more speculative. A study recently reported at the American Association for the Advancement of Science annual conference revealed that the visual deficits of individuals who were born with cataracts in both eyes and have since had their vision corrected can be overcome through video game playing.

After playing an action video game for just 40 hours over four weeks, the patients were better at seeing small print, the direction of moving dots, and the identity of faces.

The small study (this is not, after all, a common condition) involved six people aged 19 to 31 who were born with dense cataracts in each eye. Despite these cataracts being removed early in life, such individuals still grow up with poorer vision, because normal development of the visual cortex has been disrupted.

The game required players to respond to action directly ahead of them and in the periphery of their vision, and to track objects that are sometimes faint and moving in different directions. Best results were achieved when players were engaged at the highest skill level they could manage.

Now this is quite a different circumstance from that of individuals whose visual system developed normally but is now degrading. However, if vision worsens for some time before being corrected, or if relevant activities/stimulation have been allowed to decline, it may be that some of the deficit is due not to damage as such, but to more malleable effects. In the same way that we now say that cognitive abilities need to be kept in use if they are not to be lost, perceptual abilities (to the extent that they are cognitive, which is a great extent) may benefit from active use and training.

In other words, if you have perceptual deficits, whether in sight, hearing, smell, or taste, you should give some thought to dealing with them. While I don’t know of any research to do with taste, I have reported on several studies associating hearing loss with age-related cognitive impairment or dementia, and similarly for olfactory impairment. Of particular interest is the research on reviving a failing sense of smell through training, which suggested that one road to olfactory impairment is through neglect, and that function could be restored through training (in an animal model). Similarly, I have reported, more than once, on the evidence that music training can help protect against hearing loss in old age. (You can find more research on perception, training, and old age, on the Perception aggregated news page.)


For more, see:

Bingo study: https://www.eurekalert.org/pub_releases/2012-01/cwru-gh010312.php

Video game study:

https://www.guardian.co.uk/science/2012/feb/17/videogames-eyesight-rare-eye-disorder

https://medicalxpress.com/news/2012-02-gaming-eyesight.html

References

(In order of mention)

Rogers, M. A., & Langa, K. M. (2010). Untreated poor vision: A contributing factor to late-life dementia. American Journal of Epidemiology, 171(6), 728-735.

Clemons, T. E., Rankin, M. W., McBee, W. L., & Age-Related Eye Disease Study Research Group. (2006). Cognitive impairment in the Age-Related Eye Disease Study: AREDS report no. 16. Archives of Ophthalmology, 124(4), 537-543.

Paxton, J. L., Peavy, G. M., Jenkins, C., Rice, V. A., Heindel, W. C., & Salmon, D. P. (2007). Deterioration of visual-perceptual organization ability in Alzheimer's disease. Cortex, 43(7), 967-975.

Cronin-Golomb, A., Gilmore, G. C., Neargarder, S., Morrison, S. R., & Laudate, T. M. (2007). Enhanced stimulus strength improves visual cognition in aging and Alzheimer’s disease. Cortex, 43, 952-966.

Toner, C. K., Reese, B. E., Neargarder, S., Riedel, T. M., Gilmore, G. C., & Cronin-Golomb, A. (2011). Vision-fair neuropsychological assessment in normal aging, Parkinson's disease and Alzheimer's disease. Psychology and Aging. Published online December 26.

Laudate, T. M., Neargarder, S., Dunne, T. E., Sullivan, K. D., Joshi, P., Gilmore, G. C., et al. (2011). Bingo! Externally supported performance intervention for deficient visual search in normal aging, Parkinson's disease, and Alzheimer's disease. Aging, Neuropsychology, and Cognition, 19(1-2), 102-121.

Why your knowledge of normal aging memory matters

I’ve discussed on a number of occasions the effects that stereotypes can have on our cognitive performance. Women, when subtly reminded that females are supposedly worse at math, do more poorly on math tests; African-Americans, when subtly reminded of racial stereotypes, perform more poorly on academic tests. And beliefs about the effect of aging similarly affect memory and cognition in older adults.

Your beliefs matter. In the same way that those who believe that intelligence is fixed tend to disengage when something is challenging, while those who believe that intelligence is malleable keep working, believing that more time and effort will yield better results (see Fluency heuristic is not everyone’s rule and Regulating your study time and effort for more on this), older adults who believe that declining faculties are an inevitable consequence of aging are less inclined to make efforts to counter any decline.

Moreover, if you believe that your memory will get progressively and noticeably worse as you get older, then you will tend to pay more attention to, and give more weight to, your memory failures. This reinforces your beliefs, and the cycle feeds on itself. Bear in mind that we all, at every age, suffer memory failures! Forgetting things is not in itself a sign of age-related decline.

It’s important, therefore, that people have a realistic idea of what to expect in ‘normal’ aging. In the course of writing a short book on this topic (it will be out, I hope, early in the new year), I came across the Knowledge of Memory Aging Questionnaire (KMAQ). Research using this questionnaire has revealed the interesting finding that people know more about pathological memory aging than they do about normal memory aging.

You may find it interesting to know some of the questions, and how likely people are to get them right. So, let's look at one of these studies, involving 150 people, divided evenly into three age-groups (40-59; 60-79; 80+).

The oldest-old scored significantly more poorly than the other two groups, although the differences weren’t great (65% correct vs 70% and 69%). There was no overall difference between genders, but males were significantly more likely to answer “Don’t know” to questions about pathological memory.

But if we focus only on the subset of four questions that relate to stereotypes about normal aging in memory, there is a much greater difference between the age groups (78%, 69%, and 52% correct for the middle-aged, young-old, and oldest-old, respectively). These are the four questions (the answers are all “false”):

  • Regardless of how memory is tested, younger adults will remember far more material than older adults.
  • If an older adult is unable to recall a specific fact (e.g., remembering a person’s name), then providing a cue to prompt or jog the memory is unlikely to help.
  • When older people are trying to memorize new information, the way they study it does not affect how much they will remember later.
  • Memory training programs are not helpful for older persons, because the memory problems that occur in old age cannot be improved by educational methods.

Only one of these questions was reliably answered correctly, and then only by the middle-aged adults: “If an older adult is unable to recall a specific fact, then providing a cue to prompt or jog the memory is unlikely to help.”

Looking at the individual questions, it’s interesting to see that the different age-groups show different patterns of knowledge. Middle-aged adults were most likely to answer the following questions correctly (between 42 and 45 of the 50 in that group answered each correctly):

  • [Q18] Signs and symptoms of Alzheimer’s Disease show up gradually and become more noticeable to family members and close friends over time. (true)
  • [Q17] Memory for how to do well-learned things, such as reading a map or riding a bike, does not change very much, if at all, in later adulthood. (true)
  • [Q1] “A picture is worth a thousand words” in that it is easier for both younger and older people to remember pictures than to remember words. (true)
  • If an older adult is unable to recall a specific fact (e.g., remembering a person’s name), then providing a cue to prompt or jog the memory is unlikely to help. (false)

Young-old adults also scored highly on Q17 and Q1, but their other top-scorers were:

  • [Q21] If an older person has gone into another room and cannot remember what he or she had intended to do there, going back to the place where the thought first came to mind will often help one recall what he or she had intended to do. (true)
  • Confusion and memory lapses in older people can sometimes be due to physical conditions that doctors can treat so that these symptoms go away over time. (true)

The oldest-old agreed that Q21 and Q18 were easy ones (indeed, 48 and 47 of the 50 got these questions right), but after that, their next top-scorer was:

  • Lifelong alcoholism may result in severe memory problems in old age. (true)

Although average education levels were similar for the three age-groups, there was greater variability within the oldest-old — 9 didn’t finish high school, but 20 had tertiary degrees. In comparison, only one middle-aged and one young-old adult didn’t finish high school. The finding that the oldest-old were more likely to answer according to stereotypes of aging memory may therefore reflect, at least in part, the lower education of some individuals.
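Since the 150 participants were split evenly across the three age groups, each group has 50 members, so the raw counts quoted above translate directly into percentages. A minimal sketch of that arithmetic (the counts are those quoted in the text; the question labels are my own shorthand):

```python
# Convert per-question correct counts (out of 50 per age group) to percentages.
# Counts are taken from the study results described above; the grouping
# labels are illustrative shorthand, not the questionnaire's own labels.
GROUP_SIZE = 150 // 3  # 150 participants divided evenly into three age groups

counts = {
    "Q21 (oldest-old)": 48,
    "Q18 (oldest-old)": 47,
    "Q18 (middle-aged)": 45,
}

for label, n_correct in counts.items():
    pct = 100 * n_correct / GROUP_SIZE
    print(f"{label}: {n_correct}/{GROUP_SIZE} = {pct:.0f}% correct")
```

So, for example, 48 of 50 correct is 96%, well above the oldest-old group’s 65% average across the whole questionnaire.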

But let’s go back to my earlier comment that those who believe poorer memory is inevitable with age give more weight to their failures while being less inclined to deal with them. This study did indeed find that changes in memory test performance over five years were correlated with subjective memory complaints, but not with use of external aids. That is, people who were forgetting more, and noticing that they were forgetting more, did not engage in greater use of strategies that would help them remember.

Something to think about!


References

Hawley, K. S., Cherry, K. E., Su, J. L., Chiu, Y.-W., & Jazwinski, M. S. (2006). Knowledge of memory aging in adulthood. International Journal of Aging & Human Development, 63(4), 317-334.

Aging successfully

In a recent news report, I talked about a study of older adults that found that their sense of control over their lives fluctuates significantly over the course of a day, and that this impacts on their cognitive abilities, including reasoning and memory. ‘Sense of control’ — a person’s feeling that they are (or are not) in control of their life — is an attribute that includes perceived competence, as well as locus of control, and in general it tends to decline in older adults. But obviously it is an attribute that, across the board, varies dramatically between individuals.

In older adults, a stronger sense of control is associated with more successful aging, and among people in general, with better cognitive performance. This isn’t surprising, as it is entirely consistent with related associations we have found: between strategy use and cognitive performance; between the belief that intelligence is malleable rather than fixed and cognitive performance.

My point here, however, is the connection between these findings and other aspects of successful aging that impact mental performance.

For example, I have spoken before about the association between age-related hearing loss and cognitive impairment (see this recent New York Times blog post for a very nice report on this), and poor vision and cognitive impairment.

Similarly, high blood pressure, diabetes, and depression have all been implicated in age-related cognitive decline and dementia. (For more on these, see the topic collection on diabetes, the topic collection on depression, and the new topic collection on hypertension.)

Depression, and poorer hearing and vision, are aspects of health and well-being that many seniors ignore, regarding them as no more than can be expected in old age. But however inevitable their occurrence may be, these conditions should not be regarded as untreatable, and seniors and their loved ones (and anyone with a duty of care) should be aware that the consequences of letting them go untreated may well be more serious than they imagine.

Hypertension and diabetes, too, are medical problems that often go untreated. These problems often begin in middle age, and again, people are often unaware that their procrastination or denial may have serious implications further down the line. There is growing evidence that the roots of cognitive decline and dementia lie in your lifestyle over your lifetime, and in middle age especially.

Similarly, chronic stress may not only impair your mental performance at the time, but have long-term implications for your mental health in later old age. It is therefore an important problem to recognize and do something about for long-term health as well as present happiness. Scientific American has a self-assessment tool to help you recognize how much stress you are experiencing.

What does all this have to do with the sense of control association? Well, it seems to me that people who feel in control of their lives will be more likely to take action to deal with any of these problems, while those who don’t feel in control will tend not to. In failing to act, they give up still more control, making their beliefs about the perils of aging a self-fulfilling prophecy.

A final note: my talk of treatment should not be taken as advocating a medicalized view of aging. Another aspect of aging and cognition is the widespread use of drugs among older adults. In the U.S., it’s reported that over 40% of those over 65 take five or more medications, and each year about one-third of them experience a serious adverse effect. You can read more about this in this New York Times blog article.

Hypertension, diabetes, depression, and stress are all problems that are amenable to a range of treatments, of which I personally would put drugs last.

But my point here is not to advocate specific treatments! I am a cognitive psychologist, not a medical doctor. All I wish to do in this post is provide a warning and some resources.

Building Cognitive Reserve

  • Both age-related cognitive decline and brain damage like Alzheimer's can be countered by high levels of cognitive reserve.
  • Cognitive reserve is built throughout your life, but it's never too late to make a difference.
  • You can build cognitive reserve through active learning, intellectual work, being actively bi- or multi-lingual, or regularly engaging in mentally stimulating activities.
  • To maintain (or grow) cognitive abilities, it's important both to resist the brain's tendency to shrink (brain atrophy), and to keep it flexible (neuroplasticity).
  • Brains shrink with disuse, and grow with use.
  • Brains stay plastic through change — in activities, in strategies, in perspective.

Brain autopsies have revealed that a significant number of people die with Alzheimer’s disease evident in their brain, although in life their cognition wasn’t obviously impaired. From this, the idea of a “cognitive reserve” has arisen — the idea that brains with a higher level of neuroplasticity can continue to work apparently normally in the presence of (sometimes quite extensive) brain damage.

A comprehensive review of the research into cognitive reserve, involving 29,000 individuals across 22 studies, concluded that complex mental activity across people’s lives almost halves the risk of dementia. Encouragingly, all the studies also agreed that it was never too late to build cognitive reserve.

As you might expect, the more years of education, the greater the cognitive reserve. But education isn’t the only means of building cognitive reserve. Basically, anything that’s mentally challenging is likely to build reserve. Research supports the following as builders of cognitive reserve:

  • Education
  • Occupational complexity
  • Bilingualism
  • Social engagement
  • Regular cognitive activities, such as reading, writing, attending lectures, doing word games or puzzles, playing games such as bridge or chess.

Will cognitive reserve stop me getting Alzheimer's?

This is not to say that the highly educated will never get Alzheimer’s! Obviously they do. In fact, once those with a high level of cognitive reserve begin to show signs of the disease, they are likely to decline faster. This isn’t surprising when you consider it, because the physical damage is so much greater by the time it becomes observable in behavior.

The point of having cognitive reserve is not to prevent Alzheimer’s, in the sense of “it’ll never happen”. When we talk about “preventing” Alzheimer’s, we're really talking about delaying it. The trick is to delay it so much that you're dead before it happens!

So, cognitive reserve is desirable because it protects you against the damage that may be occurring in your brain. If you’re lucky, it’ll protect you long enough to see you through your life.

Brains are plastic, all through life

Cognitive reserve is weighted toward the past — how much you’ve built up over your lifetime — but you shouldn’t ever forget that it’s an ongoing issue. If you stop all activities that reinforce neuroplasticity, your brain is likely to enter a downward spiral, with physical deterioration resulting from and feeding into a deterioration in your motor, sensory, and cognitive systems.

As the popular mantra has it: Use it or lose it.

It’s the opposite face of expertise. You know how top musicians continue to practice every day. Although they have tens of thousands of hours of practice under their belts, although they have reached the highest level of performance, they cannot afford to stop. This isn’t simply about improving; it’s about maintaining their level of expertise. As soon as you stop, your performance starts to deteriorate.

Of course, if an expert stops working in her area of expertise, she will still maintain abilities that are far above ‘normal’. But the point is that you can’t maintain the same level of performance without working at it.

This is true at every level. If you haven’t ridden a bike for twenty years, you’re not going to leap on it and be as good as you were twenty years ago. If you haven’t spoken your native language in twenty years, you’re not going to suddenly get into a conversation in it with all the fluency you once had.

If you stop paying attention to taste, your appreciation of taste will dull (you’re not interested, why should your brain bother putting energy into it?). If you stop trying to distinguish what people are saying, you’ll become less able to distinguish words. If you stop walking outside the house, you’ll become less capable of movement. If you stop thinking, you’ll become less able to think.

If you just do the same things over and over again, giving your brain no reason to make or reinforce or prune connections, then it won’t bother doing any of that. Why should it? Brains are energy-hounds. If you don’t want to expend the energy making it work, it’s going to sit back and let itself shrink.

Maintaining cognitive abilities as you age begins with attitude

Recent evidence suggests that being cognitively active in middle and old age may help you develop new networks when existing networks start to fail. This is consistent with evidence that older adults who maintain their cognitive abilities do so by developing new strategies that involve different regions.

In other words, if you start to have difficulties with anything, your best strategy is not to give up, but to actively explore new ways of doing it.

So, we should be aiming for two things in preventing cognitive decline. The first is ‘growing’ brain tissue: making new neurons and new connections, to counteract the shrinkage (brain atrophy) that tends to occur with age.

The second concerns flexibility. Retaining the brain’s plasticity is a vital part of fighting cognitive decline, even more vital, perhaps, than retaining brain tissue. To keep this plasticity, we need to keep the brain changing.

Here’s a question we don’t yet know the answer to: how much age-related cognitive decline is down to people steadily experiencing fewer and fewer novel events, learning less, thinking fewer new thoughts?

But we do know it matters.

What activities help build cognitive reserve?

Research hasn't systematically compared different activities to find out which are better, but the general message is that any activity that engages your mind is good. But the degree of challenge does make a difference.

One small study involving older adults found that those who were randomly assigned to a "high-challenge" group showed significantly more cognitive improvement and more efficient brain activity than those assigned to the "low-challenge" group. Moreover, even within the high-challenge group, those who spent more time on the activities showed the greatest improvements.

The high-challenge group spent at least 15 hours a week for 14 weeks learning progressively more difficult skills in digital photography, quilting, or a combination of both. The low-challenge group met to socialize and engage in activities related to subjects such as travel and cooking. A control group engaged in low-demand cognitive tasks such as listening to music, playing simple games, or watching classic movies.
