Intelligence isn’t as important as you think

Our society gives a lot of weight to intelligence. Academics may have been arguing for a hundred years over what, exactly, intelligence is, but ‘everyone knows’ what it means to be smart, and who is smart and who is not — right?

Of course, it’s not that simple, and the ins and outs of academic research have much to teach us about the nature of intelligence and its importance, even if they still haven’t got it all totally sorted yet. Today I want to talk about one particular aspect: how important intelligence is in academic success.

First of all, to simplify the discussion, let’s start by pretending that intelligence equals “g” and is measured by IQ testing. (“g” stands for “general factor”, and reflects the shared element between multiple cognitive tests. It is a product of a statistical technique known as factor analysis, which analyzes the inter-correlations between scores on various cognitive tasks. It is no surprise to any of us that cognitive tasks should be correlated — that people who do well on one task are likely to do well on others, while people who do poorly on one are likely to perform poorly on others. No surprise, either, that some cognitive tasks will be more highly correlated than others. But here’s the thing: the g factor, while it explains a lot of the individual differences in performance on an IQ test, accounts for performance on some of the component sub-tests better than others. In other words, g is more important for some cognitive tasks than others. Again, not terribly unexpected. Some tasks are going to require more ‘intelligence’ than others. One way of describing these tasks is to say that they are cognitively more complex. In the context of the IQ test, the sub-tests each have a different “g-loading”.)
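To make the idea of g and g-loadings a little more concrete, here is a minimal sketch in Python (with illustrative loadings I’ve made up; it is not taken from any of the studies discussed here) showing how a shared general factor shows up in the inter-correlations between sub-tests, using the first principal component as a crude stand-in for a proper factor analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical g-loadings for five sub-tests: some tasks are more
# "g-loaded" (cognitively complex) than others.
loadings = np.array([0.8, 0.7, 0.6, 0.5, 0.3])

g = rng.standard_normal(n)                      # each person's general factor
unique = rng.standard_normal((n, 5))            # task-specific ability and noise
scores = g[:, None] * loadings + unique * np.sqrt(1 - loadings**2)

corr = np.corrcoef(scores, rowvar=False)        # inter-correlations of sub-tests

# The first principal component of the correlation matrix acts as a rough
# g factor: its loadings come out highest for the most g-loaded sub-tests
# (a real factor analysis refines this picture).
eigvals, eigvecs = np.linalg.eigh(corr)
first_pc = np.abs(eigvecs[:, -1]) * np.sqrt(eigvals[-1])
print(np.round(first_pc, 2))
```

The point of the sketch is simply that when every sub-test shares a common ingredient, that ingredient can be recovered from the pattern of correlations, and it matters more for some sub-tests than for others.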

Now there is no doubting that IQ is a good predictor of academic performance, but what does that mean exactly? How good is ‘good’? Well, according to Flynn, IQ tests that are heavily loaded on g reliably predict about 25% of the variance in academic achievement, which corresponds to a correlation of about 0.5 (note that this is about variance, that is, the differences between people; it is not the same as saying that IQ accounts for a quarter of academic performance). But this does vary significantly depending on age and population — for example, in a group of graduate students, the relative importance of other factors will be greater than it is in a cross-section of ten-year-olds. In the study I will discuss later, the figure cited is closer to 17%.

Regardless of whether it’s as much as 25% or as little as 17%, I suspect these figures are much smaller than most people would imagine, given the weight that we give to intelligence.

So what are the other factors behind doing well at school (and, later, at work)?

The most obvious one is effort. One way to measure how hard people work is through the personality dimension of Conscientiousness.

One study involving 247 British university students compared the predictive power of the “Big Five” personality traits (Neuroticism, Extraversion, Openness to Experience, Agreeableness, Conscientiousness) on later exam performance, and found that Conscientiousness was the only trait to have a significant positive effect. Illuminatingly, of Conscientiousness’s components (Competence, Order, Dutifulness, Achievement striving, Self-discipline, Deliberation), only Dutifulness, Achievement striving, and (to a lesser extent) Self-discipline had significant effects.

There were also smaller and less reliable negative effects of Neuroticism and Extraversion. The problems here came mainly from Anxiety and Impulsiveness (facets of Neuroticism), and from Gregariousness and Activity (facets of Extraversion).

Overall, Dutifulness, Achievement striving, and Activity accounted for 28% of the variance in overall exam grades (over the three years of their undergraduate degrees).

But note that these students were highly selected — undergraduates were (at this point in time) accepted to University College London at an application-to-acceptance ratio of 12:1 — so IQ is going to be a less important source of individual differences in this group.
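This is the classic ‘restriction of range’ effect. Here’s a toy simulation (the numbers are purely illustrative, not taken from the study) of how strongly selecting a group on ability shrinks the apparent correlation between IQ and grades within that group:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Suppose a population correlation of about 0.5 between IQ and grades.
r = 0.5
iq = rng.standard_normal(n)
grades = r * iq + np.sqrt(1 - r**2) * rng.standard_normal(n)

full_r = np.corrcoef(iq, grades)[0, 1]

# Keep only roughly the top twelfth on IQ (mimicking a 12:1 selection ratio,
# and assuming -- a simplification -- that selection tracks IQ closely).
selected = iq > np.quantile(iq, 1 - 1 / 12)
restricted_r = np.corrcoef(iq[selected], grades[selected])[0, 1]

print(f"correlation in the full population: {full_r:.2f}")
print(f"correlation among the selected:     {restricted_r:.2f}")
```

Within the selected group the correlation, and hence the share of variance IQ explains, drops considerably, which is why personality and effort loom larger in samples like this one.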

In another study by some of the same researchers, 80 sixth-formers (equivalent to grade 10) were given both personality and intelligence tests. Conscientiousness and Openness to Experience were found to account for 13% of unique variance in academic performance, and intelligence for 10%. Interestingly, there were subject differences. Intelligence was more important than personality for science subjects (including math), while the reverse was true for English (language and literature).

The so-called Big Five personality dimensions are well-established, but recently a new model has introduced a sixth dimension: Honesty-Humility. Unexpectedly (to me at least), a recent study showed this dimension also has some implications for academic performance.

The first experiment in this study involved 226 undergraduate students from a School of Higher Education in the Netherlands. Both Conscientiousness and Honesty-Humility were significantly and positively correlated with grade point average (with Conscientiousness having the greater effect). All the components of Conscientiousness (in this model, Organization, Diligence, Perfectionism, Prudence) were significantly related to GPA. Three of the four components of Honesty-Humility (Greed Avoidance, Modesty, Fairness) were significantly related to GPA (listed in order of effect size).

In the second experiment, a wider data-set was used. 1262 students from the same school were given the Multicultural Personality Test—Big Six, which measures Emotional Stability, Conscientiousness, Extraversion, Agreeableness, Openness, and Integrity (a similar construct to Honesty-Humility, involving the facets Honesty, Sincerity, Greed Avoidance). Again, Conscientiousness and Integrity showed significant positive correlations with GPA. In this case, Conscientiousness was divided into Need for Rules and Certainty, Orderliness, Perseverance, and Achievement Motivation — all of which were separately significant predictors of GPA. For Integrity, Greed Avoidance produced the largest effect, with Honesty having a smaller but still highly significant effect, while Sincerity was of more marginal significance.

In summary, the traits most strongly associated with academic performance were Diligence, Achievement Motivation, Need for Rules and Certainty, Greed Avoidance, and Modesty.

Of course, one flaw in personality tests is that they rely on self-reports. A much-discussed longitudinal study of eighth-graders found that self-discipline accounted for more than twice as much variance as IQ in final grades. Moreover, self-discipline also predicted which students would improve their grades over the course of the year, which IQ didn’t.

Again, however, it should be noted that this is a selected group — the students came from a magnet public school in which students were admitted on the basis of their grades and test scores.

This study measured self-discipline not only by self-report, but also by parent report, teacher report, and monetary choice questionnaires (in an initial experiment involving 140 students), with a behavioral delay-of-gratification task and a questionnaire on study habits added in a replication involving 164 students.

One personality trait that many have thought should be a factor in academic achievement is Openness to Experience, and indeed, in some experiments it has been so. It may be that Openness to Experience, which includes Fantasy (vivid imagination), Aesthetic Sensitivity, Attentiveness to Inner Feelings, Actions (engagement in novel activities), Ideas, and Values (readiness to reexamine traditional values), is associated with higher intelligence but not necessarily academic success (depending perhaps on subject?).

It may also be that, as with Neuroticism, Extraversion, and Conscientiousness, only some (or even one) of the component traits is relevant to academic performance. The obvious candidate is Ideas, described as the tendency to be intellectually curious and open to new ideas. Supporting this notion, recent research provides evidence that Openness incorporates two related but distinct factors: Intellect (Ideas) and Openness (artistic and contemplative qualities, embodied in Fantasy, Aesthetics, Feelings, and Actions), with Values a distinct marker belonging to neither camp.

A recent meta-analysis, gathering data from studies that have employed the Typical Intellectual Engagement (TIE) scale (as a widely-used proxy for intellectual curiosity), has found that curiosity had as large an effect on academic performance as conscientiousness, and together, conscientiousness and curiosity had as big an effect on performance as intelligence.

Of course, while research has shown (not unexpectedly) that Conscientiousness and Intelligence are quite independent, the correlation between Intelligence and Curiosity is surely significant. In fact, this study found significant correlations between TIE and Intelligence, and between TIE and Conscientiousness. Nevertheless, the best-fit model indicated that all three factors were direct predictors of academic performance.

More to the point, these three important attributes all together still accounted for only a quarter of the variance in academic performance.
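As a rough illustration of how three inter-correlated attributes can each be a genuine, direct predictor and yet jointly explain only about a quarter of the variance, here’s a sketch with made-up effect sizes (these are not the meta-analysis’s values, and the structure is deliberately simplified):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Three predictors: intelligence and conscientiousness uncorrelated with
# each other, curiosity (TIE) modestly correlated with both (illustrative).
cov = np.array([[1.0, 0.0, 0.3],
                [0.0, 1.0, 0.3],
                [0.3, 0.3, 1.0]])
intelligence, conscientiousness, curiosity = rng.multivariate_normal(
    np.zeros(3), cov, size=n).T

# Achievement depends directly on all three, plus a lot of everything else.
achievement = (0.30 * intelligence + 0.25 * conscientiousness
               + 0.20 * curiosity + 0.85 * rng.standard_normal(n))

X = np.column_stack([intelligence, conscientiousness, curiosity, np.ones(n)])
beta, *_ = np.linalg.lstsq(X, achievement, rcond=None)
residuals = achievement - X @ beta
r_squared = 1 - np.var(residuals) / np.var(achievement)
print(f"R^2 from all three predictors together: {r_squared:.2f}")  # about 0.26
```

The exact figure depends entirely on the numbers chosen; the sketch is only meant to show that ‘three direct predictors’ and ‘only a quarter of the variance’ are perfectly compatible.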

Regardless of the precise numbers (this area of study depends on complex statistical techniques, and I wouldn’t want to rest any case on any specific figure!), it is clear from the wealth of research (which I have barely touched on) that although intelligence is an important attribute in determining success in the classroom and in employment, it is only one among a number of important attributes. The same is true of diligence. Perhaps we should spend less time praising intelligence and hard work, and more time encouraging engagement and curiosity, and a lack of interest in luxury goods and high social status.

 

Read more about the curiosity study at https://medicalxpress.com/news/2011-10-curiosity-doesnt-student.html

References

Chamorro-Premuzic, T., & Furnham, A. (2003). Personality traits and academic examination performance. European Journal of Personality, 17(3), 237-250. doi:10.1002/per.473

Duckworth, A. L., & Seligman, M. E. P. (2005). Self-discipline outdoes IQ in predicting academic performance of adolescents. Psychological Science, 16(12), 939-944. doi:10.1111/j.1467-9280.2005.01641.x

Furnham, A., & Chamorro-Premuzic, T. (2005). Personality and intelligence: Gender, the Big Five, self-estimated and psychometric intelligence. International Journal of Selection and Assessment, 13(1), 11-24.

Furnham, A., Rinaldelli-Tabaton, E., & Chamorro-Premuzic, T. (2011). Personality and Intelligence Predict Arts and Science School Results in 16 Year Olds. Psychologia, 54(1), 39-51.

von Stumm, S., Hell, B., & Chamorro-Premuzic, T. (2011). The Hungry Mind. Perspectives on Psychological Science, 6(6), 574-588.

Eating right for your brain

Although I’m a cognitive psychologist and consequently think that memory and cognition are mostly about your mastery of effective strategies, when it comes to age-related cognitive decline, I’m a big believer in the importance of diet and exercise. But while we know these things can play an important role in why some people develop cognitive impairment and even dementia as they age, and others don’t, we don’t yet know with any great certainty exactly which exercise programs would be the best use of our time, and which diet would have the most benefit.

The role of diet in fighting age-related cognitive decline is quite complex. Many older people have inadequate diets, partly no doubt because of shrinking appetite and perhaps the dulling of taste and smell. It seems to me, for example (and this is purely a casual observation), that sweet foods tend to be appreciated more by the elderly, while other flavors are appreciated less. The problem with a shrinking appetite is that it becomes even more vital, if the quantity of food is much reduced, that the nutritional quality is good. The less you eat, the less you can afford to eat “empty calories”. Everything must count.

Other factors concern the need to fight declining physical health. Cardiovascular problems, cholesterol problems, blood pressure problems, inflammation — all these have been implicated in contributing to cognitive decline. Therefore any diet that helps you fight these problems is also helping you fight cognitive decline.

A recent Swedish study tackled the inflammation problem. The study, involving 44 overweight people aged 50-75, found that after four weeks eating foods presumed to reduce low-grade inflammation, bad (LDL) cholesterol was reduced by 33%, blood triglycerides by 14%, blood pressure by 8% and a risk marker for blood clots by 26%. Memory and cognitive function were also improved (but no details on that were reported, and at present it appears only a press release is available — no academic paper).

The diet was high in antioxidants, low-GI foods (i.e. slow release carbohydrates), omega fatty acids, wholegrain products, probiotics and viscous dietary fibre. Examples of foods eaten were oily fish, barley, soy protein, blueberries, almonds, cinnamon, vinegar and a certain type of wholegrain bread. Some of the products are not yet available in the shops, but were developed specifically for the study.

Another study, involving 712 New Yorkers, found that those who most closely followed a Mediterranean-like diet over a six-year period were 36% less likely to have brain infarcts compared to those who followed the diet least. Such a diet has also been associated with a lower risk of Alzheimer's disease.

The Mediterranean diet includes high intake of vegetables, legumes, fruits, cereals, fish and monounsaturated fatty acids such as olive oil; low intake of saturated fatty acids, dairy products, meat and poultry; and mild to moderate amounts of alcohol.

And an 11-year study of over 3800 seniors found that those who adhered more closely to an anti-hypertension diet (DASH) maintained their cognitive performance better over time, and that this appeared due to intake of four food groups: vegetables, whole grains, low-fat dairy, and nuts/legumes.

Other studies have pointed to the importance of maintaining steady blood sugar levels. (These studies, with the exception of the Swedish study, are all ones that have been previously reported on this site.)

We can be fairly sure that fighting inflammation, hypertension, and so on helps us fend off cognitive decline and impairment in our senior years. We can also be reasonably sure that fruit and vegetables are good for us. No one’s arguing much about fish either (although you do have to consider the toxicity of the fish, especially its mercury load). The ground gets messier, however, when it comes to carbohydrates, sugar, fat, protein, and dairy.

Recently I read a very interesting article reviewing a new book called Good Calories, Bad Calories. In this book, the author apparently “dispels nearly every belief doctors and the public health community hold to be true about nutrition and health”. According to the blogger, “It would be easy to dismiss his claims, except that he makes his case not with theories and conjectures, but through a meticulous review of the nutrition and medical literature going back a hundred years.” Moreover, the claims do help explain some of the more puzzling quandaries about the rise of obesity.

They also, I have to say, fit in with my own experience.

The basic tenet of the book is that it is carbohydrates, and most especially refined carbohydrates, that are to blame for our current epidemics of obesity, diabetes, coronary heart disease, and even cancer. We should avoid anything made with flour, cereals, potatoes, and anything with a lot of sugar (bananas, I’m afraid, are also a no-no). We don’t, on the other hand, need to worry about meat, dairy, or fat.

This is, in fact, exactly what I have found in my own struggles with weight (although of course my reason for discussing this here is not weight per se but more fundamental physical problems). When my weight climbed to what I regarded as appalling levels, I lost the desired 20kg through a rigorous low-carbohydrate diet (although my reasons actually had more to do with trying to work through my food sensitivities). And when I say low-carbohydrate, I was actually living mainly on fruit and vegetables. I did find, after a while, that the lack of carbohydrate created an energy problem, but a quarter-cup (uncooked) of brown rice every day fixed that. When, after a couple of years, I loosened up on my diet, having some bread (gluten-free; yeast-free!), the occasional bit of baking, the occasional small bit of potato … well, my weight immediately started climbing again. I complain that I only have to look at some baking to add weight!

I’m fully conscious that this wouldn’t be everyone’s experience — I live with three males, all of whom are the tall, lean type, who can eat vast quantities of baking without it apparently having any effect. But this is my point. I think the author of this book makes some good points about the difficulties of diet research, and he may well be right in his recommendations. But even when we get to the point when we can be certain of what is a “healthy diet”, it’s still not going to be true for everyone.

So my advice to individuals is not to take the disputes among health and nutrition experts as an excuse for eating what you like, but to treat them instead as a basis for exploration. Look at the various diets for which there is some evidence, and work out which ones work for you. That will depend not only on your genetic makeup, but most particularly on the damage you’ve already done to your body (not pointing a finger! We’ve all damaged our bodies just by living). As a reminder of this, I was interested to read an article in the New York Times on the high-fat diet recommended for epileptics.

Stretching your mind

I recently reported on a finding that older adults whose life-space narrowed to their immediate home were significantly more likely to have a faster rate of global cognitive decline or develop mild cognitive impairment or Alzheimer’s.

Now there are some obvious correlates of being house-bound vs feeling able to travel out of town (such as physical disability), but this relationship between cognitive decline and confined life-space remained after such factors were taken into account. The association is thought to be related to social and mental stimulation.

But I think this association also points to something more specific: the importance of distance, and difference. Different ways of thinking; different contexts. Information (in the broadest sense of the word) that stretches your mind, that gets you out of the grooves of your familiar thoughts.

Last year I reported on a study looking at creativity in problem-solving. That study found that multicultural experiences help you become more creative in solving problems. In particular, creativity was best helped by being reminded of what you’d learned about the underlying meaning or function of behaviors in the multicultural context. In other words, what was important was truly trying to understand behavior that’s very different from your own.

While travelling undoubtedly helps, you don’t need to go to a distant place to learn about different cultures. You can read about them; you can watch movies; you can listen to other people talk about what they know. And if you have those experiences, you can then think about them at any time.

A vital tool in tackling cognitive decline in old age (including the more extreme events of mild cognitive impairment and dementia) is cognitive reserve. Cognitive reserve means that your brain can take more damage before it has noticeable effects. Many people have died with advanced Alzheimer’s pathology in their brain who showed no signs of dementia in life!

Cognitive reserve is most often associated with education, but it is also associated with occupation, bilingualism, and perhaps even music. What it comes down to is this: the more redundancy in your brain, the wider and denser the networks, the more able your brain will be to find new paths for old actions, if the old paths are damaged.

The finding that life-space can affect cognitive decline is also a reminder that we are minds in bodies. I have reported on a number of examples of what is called embodied cognition (the benefits of gesture for memory are one example of this). It’s a good general principle to bear in mind — if you fake enjoyment, you may well come to feel it; if you look at the distant hills or over the sea, your mind may think distant thoughts; if you write out your worries, the weight of them on your mind may well lighten.

I made reference to bilingualism. There have been several studies now that point to the long-term benefits of bilingualism for fighting cognitive decline and dementia. But if you are monolingual, don’t despair. You may never achieve the fluency with another language that you would have if you’d learned it earlier in life, but it’s never too late to gain some benefit! If you feel that learning a new language is beyond you, then you’re thinking of it in the wrong way.

Learning a language is not an either-or task; you don’t have to achieve near-native fluency for there to be a point. If there’s a language you’ve always yearned to know, or a culture you’ve always been interested in, dabble. There are so many resources on the Web nowadays; there has never been a better time to learn a language! You could dabble in a language because you’re interested in a culture, or you could enhance your language learning by learning a little about an associated culture.

And don’t forget that music and math are languages too. It may be too late to become a cello virtuoso, but it’s never too late to learn a musical instrument for your own pleasure. Or if that’s not to your taste, take a music appreciation class, and enrich your understanding of the language of music.

Similarly with math: there’s a thriving little world of “math for fun” out there. Go beyond Sudoku to the world of math puzzles and games and quirky facts.

Perhaps even dance should be included in this. I have heard dance described as a language, and there has been some suggestion that dancing is a physical pursuit of particular cognitive benefit for older adults.

This is not simply about ‘stimulation’. It’s about making new and flexible networks. Remember my recent report on learning speed and flexible networks? The fastest learners were those whose brains showed more flexibility during learning, with different areas of the brain being linked with different regions at different times. The key to that, I suggest, is learning and thinking about things that require your brain to forge many new paths, with speed and distance being positive attributes that you should seek out (music and dance for speed, perhaps; languages and travel for distance).

Interestingly, research into brain development has found that, as a child grows to adulthood, the brain switches from an organization built on local networks of physically proximate regions to one built on long-distance networks based on functionality. It would be interesting to know if seniors with cognitive impairment show a shrinking in their networks. Research has shown that the aging brain does tend to show reduced functional connectivity in certain high-level networks, and this connectivity can be improved with regular aerobic exercise, leading to cognitive improvement.

Don’t disdain the benefits of simply daydreaming in your armchair! Daydreaming has been found to activate areas of the brain associated with complex problem-solving, and it’s been speculated that mind wandering evokes a unique mental state that allows otherwise opposing networks to work in cooperation. Daydreaming about a more distant place has also been found to impair memory for recently learned words more than if the daydreaming concerned a closer place — a context effect that demonstrates that you can create distance effects in the privacy of your own mind, without having to venture to distant lands.

I’m not saying that such daydreaming has all the benefits of actually going forth and meeting people, seeing new sights. Watching someone practice helps you learn a skill, but it’s not as good as practicing yourself. But the point is, whatever your circumstances, there is plenty you can do to stretch your mind. Why not find yourself a travel book, and get started!


Even mild head injuries can seriously affect the brain

Traumatic brain injury is the biggest killer of young adults and children in the U.S., and in a year more Americans suffer a TBI than are diagnosed with breast, lung, prostate, brain and colon cancer combined. There are many causes of TBI, but one of the more preventable is sports concussion.

This week Pennsylvania became the 35th state in the U.S. to have a youth-concussion law. Since I recently uploaded a topic collection on TBI (traumatic brain injury), this seems an appropriate time to talk a little about sports concussions and their possible long-term repercussions.

In 2009, a study commissioned by the National Football League reported that Alzheimer’s disease or similar memory-related diseases had been diagnosed in the league’s former players dramatically more often than in the national population: five times the national average among those 50 and older (6.1%) and 19 times for those aged 30 through 49.

This follows a 2005 study that found retired National Football League players had a 37% higher risk of Alzheimer's than other U.S. males of the same age. Those who had experienced three or more concussions had a five-fold greater chance of having been diagnosed with mild cognitive impairment and a three-fold prevalence of reported significant memory problems compared to those players without a history of concussion.

Most recently, a follow-up of nearly 4,000 retired National Football League players surveyed in 2001 found that 35% appeared to have significant cognitive problems. When 41 of them were tested, they were found to have mild cognitive impairment resembling that of a comparison group of much older patients from the general population.

Now, you might (if you’re a parent) console yourself with the thought that professional football players are likely to be involved in much greater impacts than those suffered by your child on the sports field. But unfortunately there is growing evidence that even mild concussions can produce long-lasting trauma to the brain.

For example, monitoring of 11 high school football players found that some players who hadn't been diagnosed with concussions nevertheless had developed changes in brain function following head impacts, and these changes correlated with cognitive impairment. Brain scans have also revealed abnormalities in white matter at all levels of severity in traumatic brain injury, even in those who had minimal or no loss of consciousness, and those with no self-reported cognitive deficit. And analysis of medical records on over 280,000 older U.S. military veterans found that severity of brain injury made no difference to the increased likelihood that they would develop dementia.

Not all impacts are equally bad. There’s some evidence that the area to the top and front of the head (just above the dorsolateral prefrontal cortex) is particularly vulnerable.

Another danger sign is headaches. A study found that young athletes who experienced migraine headache symptoms (even one week after concussion) were likely to have increased cognitive impairment, and shouldn’t return to play before the headache resolves.

Children exposed to lead early in life might also be especially vulnerable to the effects of head injury. Rat studies have found that young rats exposed to low levels of lead don’t recover from brain injury as well as those not so exposed.

Head trauma shouldn’t be accepted fatalistically. There are actions you can take to ameliorate its effects (if you don’t want to remove yourself from the risky situations). What these findings emphasize is the importance of treating even mild head injuries, of giving your brain time to repair itself, and of following a regime designed to mitigate damage: exercising, eating a healthy diet, reducing stress, and so on.

My recent report on transient global amnesia demonstrates the incredible ability of the brain to repair itself — but it must be given time to do so before subjecting it to more trauma. According to a leading tracker of youth sports injuries, returning to play too soon occurs in roughly 40% of sports-related concussions among student football players.

The three main provisions of Washington state's Zackery Lystedt law, considered by the National Football League to be model youth-concussion legislation, are:

  • a student-athlete's parent or guardian must sign a concussion-awareness information form before the student-athlete is eligible to participate in school athletics;
  • any student-athlete suspected of a concussion must immediately be removed from play;
  • any student-athlete who has a concussion must obtain medical clearance before being allowed to return to practice or competition.

Many states also require some form of concussion training for coaches.

References

References (and more details) for the studies I have mentioned can be found in my topic collection on TBI.

The fallibility of human memory

I don't often talk about eyewitness testimony, but that's not because of any lack of research. It's a big field, with a lot of research done. When I say I don't follow it because I regard the main finding as a done deal - eyewitness testimony is useless - that's not meant to denigrate the work being done. There is, clearly, a great deal of value in working out the exact parameters of human failures, and in working out how we can improve eyewitness testimony. I just arbitrarily decided to ignore this area of research until they'd sorted it all out! (I can't follow everything, I'm swamped as it is!)

Nevertheless, I do want to remark on a recent report in The Scientist, to the effect that a New Jersey court has decreed that all juries must be informed of the unreliability of eyewitness testimony. I want to raise a hearty cheer. I regard it as practically criminal that eyewitness testimony is given the weight it is. I think everyone should be taught, from a young age, that memory is completely unreliable. And, in particular, that the certainty you hold in any specific memory, and the vividness it has, are not nearly as good proofs of the accuracy of the memory as we tend to believe.

You may think a belief in the fallibility of memory would create an unpleasant state of uncertainty, but I believe it would bring about a useful decline in many individuals' dogmatic certainty, and encourage more empathy with other, fallible human beings.

You may ask how my emphasis on the fallibility of human memory squares with my frequent comments on the danger of believing that you have a bad memory or that your memory will inevitably get worse as you age. But believing in human fallibility is very different from believing you personally have a bad or deteriorating memory. You need to find a nice balance between these beliefs, and part of achieving that lies in understanding how memory works and what aspects are more reliable and which less. I hope my site helps you with that!

Normal is a label too

We all like simple solutions. However much we may believe we are ‘above’ black-and-white dichotomies, and that of course we understand that every situation is complex, we nevertheless have a brain that can only hold a very few things in mind at once. So it's unsurprising that we are drawn to solutions that can be summed up simply, that can fit comfortably within the limitations of working memory.

Here’s something I read about in Scientific American the other day: Huntington’s disease — which is a terrible disease that eats away at your brain, causing both physical and cognitive disabilities that continue to deteriorate until the sufferer dies an untimely death — is linked to an excess of a brain chemical (the neurotransmitter glutamate) that is in fact vital for learning and memory. Intriguingly, a recent study has found that those with the genetic mutation for this disease, but who were as yet asymptomatic, were significantly quicker to learn than those without the mutation. Indeed, those with the most extensive mutations (the greatest number of repeats of the mutated DNA sequence) were the fastest to learn.

This may not simply be a matter of disease progression — an earlier study found that Huntington’s patients did better on one cognitive task than healthy controls (detecting whether a tone was long or short). It may be, the researchers suggested, that it is simplistic to talk of a decline in cognitive function in Huntington’s; rather, some functions might be enhanced while others are impaired.

Huntington’s Disease is hardly alone in this. We often talk about ‘normal’ memory aging, and there’s no denying the concept of normal is a useful one — in certain contexts. But not, perhaps, in all those contexts in which it is used.

Psychology, as I’ve mentioned before, has historically been a search for what is ‘normal’ in human behavior. What is not normal is deemed ‘abnormal’ — a nice black & white dichotomy. Of course this is simplistic, but it gives us a place to stand. But now that psychology is a mature discipline, it can look around, explore the variability in human behavior. However, it is only very recently that we have begun to realize that the search for normal, which should be merely a starting point in the question of what it is to be human, has become a straitjacket.

As an example, let’s look briefly at something discussed in a provocative article about autism that appeared in the journal Nature. The writer of the article, Dr. Laurent Mottron, leads a research team that includes several autistic individuals. As a consequence, he has grown to appreciate the strengths that such individuals can bring to research.

The main thrust of his argument is that autism is not simply a “suite of negative characteristics”. There are advantages in some autistic characteristics. But because autism is defined as a ‘disorder’, researchers and clinicians systematically interpret behaviors in terms of impairment and abnormalities. More useful would be to examine each behavior on its own merits, and consider whether the behavior might be adaptive in certain contexts.

Mottron says that although intellectual disability is routinely estimated to affect about 75% of autistics, only 10% of autistics have an accompanying neurological disease that affects intelligence, and if researchers used only those tests that require no verbal explanation, the level of intellectual disability would be seen to be much lower. An interesting comparison: “In measuring the intelligence of a person with a hearing impairment, we wouldn't hesitate to eliminate components of the test that can't be explained using sign language; why shouldn't we do the same for autistics?”

Mottron’s research team has coined a telling word: normocentrism, meaning the preconception that if you do or are something, it is normal, and if autistics do or have it, it is abnormal — I think this term could be usefully applied more widely. Similarly, the rise of the concept of ‘neurodiversity’ in the autistic community (whereby a ‘normal’ person is ‘neurotypical’ and someone with an autism spectrum disorder is ‘neurodiverse’) could also be applied more widely. Rather than distinguishing between two types, we should see human diversity as a spectrum, where ‘neurotypical’ covers a wide middle range, and other ‘disorders’, such as autism, dyslexia, and ADHD, similarly occupy their own ranges along the spectrum.

Because this is the point, this is what research has been revealing over the past few years: there is no such ‘thing’ (as in a single thing) as autism, as dyslexia, as ADHD, as Alzheimer’s. They all have multiple variants — variable characteristics, variable causes — because they reflect subtly different underlying differences in the brain.

Which means we shouldn’t assume that because something has a label (“Alzheimer’s”), there is only one path (and, relatedly, one set of risk factors). For example, we ‘know’ that high blood pressure is bad, and certainly it’s an important risk factor for cardio- and cerebro-vascular disorders (including Alzheimer’s). And yet, according to a recent study, this is not the complete story. For the very elderly (say, 85+), high blood pressure may be a sign of better health. This isn’t just because risk factors are worked out on the basis of group studies while you are an individual (there is always individual variation). It’s also because almost everything has trade-offs. Like the Huntington’s disease gene variant that improves learning. Like the neurodiverse who have exceptional visual skills.

Similarly, just because someone has put a label on you (“dyslexic”), you shouldn’t assume that means that everything you know about dyslexia applies to you. Nor should you assume that there are no positives about your particular brain.

In the same way, you shouldn’t assume that being a ‘genius’, or having a ‘photographic memory’, is all positive. Everything is a trade-off (don’t mistake me, I’m not suggesting that there is something positive about Alzheimer’s! but it may be that humans are vulnerable to Alzheimer’s because of our superior brains, and because we live so long).

The message is, don’t simply fall prey to a label. Think about it. If you or someone you care for has been labeled, focus on the individual manifestations, not the label. The label is a guide — treat it as one. But never forget that each manifestation will have its own particular characteristics, some of which may be positive.

And 'normal' is a label, too. Here's an issue that has only recently been recognized in the cognitive research community: our idea of what is 'normal' is largely based on one cultural group. Most cognitive research has been undertaken on American undergraduate students (according to a recent analysis, 96% of research subjects in a sample of hundreds of psychology studies came from Western industrialized countries, and 68% came specifically from the U.S. — of these, 67% were psychology students). In recent years, it has become evident that WEIRD people (those from Western, Educated, Industrialized, Rich, and Democratic societies) respond significantly differently in a whole lot of domains compared to non-Western groups — even on something as seemingly basic as a visual illusion. (See Scientific American for a nice discussion of this.)

As I said at the beginning, our preference for simple solutions and simple labels is rooted in our limited working memory capacity. The only real way around this is to build up your knowledge piece by piece, so that the items in working memory are merely the tips of richly elaborated items held in long-term memory. That isn't quick or easy, so there'll be many areas in which you don't want to gather such elaborated knowledge. In the absence of being able to stretch the limits of working memory, it helps to at least be aware of what is limiting your thinking.

In other words, as with memory itself, you need to think about your own goals and interests, and choose those that you want to pursue. Becoming expert (or at least, a little bit expert!) in some areas shows you how different your thinking is in those areas; you will then be able to properly appreciate the limitations in your thinking in other areas. That’s not a bad thing! As with memory failures of other kinds, it’s a big step just to be aware of your fallibilities. Better that than to be fooled (as some experts are) into thinking that expert thinking in one area means equally clear thinking in other areas.

We are all fallible. We all forget. We all have false memories and believe in them. We all sometimes fall victim to labels. The trick is to realize our fallibility, and choose the occasions and domains in which to overcome it.

References

Mottron, L. (2011). Changing perceptions: The power of autism. Nature, 479(7371), 33-35.

Is multitasking really a modern-day evil?

In A Prehistory of Ordinary People, anthropologist Monica Smith argues that rather than deploring multitasking, we should celebrate it as the human ability that separates us from other animals.

Her thesis that we owe our success to our ability to juggle multiple competing demands and to pick up and put down the same project until completion certainly makes a good point. Yes, memory and imagination (our ability to project into the future) enable us to remember the tasks we’re in the middle of, and allow us to switch between tasks. And this is undeniably a good thing.

I agree (and I don’t think I have ever denied) that multitasking is not in itself ‘bad’. I don’t think it’s new, either. These are, I would suggest, straw men — but I’m not decrying her raising them. Reports in the media are prone to talking about multitasking as if it is evil and novel, and a symptom of all that is wrong in modern life. It is right to challenge those assumptions.

The problem with multitasking is not that it is inherently evil. The point is to know when to stop.

There are two main dangers with multitasking, which we might term the acute and the chronic. The acute danger is when we multitask while doing something that has the potential to risk our own and others’ safety. Driving a vehicle is the obvious example, and I have reported on many studies over the past few years that demonstrate the relative dangers of different tasks (such as talking on a cellphone) while driving a car. Similarly, interruptions in hospitals increase the probability of clinical errors, some of which can have dire consequences. And of course on a daily level, acute problems can arise when we fail to do one task adequately because we are trying to do other tasks at the same time.

A chronic danger of multitasking that has produced endless articles in recent years is the suggestion that all this technology-driven multitasking is making us incapable of deep thought or focused attention.

But Smith argues that we do not, in fact, engage in levels of multitasking that are that much different from those exhibited in prehistoric times. ‘That much’ is of course the get-out phrase. How much difference is too much? Is there a point at which multitasking is too much, and have we reached it?

These are the real questions, and I don’t think the answer is a simple line we can draw. Research on multitasking while driving has revealed significant differences between drivers, as a function of age, as a function of personal attributes, as a function of emotional or physical state. It has revealed differences between tasks — e.g. talking that involves emotions or decisions is more distracting than less engaging conversation; half-overheard conversations are surprisingly distracting (suggesting that having a passenger in the car talking on a phone may be more distracting than talking on a phone yourself!). These are the sort of things we need to know — not that multitasking is bad, but when it is bad.

This approach applies to the chronic problem also, although it is much more difficult to study. But these are some of the questions we need to know the answers to:

  • Does chronic multitasking affect our long-term ability to concentrate, or only our ability to concentrate while in the multitasking environment?
  • If it does affect our long-term ability to concentrate, can we reverse the effect? If so, how?
  • Is the effect on children and adolescents different from that of adults?
  • Does chronic multitasking produce beneficial cognitive effects? If so, is this of greater benefit for some people rather than others? (For example, multitasking training may benefit older adults)
  • What are the variables in multitasking that affect our cognition in these ways? (For example, the number of tasks being performed simultaneously; the length of time spent on each one before switching; the number of times switching occurs within a defined period; the complexity of the tasks; the ways in which these and other factors might interact with temporary personal variables, such as mood, fatigue, alcohol, and more durable personal variables such as age and personality)

We need to be thinking in terms of multitasking contexts rather than multitasking as one uniform (and negative) behavior. I would be interested to hear your views on multitasking contexts you find beneficial, pleasant or useful, and contexts you find difficult, unpleasant or damaging.

Shaping your cognitive environment for optimal cognition

Humans are the animals that manipulate their cognitive environment.

I reported recently on an intriguing study involving an African people, the Himba. The study found that the Himba, while displaying an admirable amount of focus (in a visual perception task) when living a traditional life, showed the same more diffuse, distractible attention that urbanized Westerners display once they moved to town. On the other hand, digit span (a measure of working memory capacity) was smaller in the traditional Himba than it was in the urbanized Himba.

This is fascinating, because working memory capacity has proved remarkably resistant to training. Yes, we can improve performance on specific tasks, but it has proven more difficult to improve the general, more fundamental, working memory capacity.

However, there have been two areas where more success has been found. One is the area of ADHD, where training has appeared to be more successful. The other is an area no one thinks of in this connection, because no one thinks of it in terms of training, but rather in terms of development — the increase in WMC with age. So, for example, average WMC increases from 4 chunks at age 4, to 5 at age 7, 6 at age 10, to 7 at age 16. It starts to decrease again in old age. (Readers familiar with my work will note that these numbers are higher than the numbers we now tend to quote for WMC — these numbers reflect the ‘magic number 7’, i.e. the number of chunks we can hold when we are given the opportunity to actively maintain them.)

Relatedly, there is the Flynn effect. The Flynn effect is ostensibly about IQ (specifically, the rise in average IQ over time), but IQ has a large WM component. Having said that, when you break IQ tests into their sub-components and look at their change over time, you find that the Digit Span subtest is one component that has made almost no gain since 1972.

But of course 1972 is still very modern! There is no doubt that there are severe constraints on how much WMC can increase, so it’s reasonable to assume we hit the ceiling long ago (speaking of urbanized Western society as a group, not individuals).

It’s also reasonable to assume that WMC is affected by purely physiological factors involving connectivity, processing speed and white matter integrity — hence at least some of the age effect. But does it account for all of it?

What the Himba study suggests (and I do acknowledge that we need more and extended studies before taking these results as gospel) is that urbanization provides an environment that encourages us to use our working memory to its capacity. Urbanization provides a cognitively challenging environment. Our focus is diffused for that same reason — new information is the norm, rather than the exception; we cannot focus on one bit of it unless it is of such threat or interest that it justifies the risk of ignoring everything else.

ADHD shows us, perhaps, what can happen when this process is taken to the extreme. So we might take these three groups (traditional Himba, urbanized Himba, individuals with ADHD) as points on the same continuum. The continuum reflects degree of focus, and the groups reflect environmental effects. This is not to say that there are not physiological factors predisposing some individuals to react in such a way to the environment! But the putative effects of training on ADHD individuals points, surely, to the influence of the environment.

Age provides an intriguing paradox, because as we get older, two things tend to happen: we have a much wider knowledge base, meaning that less information is new, and we usually shrink our environment, meaning again that less information is new. All things being equal, you would think that would mean our focus could afford to draw in. However, as my attentive readers will know, declining cognitive capacity in old age is marked by increasing difficulty in ignoring distraction. In other words, it’s the urbanization effect writ large.

How to account for this paradox?

Perhaps it simply reflects the fact that the modern environment is so cognitively demanding that these factors aren’t sufficient on their own to enable us to relax our alertness and tighten our focus, in the face of the slowdown in processing speed that typically occurs with age (there’s some evidence that it is this slowdown that makes it harder for older adults to suppress distracting information). Perhaps the problem is not simply, or even principally, the complexity of our environment, but the speed of it. You only have to compare a modern TV drama or sit-com with one from the 70s to see how much faster everything now moves!

I do wonder whether, in a less cognitively demanding environment (say, a traditional Himba village), WMC shows the same early rise and late decline. In an environment where change is uncommon, it is natural for elders to be respected for their accumulated wisdom — experience is all — but perhaps this respect also reflects a constancy in WMC (and thus ‘intelligence’), so that elders are not disadvantaged in the way they may be in our society. Just a thought.

Here’s another thought: it’s always seemed to me (this is not in any way a research-based conclusion!) that musicians and composers, and writers and professors, often age very well. I’ve assumed this was because they are keeping mentally active, and certainly that must be part of it. But perhaps there’s another reason, possibly even a more important reason: these are areas of expertise where the practitioner spends a good deal of time focused on one thing. Rather than allowing their attention to be diffused throughout the environment all the time, they deliberately shut off their awareness of the environment to concentrate on their music, their writing, their art.

Perhaps, indeed, this is the factor that separates the activities that help fight age-related cognitive decline from those that don’t.

I began by saying that humans are the animals that manipulate their cognitive environment. I think this is the key to fighting age-related cognitive decline, or ADHD if it comes to that. We need to be aware how much our brains try to operate in a way that is optimal for our environment — meaning that, by controlling our environment, we can change the way our brain operates.

If you are worried about your ‘scattiness’, or if you want to prevent or fight age-related cognitive decline, I suggest you find an activity that truly absorbs and challenges you, and engage in it regularly.

The increase in WMC in Himba who moved to town also suggests something else. Perhaps the reason that WM training programs have had such little success is because they are ‘programs’. What you do in a specific environment (the bounds of a computer and the program running on it) does not necessarily, or even usually, transfer to the wider environment. We are contextual creatures, used to behaving in different ways with different people and in different places. If we want to improve our WMC, we need to incorporate experiences that challenge and extend it into our daily life.

This, of course, emphasizes my previous advice: find something that absorbs you, something that becomes part of your life, not something you 'do' for an hour some days. Learn to look at the world in a different way, through music or art or another language or a passion (Civil War history; Caribbean stamps; whatever).

You can either let your cognitive environment shape you, or shape your cognitive environment.

Do you agree? What's your cognitive environment, and do you think it has affected your cognitive well-being?

Improving attention through nature

Until recent times, attention has always been quite a mysterious faculty. We’ve never doubted attention mattered, but it’s only in the past few years that we’ve appreciated how absolutely central it is for all aspects of cognition, from perception to memory. The rise in our awareness of its importance has come in the wake of, and in parallel with, our understanding of working memory, for the two work hand-in-hand.

In December 2008, I reported on an intriguing study that demonstrated the value of a walk in the fresh air for a weary brain. The study involved two experiments in which researchers found memory performance and attention spans improved by 20% after people spent an hour interacting with nature. There are two important aspects to this finding: the first is that this effect was achieved by walking in the botanical gardens, but not by walking along main streets; the second — far less predictable, and far more astonishing — was that this benefit was also achieved by looking at photos of nature (versus looking at photos of urban settings).

Now, most of us can appreciate that a walk in a natural setting will clear a foggy brain, and that this is better than walking busy streets — even if we have no clear understanding of why that should be. But the idea that the same benefit can accrue merely from sitting in a room and looking at pictures of natural settings seems bizarre. Why on earth should that help?

Well, there’s a theory. Attention, as we all know, even if we haven’t articulated it, has two components (three if you count general arousal). These two components, or aspects, of attention are involuntary or captured attention, and voluntary or directed attention. The first of these is exemplified by the situation when you hear a loud noise, or someone claps you on the shoulder. These are events that grab your attention. The second is the sort you have control over, the attention you focus on your environment, your work, your book. This is the type of attention we need, and find so much more elusive as we get older.

Directed attention has two components to it: the direct control you exert, and the inhibition you apply to distracting events, to block them out. As I’ve said on a number of occasions, it is this ability to block out distraction that is particularly affected by age, and is now thought to be one of the major reasons for age-related cognitive impairment.

Now, this study managed to isolate the particular aspects of attention that benefited from interacting with nature. The participants were tested on three aspects: alerting, orienting, and executive control. Alerting is about being sensitive to incoming stimuli, and was tested by comparing performance on trials in which the participant was warned by a cue that a trial was about to begin, and trials where no warning was given. Alerting, then, is related to arousal — it’s general; it doesn’t specifically help you direct your attention.

Orienting, on the other hand, is selective. To test this, some trials were initiated by a spatial cue directing the participant’s attention to the part of the screen in which the stimulus (an arrow indicating direction) would appear.

Executive control also has something to do with directed attention, but it is about resolving conflict between stimuli. It was tested through trials in which three arrows were displayed, sometimes all pointing in the same direction, other times having the distracter arrows pointing in the opposite direction to the target arrow. So this measures how well you can ignore distraction.
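To make these three measures concrete, here’s a minimal sketch of how attention-network scores of this kind are typically computed from reaction times. The numbers, and the exact cue conditions, are made up rather than taken from the study:

```python
# Hypothetical mean reaction times (ms) in the different trial types.
rt = {
    "no_cue":      610.0,   # no warning that a trial is about to begin
    "center_cue":  580.0,   # warned, but not told where to look
    "spatial_cue": 545.0,   # warned and told where the target will appear
    "congruent":   560.0,   # flanking arrows point the same way as the target
    "incongruent": 655.0,   # flanking arrows point the opposite way
}

# Each score is just a difference between two conditions.
alerting  = rt["no_cue"] - rt["center_cue"]        # benefit of being warned
orienting = rt["center_cue"] - rt["spatial_cue"]   # benefit of knowing where
conflict  = rt["incongruent"] - rt["congruent"]    # cost of ignoring distracters

print(f"alerting: {alerting:.0f} ms, orienting: {orienting:.0f} ms, "
      f"executive/conflict: {conflict:.0f} ms")
```

A larger executive (conflict) score means the distracting arrows cost you more.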

So this is where the findings get particularly interesting: it seems that looking at pictures of nature benefited executive control, but not alerting or orienting.

Why? Well, attention restoration theory posits that a natural environment gives your attentional abilities a chance to rest and restore themselves, because there are few elements that capture your attention and few requirements for directed attention. This is more obvious when you are actually present in these environments; it’s obvious that on a busy city street there will be far more things demanding your attention.

The fact that the same effect is evident even when you’re looking at pictures echoes, perhaps, recent findings that the same parts of the brain are activated when we’re reading about something or watching it or doing it ourselves. It’s another reminder that we live in our brains, not the world. (It does conjure up another intriguing notion: does the extent to which pictures are effective correlate with how imaginative the person is?)

It’s worth noting that mood also improved when the study participants walked in the park rather than along the streets, but this didn’t appear to be a factor in their improved cognitive performance; however, the degree to which they felt mentally refreshed did correlate with their performance. Confirming these results, mood wasn’t affected by viewing pictures of nature, but participants did report that such pictures were significantly more refreshing and enjoyable.

Now, I’ve just reported on a new study that seems to me to bear on this issue. The study compared brain activity when participants looked at images of the beach and the motorway. The researchers chose these contrasting images because they are associated with very similar sounds (the roar of waves is acoustically very similar to the roar of traffic), while varying markedly in the feelings evoked. The beach scenes evoke a feeling of tranquility; the motorway scenes do not.

I should note that the purpose of the researchers was to look at how a feeling (a sense of tranquility) could be evoked by visual and auditory features of the environment. They do not refer to the earlier work that I have been discussing, and the connection I am making between the two is entirely my own speculation.

But it seems to me that the findings of this study do provide some confirmation for the findings of the earlier study, and furthermore suggest that such natural scenes, whether because of the tranquility they evoke or their relatively low attention-demanding nature or some other reason, may improve attention by increasing synchronization between relevant brain regions.

I’d like to see these studies extended to older adults (both of them were small, and both involved young adults), and also to personality variables (do some individuals benefit more from such a strategy than others? Does this reflect particular personality attributes?). I note that another study found reduced connectivity in the default mode network in older adults. The default mode network may be thought of as where your mind goes when it’s not thinking of anything in particular; the medial prefrontal cortex is part of the default mode network, and this is one of the reasons it was a focus of the most recent study.

In other words, perhaps natural scenes refresh the brain by activating the default mode network, in a particularly effective way, allowing your brain to subsequently return to action (“task-positive network”) with renewed vigor (i.e. nicely synchronized brainwaves).

Interestingly, another study has found a genetic component to default-mode connectivity (aberrant DMN connectivity is implicated in a number of disorders). It would be nice to see some research into the effect of natural scenes on attention in people who vary in this attribute.

Meditation is of course another restorative strategy, and I’d also like to see a head-to-head comparison of these two strategies. But in any case, bottom-line, these results do suggest an easy way of restoring fading attention, and because of the specific aspect of attention that is being helped, it suggests that the strategy may be of particular benefit to older adults. I would be interested to hear from any older adults who try it out.

[Note that part of this article first appeared in the December 2008 newsletter]

Benefits from fixed quiet points in the day

On my walk today, I listened to a downloaded interview from the On Being website. The interview was with ‘vocal magician and conductor’ Bobby McFerrin, and something he said early on in the interview really caught my attention.

In response to a question about why he’d once (in his teens) contemplated joining a monastic order, he said that the quiet really appealed to him, and also ‘the discipline of the hours … there’s a rhythm to the day. I liked the fact that you stopped whatever you were doing at a particular time and you reminded yourself, you brought yourself back to your calling’.

Those words resonated with me, and they made me think of the Muslim habit of prayer. Of the idea of having specified times during the day when you stop your ‘ordinary’ life, and touch base, as it were, with something that is central to your being.

I don’t think you need to be a monk or a Muslim to find value in such an activity! Nor does the activity need to be overtly religious.

Because this idea struck another echo in me — some time ago I wrote a brief report on how even a short ‘quiet time’ can help you consolidate your memories. It strikes me that developing the habit of having fixed points in the day when (if at all possible) you engage in some regular activity that helps relax you and center your thoughts, would help maintain your focus during the day, and give you a mental space in which to consolidate any new information that has come your way.

Appropriate activities could include:

  • meditating on your breath;
  • performing a t’ai chi routine;
  • observing nature;
  • listening to certain types of music;
  • singing/chanting some song/verse (e.g., the Psalms; the Iliad; the Tao te Ching)

Regarding the last two suggestions, as I reported in my book on mnemonics, there’s some evidence that reciting the Iliad has physiological effects (synchronizing heartbeat and breath) that are beneficial for both mood and cognitive functioning. It’s speculated that the critical factor might be the hexametric pace (dum-diddy, dum-diddy, dum-diddy, dum-diddy, dum-diddy, dum-dum). Dactylic hexameter, the rhythm of classical epic, has a musical counterpart: 6/8 time.

Similarly, another small study found that singing Ave Maria in Latin, or chanting a yoga mantra, likewise affects brain blood flow, and the crucial factor appeared to be a rhythm that involved breathing at the rate of six breaths a minute.

Something to think about!