
Improving attention

Ways to help you improve your ability to direct and sustain your attention

Improving attention through nature

Attention has long been quite a mysterious faculty. We’ve never doubted that attention matters, but it’s only in the past few years that we’ve appreciated how central it is to every aspect of cognition, from perception to memory. Our growing awareness of its importance has come in the wake of, and in parallel with, our understanding of working memory, for the two work hand in hand.

In December 2008, I reported on an intriguing study that demonstrated the value of a walk in the fresh air for a weary brain. The study involved two experiments in which researchers found memory performance and attention spans improved by 20% after people spent an hour interacting with nature. There are two important aspects to this finding: the first is that the effect was achieved by walking in the botanical gardens, but not by walking along main streets; the second — far less predictable, and far more astonishing — was that the same benefit was also achieved by looking at photos of nature (versus looking at photos of urban settings).

Now, most of us can appreciate that a walk in a natural setting will clear a foggy brain, and that this is better than walking busy streets — even if we have no clear understanding of why that should be. But the idea that the same benefit can accrue merely from sitting in a room and looking at pictures of natural settings seems bizarre. Why on earth should that help?

Well, there’s a theory. Attention, as we all know, even if we haven’t articulated it, has two components (three if you count general arousal). These two components, or aspects, of attention are involuntary or captured attention, and voluntary or directed attention. The first of these is exemplified by the situation when you hear a loud noise, or someone claps you on the shoulder. These are events that grab your attention. The second is the sort you have control over, the attention you focus on your environment, your work, your book. This is the type of attention we need, and find so much more elusive as we get older.

Directed attention has two components to it: the direct control you exert, and the inhibition you apply to distracting events, to block them out. As I’ve said on a number of occasions, it is this ability to block out distraction that is particularly affected by age, and is now thought to be one of the major reasons for age-related cognitive impairment.

Now, this study managed to isolate the particular aspects of attention that benefited from interacting with nature. The participants were tested on three aspects: alerting, orienting, and executive control. Alerting is about being sensitive to incoming stimuli, and was tested by comparing performance on trials in which the participant was warned by a cue that a trial was about to begin, and trials where no warning was given. Alerting, then, is related to arousal — it’s general, and not specifically about directing your attention.

Orienting, on the other hand, is selective. To test this, some trials were initiated by a spatial cue directing the participant’s attention to the part of the screen in which the stimulus (an arrow indicating direction) would appear.

Executive control also has something to do with directed attention, but it is about resolving conflict between stimuli. It was tested through trials in which three arrows were displayed, sometimes all pointing in the same direction, other times having the distracter arrows pointing in the opposite direction to the target arrow. So this measures how well you can ignore distraction.
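To make the logic of those three measures concrete, here’s a minimal sketch in Python of how the three effects are typically derived from reaction times in this sort of task. The numbers are invented purely for illustration; they are not from the study.

```python
# Toy illustration of how the three attention-network effects are derived.
# Mean reaction times (ms) per condition are invented for illustration.

mean_rt = {
    "no_cue": 560,          # no warning that a trial is about to begin
    "alerting_cue": 520,    # warned, but not told where the arrow will appear
    "spatial_cue": 490,     # told where on the screen the arrow will appear
    "congruent": 500,       # flanking arrows point the same way as the target
    "incongruent": 610,     # flanking arrows point the opposite way
}

# Alerting: benefit of knowing *when* a trial will start.
alerting_effect = mean_rt["no_cue"] - mean_rt["alerting_cue"]

# Orienting: benefit of knowing *where* the stimulus will appear.
orienting_effect = mean_rt["alerting_cue"] - mean_rt["spatial_cue"]

# Executive control: cost of resolving conflict from distracter arrows.
conflict_effect = mean_rt["incongruent"] - mean_rt["congruent"]

print(f"alerting: {alerting_effect} ms, orienting: {orienting_effect} ms, "
      f"conflict: {conflict_effect} ms")
```

The key point for what follows is the last subtraction: it is that conflict (executive control) effect, not the alerting or orienting effects, that the nature images improved.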

So this is where the findings get particularly interesting: it seems that looking at pictures of nature benefited executive control, but not alerting or orienting.

Why? Well, attention restoration theory posits that a natural environment gives your attentional abilities a chance to rest and restore themselves, because there are few elements that capture your attention and few requirements for directed attention. This is more obvious when you are actually present in these environments; it’s obvious that on a busy city street there will be far more things demanding your attention.

The fact that the same effect is evident even when you’re looking at pictures echoes, perhaps, recent findings that the same parts of the brain are activated when we’re reading about something or watching it or doing it ourselves. It’s another reminder that we live in our brains, not the world. (It does conjure up another intriguing notion: does the extent to which pictures are effective correlate with how imaginative the person is?)

It’s worth noting that mood also improved when the study participants walked in the park rather than along the streets, but this didn’t appear to be a factor in their improved cognitive performance; however, the degree to which they felt mentally refreshed did correlate with their performance. Consistent with this, mood wasn’t affected by viewing pictures of nature, but participants did report that such pictures were significantly more refreshing and enjoyable.

Now, I’ve just reported on a new study that seems to me to bear on this issue. The study compared brain activity when participants looked at images of the beach and the motorway. The researchers chose these contrasting images because they are associated with very similar sounds (the roar of waves is acoustically very similar to the roar of traffic), while varying markedly in the feelings evoked. The beach scenes evoke a feeling of tranquility; the motorway scenes do not.

I should note that the purpose of the researchers was to look at how a feeling (a sense of tranquility) could be evoked by visual and auditory features of the environment. They do not refer to the earlier work that I have been discussing, and the connection I am making between the two is entirely my own speculation.

But it seems to me that the findings of this study do provide some confirmation for the findings of the earlier study, and furthermore suggest that such natural scenes, whether because of the tranquility they evoke or their relatively low attention-demanding nature or some other reason, may improve attention by increasing synchronization between relevant brain regions.

I’d like to see these studies extended to older adults (both studies were small, and both involved young adults), and also to personality variables (do some individuals benefit more from such a strategy than others? Does it reflect particular personality attributes?). I note that another study found reduced connectivity in the default mode network in older adults. The default mode network may be thought of as where your mind goes when it’s not thinking of anything in particular; the medial prefrontal cortex is part of the default mode network, and this is one of the reasons it was a focus of the most recent study.

In other words, perhaps natural scenes refresh the brain by activating the default mode network in a particularly effective way, allowing your brain to subsequently return to action (the “task-positive network”) with renewed vigor (i.e. nicely synchronized brainwaves).

Interestingly, another study has found a genetic component to default-mode connectivity (aberrant DMN connectivity is implicated in a number of disorders). It would be nice to see some research into the effect of natural scenes on attention in people who vary in this attribute.

Meditation is of course another restorative strategy, and I’d also like to see a head-to-head comparison of these two strategies. In any case, the bottom line is that these results suggest an easy way of restoring fading attention, and because of the specific aspect of attention that is helped, the strategy may be of particular benefit to older adults. I would be interested to hear from any older adults who try it out.

[Note that part of this article first appeared in the December 2008 newsletter]

Benefits from fixed quiet points in the day

On my walk today, I listened to a downloaded interview from the On Being website. The interview was with ‘vocal magician and conductor’ Bobby McFerrin, and something he said early on in the interview really caught my attention.

In response to a question about why he’d once (in his teens) contemplated joining a monastic order, he said that the quiet really appealed to him, and also ‘the discipline of the hours … there’s a rhythm to the day. I liked the fact that you stopped whatever you were doing at a particular time and you reminded yourself, you brought yourself back to your calling’.

Those words resonated with me, and they made me think of the Muslim practice of prayer: the idea of having specified times during the day when you stop your ‘ordinary’ life, and touch base, as it were, with something that is central to your being.

I don’t think you need to be a monk or a Muslim to find value in such an activity! Nor does the activity need to be overtly religious.

Because this idea struck another echo in me — some time ago I wrote a brief report on how even a short ‘quiet time’ can help you consolidate your memories. It strikes me that developing the habit of having fixed points in the day when (if at all possible) you engage in some regular activity that helps relax you and center your thoughts, would help maintain your focus during the day, and give you a mental space in which to consolidate any new information that has come your way.

Appropriate activities could include:

  • meditating on your breath;
  • performing a t’ai chi routine;
  • observing nature;
  • listening to certain types of music;
  • singing/chanting some song/verse (e.g., the Psalms; the Iliad; the Tao Te Ching).

Regarding the last two suggestions, as I reported in my book on mnemonics, there’s some evidence that reciting the Iliad has physiological effects, synchronizing heartbeat and breath in a way that is beneficial for both mood and cognitive functioning. It’s speculated that the critical factor might be the hexametric pace (dum-diddy, dum-diddy, dum-diddy, dum-diddy, dum-diddy, dum-dum). Dactylic hexameter, the rhythm of classical epic, has a musical counterpart: 6/8 time.

Similarly, another small study found that singing Ave Maria in Latin, or chanting a yoga mantra, likewise affects brain blood flow, and the crucial factor appeared to be a rhythm that involved breathing at the rate of six breaths a minute.
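If you want to try that pace yourself, the arithmetic is simple: 60 seconds divided by 6 breaths gives a 10-second cycle, roughly 5 seconds in and 5 seconds out. Here’s a throwaway pacing timer (my own illustration; it has nothing to do with the studies themselves):

```python
import time

BREATHS_PER_MINUTE = 6                 # the rate reported in the chanting study
cycle = 60 / BREATHS_PER_MINUTE        # a 10-second breath cycle
half = cycle / 2                       # ~5 s in, ~5 s out

for breath in range(1, BREATHS_PER_MINUTE + 1):
    print(f"breath {breath}: inhale...")
    time.sleep(half)
    print(f"breath {breath}: exhale...")
    time.sleep(half)
```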

Something to think about!

How working memory works: What you need to know

A New Yorker cartoon has a man telling his glum wife, “Of course I care about how you imagined I thought you perceived I wanted you to feel.” There are a number of reasons you might find that funny, but the point here is that it is very difficult to follow all the layers. This is a sentence in which mental attributions are made to the 6th level, and this is just about impossible for us to follow without writing it down and/or breaking it down into chunks.

According to one study, while we can comfortably follow a long sequence of events (A causes B, which leads to C, thus producing D, and so on), we can only comfortably follow four levels of intentionality (A believes that B thinks C wants D). At the 5th level (A wants B to believe that C thinks that D wants E), error rates rose sharply to nearly 60% (compared to 5-10% for all levels below that).
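To make the contrast concrete, here’s a small sketch (my own toy representation, in Python) of the two structures: a causal chain can be processed one link at a time, whereas a nested attribution has to be held as a whole.

```python
# A causal chain can be processed one link at a time: once "A causes B" is
# understood, A can be dropped and only the current link needs to be in mind.
chain = ["A causes B", "B leads to C", "C produces D", "D results in E"]

# Nested attributions can't be partitioned like that: every actor has to be
# held in mind at once to grasp the structure.
four_deep = ("A believes", ("B thinks", ("C wants", "D")))
five_deep = ("A wants", ("B to believe", ("C thinks", ("D wants", "E"))))

def actors(node):
    """Count how many actors must be held simultaneously in a nested attribution."""
    if isinstance(node, tuple):
        head, rest = node
        return 1 + actors(rest)
    return 1   # the innermost actor

print(actors(four_deep), actors(five_deep))   # 4 vs 5: the second exceeds ~4 WM slots
```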

Why do we have so much trouble following these nested events, as opposed to a causal chain?

Let’s talk about working memory.

Working memory (WM) has evolved over the years from a straightforward “short-term memory store” to the core of human thought. It’s become the answer to almost everything, invoked for everything related to reasoning, decision-making, and planning. And of course, it’s the first and last port of call for all things memory — to get stored in long-term memory an item first has to pass through WM, where it’s encoded; when we retrieve an item from memory, it again passes through WM, where the code is unpacked.

So, whether or not the idea of working memory has been over-worked, there is no doubt at all that it is utterly crucial for cognition.

Working memory has also been equated with attentional control, and working memory and attention are often used almost interchangeably. And working memory capacity (WMC) varies among individuals. Those with a higher WMC have an obvious advantage in reasoning, comprehension, remembering. No surprise then that WMC correlates highly with fluid intelligence.

So let’s talk about working memory capacity.

The idea that working memory can hold 7 (+/-2) items has passed into popular culture (the “magic number 7”). More recent research, however, has circled around the number 4 (+/-1). Not only that, but a number of studies suggest that in fact the true number of items we can attend to is only one. What’s the answer? (And where does it leave our high- and low-capacity individuals? There’s not a lot of room to vary there.)

Well, in one sense, 7 is still fine — that’s the practical sense. Seven items (5-9) is about what you can hold if you can rehearse them. So those who are better able to rehearse and chunk will have a higher working memory capacity (WMC). That will be affected by processing speed, among other factors.

But there is a very large body of evidence now pointing to working memory holding only four items, and a number of studies indicating that most likely we can only pay attention to one of these items at a time. So you can envision this either as a focus of attention, which can only hold one item, and a slightly larger “outer store” or area of “direct access” which can hold another three, or as a mental space holding four items of which only one can be the focus at any one time.

A further tier, which may be part of working memory or part of long-term memory, probably holds a number of items “passively”. That is, these are items you’ve put on the back burner; you don’t need them right at the moment, but you don’t want them to go too far either. (See my recent news item for more on all this.)

At present, we don’t have any idea how many items can be in this slightly higher state of activation. However, the “magic number 7” suggests that you can circulate 3 (+/-1) items from the backburner into your mental space. In this regard, it’s interesting to note that, in the case of verbal material, the amount you can hold in working memory with rehearsal has been found to equate more accurately to about two seconds’ worth of speech than to seven items. That is, you can remember as much as you can verbalize in about 2 seconds (so, yes, fast speakers have a distinct advantage over slower ones). You can see why processing speed affects WMC.
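As a back-of-the-envelope illustration of how the 2-second rule and speaking rate interact (the figures are my own, purely illustrative):

```python
# Back-of-the-envelope: how many items fit in a ~2-second rehearsal window?
REHEARSAL_WINDOW = 2.0          # seconds of speech you can keep cycling

def verbal_span(items_per_second):
    """Rough estimate of verbal span from articulation rate."""
    return items_per_second * REHEARSAL_WINDOW

print(verbal_span(3.5))   # a fast speaker: ~7 items
print(verbal_span(2.5))   # a slower speaker: ~5 items
```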

Whether you think of WM as a focus of one and an outer store of 3, or as a direct access area with 4 boxes and a spotlight shining on one, it’s a mental space or blackboard where you can do your working out. Thinking of it this way makes it easier to conceptualize and talk about, but these items are probably not going into a special area as such. The thought now is that these items stay in long-term memory (in their relevant areas of association cortex), but they are (a) highly activated, and (b) connected to the boxes in the direct access area (which is possibly in the medial temporal lobe). This connection is vitally important, as we shall see.
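If you like to think in code, here’s a toy sketch of my own (not a model from the literature) of that picture: items stay put in long-term memory, “being in working memory” just means being bound to one of a handful of boxes, and shifting the spotlight from one box to another carries a cost.

```python
# Toy sketch: working memory as bindings to a few "boxes", with one in focus.
FOCUS_SWITCH_COST_MS = 240      # roughly the switch cost reported in one study

class WorkingMemory:
    def __init__(self, slots=4):
        self.slots = [None] * slots     # direct-access area: bindings to LTM items
        self.focus = 0                  # index of the slot currently in the spotlight

    def bind(self, slot, ltm_item):
        """Bind an (already stored) long-term memory item to a slot."""
        self.slots[slot] = ltm_item

    def attend(self, slot):
        """Shift the spotlight; returns the time cost incurred (ms)."""
        cost = 0 if slot == self.focus else FOCUS_SWITCH_COST_MS
        self.focus = slot
        return cost

wm = WorkingMemory()
wm.bind(0, "red cow")
wm.bind(1, "blue ribbon")
print(wm.attend(0))   # already in focus: 0 ms
print(wm.attend(1))   # switching focus: 240 ms
```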

Now four may not seem like much, but WM is not quite as limited as it seems, because we have different systems for verbal (includes numerical) and visuospatial information. Moreover, we can probably distinguish between the items and the processing of them, which equates to a distinction between declarative and procedural memory. So that gives us three working memory areas: verbal declarative; visuospatial declarative; procedural.

Now all of this may seem more than you needed to know, but breaking down the working memory system helps us discover two things of practical interest: first, which parts of the system make a task more difficult; second, where individual differences come from, and whether they lie in aspects that are trainable.

For example, this picture of a mental space with a focus of one and a maximum of three eager-beavers waiting their turn, points to an important aspect of the working memory system: switching the focus. Experiments reveal that there is a large focus-switching cost, incurred whenever you have to switch the item in the spotlight. And the extent of this cost has been surprising — around 240ms in one study, which is about six times the length of time it takes to scan an item in a traditional memory-search paradigm.

But focus-switch costs aren’t a constant. They vary considerably depending on the difficulty of the task, and they also tend to increase with each item in the direct-access area. Indeed, just having one item in the space outside the focus causes a significant loss of efficiency in processing the focused item.
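To put those numbers in perspective: if the switch cost is around 240 ms and that is about six times the scan time, then the scan time is roughly 40 ms. A little toy arithmetic (my figures, for illustration only) shows how quickly switching comes to dominate:

```python
# Rough cost of working through several items when the focus has to keep moving.
SCAN_MS = 40            # ballpark time to scan one item in a memory-search task
SWITCH_MS = 240         # approximate focus-switch cost reported in one study

def time_to_process(n_items, switches):
    """Total time (ms) to scan n items, incurring a switch cost for each switch."""
    return n_items * SCAN_MS + switches * SWITCH_MS

print(time_to_process(4, 0))   # 160 ms: four items, spotlight never moves
print(time_to_process(4, 3))   # 880 ms: four items, refocusing before each new one
```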

This may reflect increased difficulty in discriminating one highly activated item from other highly activated items. This brings us to competition, which, in its related aspects of interference and inhibition, is a factor probably more crucial to WMC than whether you have 3 or 4 or 5 boxes in your direct access area.

But before we discuss that, we need to look at another important aspect of working memory: updating. Updating is closely related to focus-switching, and it’s easy to get confused between them. But it’s been said that working memory updating (WMU) is the only executive function that correlates with fluid intelligence, and updating deficits have been suggested as the reason for poor comprehension (also correlated with low-WMC). So it’s worth spending a little time on.

To get the distinction clear in your mind, imagine the four boxes and the spotlight shining on one. Any time you shift the spotlight, you incur a focus-switching cost. If you don’t have to switch focus, if you simply need to update the contents of the box you’re already focusing on, then there will be an update cost, but no focus-switching cost.

Updating involves three components: retrieval; transformation; substitution. Retrieval simply involves retrieving the contents from the box. Substitution involves replacing the contents with something different. Transformation involves an operation on the contents of the box to get a new value (eg, when you have to add a certain number to an earlier number).
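Here’s one way to picture a single updating step on a numerical task of the kind described above (a sketch of my own, purely illustrative):

```python
# Toy decomposition of one working-memory updating step on a numerical task
# (e.g. "add 3 to the number currently in this box").

boxes = [7, 2, 9]        # three items held in the direct-access area

def update(boxes, slot, operation=None, new_value=None):
    old = boxes[slot]                        # retrieval: read the box in focus
    if operation is not None:
        result = operation(old)              # transformation: operate on the content
    else:
        result = new_value                   # pure substitution: just replace it
    boxes[slot] = result                     # substitution: write the new content back
    return result

update(boxes, 0, operation=lambda x: x + 3)  # transformation + substitution -> 10
update(boxes, 1, new_value=5)                # substitution only
print(boxes)                                 # [10, 5, 9]
```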

Clearly the difficulty in updating working memory will depend on which of these components is involved. So which of these processes is most important?

In terms of performance, the most important component is transformation. While all three components contribute to the accuracy of updating, retrieval apparently doesn’t contribute to speed of updating. For both accuracy and speed, substitution is less important than transformation.

This makes complete sense: obviously having to perform an operation on the content is going to be more difficult and time-consuming than simply replacing it. But it does help us see that the most important factor in determining the difficulty of an updating task will be the complexity of the transformation.

The finding that retrieval doesn’t affect speed of updating sounds odd, until you realize the nature of the task used to measure these components. The number of items was held constant (always three), and the focus switched from one box to another on every occasion, so focus-switching costs were constant too. What the finding says is that once you’ve shifted your focus, retrieval takes no time at all — the spotlight is shining and there the answer is. In other words, there really is no distinction between the box and its contents when the spotlight is on it — you don’t need to open the box.

However, retrieval does affect accuracy, and this implies that something is degrading or interfering in some way with the contents of the boxes. Which takes us back to the problems of competition / interference.

But before we get to that, let’s look at this issue of individual differences, because like WMC, working memory updating correlates with fluid intelligence. Is this just a reflection of WMC?

Differences in transformation accuracy correlated significantly with WMC, as did differences in retrieval accuracy. Substitution accuracy didn’t vary enough to have measurable differences. Neither transformation nor substitution speed differences correlated with WMC. This implies that the reason why people with high WMC also do better at WMU tasks is because of the transformation and retrieval components.

So what about the factors that aren’t correlated with WMC? The variance in transformation speed is argued to primarily reflect general processing speed. But what’s going on in substitution that isn’t going on when WMC is measured?

Substitution involves two processes: removing the old contents of the box, and adding new content. In terms of the model we’ve been using, we can think of unbinding the old contents from the box, and binding new contents to it (remember that the item in the box is still in its usual place in the association cortex; it’s “in” working memory by virtue of the temporary link connecting it to the box). Or we can think of it as deleting and encoding.

Consistent with substitution not correlating with WMC, there is some evidence that high- and low-WMC individuals are equally good at encoding. Where high- and low-WMC individuals differ is in their ability to prevent irrelevant information being encoded with the item. Which brings me to my definition of intelligence (from 30 years ago — these ideas hadn’t even been invented yet. So I came at it from quite a different angle): the ability to (quickly) select what’s important.

So why do low-WMC people tend to be poorer at leaving out irrelevant information?

Well, that’s the $64,000 question, but related to that it’s been suggested that those with low working memory capacity are less able to resist capture by distracting stimuli than those with high WMC. A new study, however, provides evidence that low- and high-WMC individuals are equally easily captured by distracters. What distinguishes the two groups is the ability to disengage. High-capacity people are faster in putting aside irrelevant stimuli. They’re faster at deleting. And this, it seems, is unrelated to WMC.

This is supported by another recent finding, that when interrupted, older adults find it difficult to disengage their brain from the new task and restore the original task.

So what’s the problem with deleting / removing / putting aside items in focus? This is about inhibition, which takes us once again to competition / interference.

Now interference occurs at many different levels: during encoding, retrieval, and storage; with items, with tasks, with responses. Competition is ubiquitous in our brain.

In the case of substitution during working memory updating, it’s been argued that the contents of the box are not simply removed and replaced, but instead gradually over-written by the new contents. This fits in with a view of items as assemblies of lower-level “feature units”. Clearly, items may share some of these units with other items (reflected in their similarity), and clearly the more they compete for these units, the greater the interference between the items.

You can see why it’s better to keep your codes (items) “lean and mean”, free of any irrelevant information.
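One crude way to picture this is to treat each item as a set of feature units, and the potential for interference between two items as the overlap between their sets: the more irrelevant features you let into a code, the more overlap (and competition) you invite. A toy illustration, with made-up features:

```python
# Toy picture of items as bundles of feature units; overlap ~ potential interference.
lean_item = {"cow", "red"}
bloated_item = {"cow", "red", "ribbon", "blue", "name:Isabel"}
other_item = {"cow", "brown", "ribbon", "bell"}

def overlap(a, b):
    """Proportion of shared feature units (Jaccard similarity)."""
    return len(a & b) / len(a | b)

print(overlap(lean_item, other_item))     # 0.2   -- lean codes compete less
print(overlap(bloated_item, other_item))  # ~0.29 -- extra features invite more competition
```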

Indeed, some theorists completely discard the idea of number of items as a measure of WMC, and talk instead in terms of “noise”, with processing capacity being limited by such factors as item complexity and similarity. While there seems little justification for discarding our “4+/-1”, which is much more easily quantified, this idea does help us get to grips with the concept of an “item”.

What is an item? Is it “red”? “red cow”? “red cow with blue ribbons round her neck”? “red cow with blue ribbons and the name Isabel painted on her side”? You see the problem.

An item is a fuzzy concept. We can’t say, “it’s a collection of 6 feature units” (or 4 or 14 or 42). So we have to go with a less defined description: it’s something so tightly bound that it is treated as a single unit.

Which means it’s not solely about the item. It’s also about you, and what you know, and how well you know it, and what you’re interested in.

To return to our cases of difficulty in disengaging, perhaps the problem lies in the codes being formed. If your codes aren’t tightly bound, then they’re going to start to degrade, losing some of their information, losing some of their distinctiveness. This is going to make them harder to re-instate, and it’s going to make them less distinguishable from other items.

Why should this affect disengagement?

Remember what I said about substitution being a gradual process of over-writing? What happens when your previous focus and new focus have become muddled?

This also takes us to the idea of “binding strength” — how well you can maintain the bindings between the contents and their boxes, and how well you can minimize the interference between them (which relates to how well the items are bound together). Maybe the problem with both disengagement and reinstatement has to do with poorly bound items. Indeed, it’s been suggested that the main limiting factor on WMC is in fact binding strength.

Moreover, if people vary in their ability to craft good codes, if people vary in their ability to discard the irrelevant and select the pertinent, to bind the various features together, then the “size” (the information content) of an item will vary too. And maybe this is what is behind the variation in “4 +/-1”, and experiments which suggest that sometimes the focus can be increased to 2 items. Maybe some people can hold more information in working memory because they get more information into their items.

So where does this leave us?

Let’s go back to our New Yorker cartoon. The difference between a chain of events and the nested attributions is that chaining doesn’t need to be arranged in your mental space because you don’t need to keep all the predecessors in mind to understand it. On the other hand, the nested attributions can’t be understood separately or even in partitioned groups — they must all be arranged in a mental space so we can see the structure.

We can see now that “A believes that B thinks C wants D” is easy to understand because we have four boxes in which to put these items and arrange them. But our longer nesting, “A wants B to believe that C thinks that D wants E”, is difficult because it contains one more item than we have boxes. No surprise there was a dramatic drop-off in understanding.

So given that you have to fill your mental space, what is it that makes some tasks more difficult than others?

  • The complexity and similarity of the items (making it harder to select the relevant information and bind it all together).
  • The complexity of the operations you need to perform on each item (the longer the processing, the more tweaking you have to do to your item, and the more time and opportunity for interference to degrade the signal).
  • Changing the focus (remember our high focus-switching costs).

But in our 5th level nested statement, the error rate was 60%, not 100%, meaning a number of people managed to grasp it. So what’s their secret? What is it that makes some people better than others at these tasks?

They could have 5 boxes (making them high-WMC). They could have sufficient processing speed and binding strength to unitize two items into one chunk. Or they could have the strategic knowledge to enable them to use the other WM system (transforming verbal data into visuospatial). All these are possible answers.


This has been a very long post, but I hope some of you have struggled through it. Working memory is the heart of intelligence, the essence of attention, and the doorway to memory. It is utterly critical, and cognitive science is still trying to come to grips with it. But we’ve come a very long way, and I think we now have sufficient theoretical understanding to develop a model that’s useful for anyone wanting to understand how we think and remember, and how they can improve their skills.

There is, of course, far more that could be said about working memory (I’ve glossed over any number of points in an effort to say something useful in less than 50,000 words!), and I’m planning to write a short book on working memory, its place in so many educational and day-to-day tasks, and what we can do to improve our skills. But I hope some of you have found this enlightening.

References

Clapp, W. C., Rubens, M. T., Sabharwal, J., & Gazzaley, A. (2011). Deficit in switching between functional brain networks underlies the impact of multitasking on working memory in older adults. Proceedings of the National Academy of Sciences. doi:10.1073/pnas.1015297108

Ecker, U. K. H., Lewandowsky, S., Oberauer, K., & Chee, A. E. H. (2010). The components of working memory updating: An experimental decomposition and individual differences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 36(1), 170-189. doi:10.1037/a0017891

Fukuda, K., & Vogel, E. K. (2011). Individual differences in recovery time from attentional capture. Psychological Science, 22(3), 361-368. doi:10.1177/0956797611398493

Jonides, J., Lewis, R. L., Nee, D. E., Lustig, C. A., Berman, M. G., & Moore, K. S. (2008). The mind and brain of short-term memory. Annual Review of Psychology, 59, 193-224. doi:10.1146/annurev.psych.59.103006.093615

Kinderman, P., Dunbar, R. I. M., & Bentall, R. P. (1998). Theory-of-mind deficits and causal attributions. British Journal of Psychology, 89, 191-204.

Lange, E. B., & Verhaeghen, P. (in press). No age differences in complex memory search: Older adults search as efficiently as younger adults. Psychology and Aging.

Oberauer, K., Süß, H.-M., Schulze, R., Wilhelm, O., & Wittmann, W. (2000). Working memory capacity — facets of a cognitive ability construct. Personality and Individual Differences, 29(6), 1017-1045. doi:10.1016/S0191-8869(99)00251-2

Oberauer, K. (2005). Control of the Contents of Working Memory--A Comparison of Two Paradigms and Two Age Groups. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31(4), 714-728. doi:10.1037/0278-7393.31.4.714

Oberauer, K. (2006). Is the focus of attention in working memory expanded through practice? Journal of Experimental Psychology: Learning, Memory, and Cognition, 32(2), 197-214. doi:10.1037/0278-7393.32.2.197

Oberauer, Klaus. (2009). Design for a Working Memory. Psychology of Learning and Motivation, 51, 45-100.

Verhaeghen, P., Cerella, J., & Basak, C. (2004). A working memory workout: How to expand the focus of serial attention from one to four items in 10 hours or less. Journal of Experimental Psychology: Learning, Memory, and Cognition, 30(6), 1322-1337.

Achieving flow

I’ve recently had a couple of thoughts about flow — that mental state when you lose all sense of time and whatever you’re doing (work, sport, art, whatever) seems to flow with almost magical ease. I’ve mentioned flow a couple of times more or less in passing, but today I want to have a deeper look, because learning (and perhaps especially that rewiring I was talking about in my last post) is most easily achieved when we are in "flow" (also known as being ‘in the zone’).

Let’s start with some background.

Mihaly Csikszentmihalyi, the man who identified and named this mental state, described nine components:

  1. The skills you need to perform the task must match the challenges of the task, AND the task must exceed a certain level of difficulty (above everyday level).
  2. Your concentration is such that your behavior becomes automatic and you have little conscious awareness of your self, only of what you’re doing.
  3. You have a very clear sense of your goals.
  4. The task provides unambiguous and immediate feedback concerning your progress toward those goals.
  5. Your focus is entirely on the task and you are completely unaware of any distracting events.
  6. You feel in control, but paradoxically, if you try to consciously hold onto that control, you’ll lose that sense of flow. In other words, you only feel in control as long as you don’t think about it.
  7. You lose all sense of self and become one with the task.
  8. You lose all sense of time.
  9. You experience what Csikszentmihalyi called the ‘autotelic experience’ (from Greek auto (self) and telos (goal)), which is inherently rewarding, providing the motivation to re-experience it.

Clearly many of these components are closely related. More usefully, we can distinguish between elements of the experience, and preconditions for the experience.

The key elements of the experience are your total absorption in the task (which leads to you losing all awareness of self, of time, and any distractions in the environment), and your enjoyment of it.

The key preconditions are:

  • the match between skills and task
  • the amount of challenge in the task
  • the clear and proximal nature of your goals (that is, at least some need to be achievable in that session)
  • the presence of useful feedback.

Additionally, later research suggests:

  • the task needs to be high in autonomy and meaningfulness.

Brain studies have found that this mental state is characterized by less activity in the prefrontal cortex (which provides top-down control — including that evidenced by that critical inner voice), and a small increase in alpha brainwaves (correlated with slower breathing and a lower pulse rate). This inevitably raises the question of whether meditation training can help you more readily achieve flow. Supporting this, a neurofeedback study improved performance in novice marksmen, who learned to shoot expertly in less than half the time after they had been trained to produce alpha waves. There are also indications that some forms of mild electrical stimulation to the brain (tDCS) can induce a flow state.

Some people may be more prone to falling into a flow state than others. Csikszentmihalyi referred to an ‘autotelic personality’, and suggested that such people have high levels of curiosity, persistence, and interest in performing activities for their own sake rather than to achieve some external goal. Readers of my books may be reminded of cognitive styles — those who are intrinsically rather than extrinsically motivated are usually more successful in study.

Recent research has supported the idea of the autotelic personality, and roots it particularly in the achievement motive. Those who have a strong need for achievement, and a self-determined approach, are more likely to experience flow. Such people also have a strong internal locus of control — that is, they believe that achievement rests in their own hands, in their own work and effort. I have, of course, spoken before of the importance of this factor.

There is some indication that autotelic students push themselves harder. A study of Japanese students found that autotelic students tended to put themselves in situations where the perceived challenges were higher than their perceived skills, while the reverse was true for other students.

Interestingly, a 1994 study found that college students perceived work where skills exceeded challenges to be more enjoyable than flow activities where skills matched challenges — which suggests, perhaps, that we are all inclined to underestimate our own skills, and do better when pushed a little.

In regard to occupation, research suggests that five job characteristics are positively related to flow at work. These characteristics (which come from the Job Characteristics Model) are:

  • Skill variety

  • Task identity (the extent to which you complete a whole and identifiable piece of work)

  • Task significance

  • Autonomy

  • Feedback

These clearly echo the flow components.

All of this suggests that to consistently achieve a flow state, you need the right activities and the right attitude.

So, that’s the background. Now for my new thoughts. It occurred to me that flow might have something to do with working memory. I’ve suggested before that it might be a matter of getting the processing speed just right. My new thought extends this idea.

Remember that working memory is extremely limited, and that it seems to reflect a three-tiered system, whereby you have one item in your immediate focus, with perhaps three more items hovering very closely within an inner store, able to very quickly move into immediate focus, and a further three or so items in the ‘backburner’ — and all these items have to keep moving around and around these tiers if you want to keep them all ‘alive’. Because they can’t stay very long at all in this system without being refreshed (through the focus).

Beyond this system is the huge database of your long-term memory, and that’s where all these items come from. Thus, whenever you’re working on something, you’re effectively circulating items through this whole four-tier system: long-term memory to focus to inner store to backburner and then returning to LTM or to focus. And returning to LTM is the default — if it’s to return to focus, it has to happen within a very brief period of time.

And so here’s my thesis (I don’t know if it’s original; I just had the idea this morning): flow is our mental experience of a prolonged period of balancing this circulation perfectly. Items belonging to one cohesive structure are flowing through the system at the right speed and in the right order, with no need to stop and search, and no room for any items that aren’t part of this cohesive structure (i.e., there are no slots free in which to experience any emotions or distracting thoughts).

What this requires is for the necessary information to all be sufficiently strongly connected, so that activation/retrieval occurs without delay. And what that requires is for the foundations to be laid. That is, you need to have the required action sequences or information clusters well-learned.

Here we have a mechanism for talent — initial interest and some skill produce a sense of flow; this motivating state is pursued by the individual by persevering at the same activity/subject; if they are not pushed too hard (which will not elicit flow), or held back (ditto), they will once again achieve the desired state, increasing the motivation to pursue this course. And so on.

All of which raises the question: are autotelic personalities born or made? The development of people who find it easier to achieve flow may well have more to do with their good luck in childhood (experiencing the right support) than with their genetic makeup.

Is flow worth pursuing? Flow helps us persist at a task, because it is an intrinsically rewarding mental state. Achieving flow, then, is likely to result in greater improvement if only because we are likely to spend more time on the activity. The interesting question is whether it also, in and of itself, means we gain more from the time we spend. At the moment, we can only speculate.

But research into the value of mental stimulation in slowing cognitive decline in older people indicates that engagement, and its correlate enjoyment, are important if benefits are to accrue. I think the experience of flow is not only intrinsically rewarding, but also intrinsically beneficial in achieving the sort of physical brain changes we need to fight age-related cognitive decline.

So I’ll leave you with the findings from a recent study of flow in older adults, that has some helpful advice for anyone wanting to achieve flow, as well as demonstrating that you're never too old to achieve this state (even if it does seem harder to achieve as you age, because of the growing difficulty in inhibiting distraction).

The study, involving 197 seniors aged 60-94, found that those with higher fluid cognitive abilities (processing speed, working memory, visual spatial processing, divergent thinking, inductive reasoning, and everyday problem-solving) experienced higher levels of flow in cognitive activities, while those with lower fluid abilities experienced lower levels of flow in such activities. The pattern reversed for non-cognitive activities: there, those with lower fluid abilities experienced higher levels of flow, and those with higher fluid abilities lower levels.

High cognitive demand activities included: working, art and music, taking classes and teaching, reading, puzzles and games, searching for information. Low cognitive demand activities included: social events, exercise, TV, cooking, going on vacation. Note that the frequency of these activities did not differ between those of higher fluid ability and those of lower.

These findings reinforce the importance of matching skills and activities in order to achieve flow, and also remind us that flow can be achieved in any activity.

Taking things too seriously

I was listening to a podcast the other day. Two psychologists (Andrew Wilson and Sabrina Galonka) were being interviewed about embodied cognition, a topic I find particularly interesting. As an example of what they meant by embodied cognition (something rather more specific than the fun and quirky little studies that are so popular nowadays — e.g., people making smaller estimates of quantities when leaning to the left; people being more likely to see gender-neutral faces as female when squeezing a soft ball and as male when squeezing a hard ball; people judging currencies as more valuable, and their opinions and leaders as more important, when holding a heavier clipboard), they mentioned the outfielder problem. Without getting into the details (if you’re interested, the psychologists have written a good article on it on their blog), here’s what I took away from the discussion:

We used to think that, in order to catch a ball, our brain was doing all these complex math- and physics-related calculations — try programming a robot to do this, and you’ll see just how complex the calculations need to be! And of course this is that much more complicated when the ball isn’t aimed at you and is traveling some distance (the outfielder problem).

Now we realize it’s not that complicated — our outfielder is moving, and this is the crucial point. Apparently (according to my understanding), if he moves at the right speed to make his perception of the ball’s speed uniform (the ball decelerates as it goes up and accelerates as it comes down, so the catcher does the inverse: running faster as the ball rises and slower as it falls), then — if he times it just right — the ball will appear to be traveling in a straight line, and the mental calculation of where it will be is simple.

(This, by the way, is what these psychologists regard as ‘true’ embodied cognition — cognition that is the product of a system that includes the body and the environment as well as the brain.)
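For the technically inclined, here’s a toy simulation of that idea, often called optical acceleration cancellation: the fielder never computes the landing point, they simply keep the tangent of the ball’s elevation angle rising at a steady rate. All the numbers are invented, and the code solves directly for the position that satisfies the rule at each moment (a real fielder would get there by continuous adjustment).

```python
# Toy 2D simulation of the running-to-catch heuristic described above: move so
# that the tangent of the ball's elevation angle keeps rising at a constant rate.
g = 9.81
vx, vy = 18.0, 22.0          # ball's horizontal / vertical launch speed (m/s)
fielder_x = 60.0             # fielder's starting distance from the batter (m)

dt = 0.01
t = dt
ball_x, ball_y = vx * t, vy * t - 0.5 * g * t * t
k = (ball_y / (fielder_x - ball_x)) / t    # rate of optical rise, fixed from the first glimpse

while True:
    t += dt
    ball_x = vx * t
    ball_y = vy * t - 0.5 * g * t * t
    if ball_y <= 0:                        # ball has come down
        break
    # move to the spot from which tan(elevation) = k * t right now
    fielder_x = ball_x + ball_y / (k * t)

landing_x = vx * (2 * vy / g)
print(f"ball lands at {landing_x:.1f} m, fielder ends up at {fielder_x:.1f} m")
```

Run it and the two numbers come out the same: following the simple optical rule delivers the fielder to the landing spot just as the ball arrives, with no trajectory mathematics anywhere in the loop.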

This idea suggests two important concepts that are relevant to those wishing to improve their memory:

We (like all animals) have been shaped by evolution to follow the doctrine of least effort. Mental processing doesn’t come cheap! If we can offload some of the work to other parts of the system, then it’s sensible to do so.

In other words, there’s no great moral virtue in insisting on doing everything mentally. Back in the day (2,500 odd years ago), it was said that writing things down would cause people to lose their ability to remember (in Plato’s Phaedrus, Socrates has the Egyptian god-pharaoh say to Thoth, the god who invented writing, “this discovery of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves.”)

This idea has lingered. Many people believe that writing reminders to oneself, or using technology to remember for us, ‘rots our brains’ and makes us incapable of remembering for ourselves.

But here’s the thing: the world is full of information. And it is of varying quality and importance. You might feel that someone should be remembering certain information ‘for themselves’, but this is a value judgment, not (as you might believe) a helpful warning that their brain is in danger of atrophying itself into terminal dysfunction. The fact is, we all choose what to remember and what to forget — we just might not have made a deliberate and conscious choice. Improving your memory begins with this: actually thinking about what you want to remember, and practicing the strategies that will help you do just that.

However, there’s an exception to the doctrine of least effort, and it’s evident among all the animals with sufficient cognitive power — fun. All of us who have enough brain power to spare, engage in play. Play, we are told, has a serious purpose. Young animals play to learn about the world and their own capabilities. It’s a form, you might say, of trial-&-error — but a form with enjoyability built into the system. This enjoyability is vital, because it motivates the organism to persist. And persistence is how we discover what works, and how we get the practice to do it well.

What distinguishes a good outfielder from someone who’s never tried to catch a ball before? Practice. To judge the timing, to get the movement just right — movement which will vary with every ball — you need a lot of practice. You can’t just read about what to do. And that’s true of every physical skill. Less obviously, it’s true of cognitive skills also.

It also ties back to what I was saying about trying to achieve flow. If you’re not enjoying what you’re doing, it’s probably either too easy or too hard for you. If it’s too easy, try and introduce some challenge into it. If it’s too hard, break it down into simpler components and practice them until you have achieved a higher level of competence on them.

Enjoyability is vital for learning well. So don’t knock fun. Don’t think play is morally inferior. Instead, try and incorporate a playful element into your work and study (there’s a balance, obviously!). If you have hobbies you enjoy, think about elements you can carry across to other activities (if you don’t have a hobby you enjoy, perhaps you should start by finding one!).

So the message for today is: the holy grail in memory and learning is NOT to remember everything; the superior approach to work / study / life is NOT total mastery and serious dedication. An effective memory is one that remembers what you want/need it to remember. Learning occurs through failure. Enjoyability greases the path to the best learning and the most effective activity.

Let focused fun be your mantra.

Seeing without words

I was listening on my walk today to an interview with Edward Tufte, the celebrated guru of data visualization. He said something I took particular note of, concerning the benefits of concentrating on what you’re seeing, without any other distractions, external or internal. He spoke of his experience of being out walking one day with a friend, in a natural environment, and what it was like to just sit down for some minutes, not talking, in a very quiet place, just looking at the scene. (Ironically, I was also walking in a natural environment, amidst bush, beside a stream - but I was busily occupied listening to this podcast!)

Tufte talked of how we so often let words get between us and what we see. He spoke of a friend who was diagnosed with Alzheimer’s, and how whenever he saw her after that, he couldn’t help but be watchful for symptoms, couldn’t help interpreting everything she said and did through that perspective.

There are two important lessons here. The first is a reminder of how most of us are always rushing to absorb as much information as we can, as quickly as we can. There is, of course, an ocean of information out there in the world, and if we want to ‘keep up’ (a vain hope, I fear!), we do need to optimize our information processing. But we don’t have to do that all the time, and we need to be aware that there are downsides to that attitude.

There is, perhaps, an echo here with Kahneman’s fast and slow thinking, and another with the idea that quiet moments of reflection during the day can bring cognitive benefits.

In similar vein, then, we’d probably all find a surprising amount of benefit from sometimes taking the time to see something familiar as if it was new — to sit and stare at it, free from preconceptions about what it’s supposed to be or supposed to tell us. A difficult task at times, but if you try and empty your mind of words, and just see, you may achieve it.

The second lesson is more specific, and applies to all of us, but perhaps especially to teachers and caregivers. Sometimes you need to be analytical when observing a person, but if you are interacting with someone who has a label (‘learning-disabled’, ‘autistic’, ‘Alzheimer’s’, etc), you will both benefit if you can sometimes see them without thinking of that label. Perhaps, without the preconception of that label, you will see something unexpected.

Improving attention

Forget the persistent myth that everything is remembered; that our brains are video cameras whirring away recording everything, and that such 'hidden' knowledge can be brought to light by a hypnotist or alien artefact. Such things are the stuff of fantasy. Of course, there is a nugget of truth there: we can, and do, remember things we've paid no conscious attention to. Sometimes the right question can elicit memories we didn't know we had, in more detail than we imagined we could have. But for the most part, what's not noticed is not remembered. Attention is crucial to memory.

In particular, attention is crucial to good encoding. That is, the construction of memories that will be easily accessed.

In study, of course, we become especially aware of the connection between attention and memory. That's because learning is all about the deliberate construction of accessible memories.

But attention is something of a bugbear: we all recognize its importance, but improving it is no easy task. Nor does research have as much to offer as it might. There are no quick and easy 'fixes' to failing concentration, to the difficulties of focusing on your work when your mind is full of other things.

Here's the most important thing to know when it comes to understanding attention: Attention and working memory are inextricably entwined. Indeed, it's thought that your working memory capacity reflects the extent to which you can control your attention, particularly in situations where there is competing information or competing demands.

In other words, the undeniable differences between people’s working memory capacity are not so much because people differ in how much information they can keep active, but because they vary in their ability to control attention.

Controlling attention has two main aspects:

  • your ability to focus on one thing
  • your ability to ignore distracting and irrelevant information.

It now seems likely that an erosion in the ability to ignore distraction is the principal reason for the cognitive decline so often seen with age.

Your ability to ignore distraction is also challenged by other circumstances, such as stress and anxiety, sleep deprivation, and busy environments.

Improving your attention, then, is a complex task, one that should be approached from multiple directions: