“Busy Bee” paper published in Experimental Brain Research

We have a paper published this month in the journal Experimental Brain Research, based upon two years of data collected at the Lincoln Summer Scientist event, which looked at how children’s saccadic eye movements are affected by directional socio-biological cues (see previous posts here and here).

We report results from 137 children who performed a pro-saccade task presented as a computer game in which they had to keep their eyes on a cartoon bee that jumped unpredictably from the middle to the left or right of a computer screen. The Eyelink II system was used to examine how quickly and accurately the children followed the bee while pictures of arrows and photos of pointing hands and eyes appeared in the middle of the screen just before the “busy bee” character moved (see YouTube video).

We found that children were distracted by the direction of the pointing pictures, such that their eyes were quicker to move towards the cartoon bee when she jumped in the same direction as the pointing finger, eye or arrow (Congruent) as opposed to the opposite direction (Incongruent). Interestingly, for the youngest group of children (3-5 years) this effect was found most strongly for pointing fingers; only older children showed the effect for eyes and arrows. The paper makes the case that children have to learn to link what they see in the world around them with the direction of interesting information and events. One of the first “cues” to attention that young children learn may be the direction of an adult’s pointing index finger.

Another interesting finding was that when eye gaze cues overlapped with the onset of the peripheral target bee, the youngest group of children made a large proportion of “omission errors”: they missed the target completely and didn’t make a saccade at all. I was involved in the testing myself at Summer Scientist, and I found this feature of young children’s behaviour particularly fascinating. It seems strikingly similar to the stimulus “extinction” and neglect seen in adult stroke patients. Rather than just not moving their eyes, I think 3-4 year olds didn’t “see” the bee under these conditions, and this is something I’d hope to follow up at future Summer Scientist weeks.

The paper is full open access and available here. See here for another Experimental Brain Research study, which I’ve only recently come across, by a group in Oslo showing ERP responses to finger pointing cues in babies.

“Functional Pleomorphism” and the Neuroscience of Rules

Last month we published a paper in Frontiers in Human Neuroscience. This was the latest and last functional magnetic resonance imaging (fMRI) study to come out of a Wellcome Trust funded project run by myself (whilst at the University of Exeter) and Dr Ben Parris (now at Bournemouth), investigating the organisation and function of cognitive control processes in the human frontal cortex.

Humans are perhaps uniquely able to represent information in an abstract way, which allows them to generalise rules and knowledge across different situations and tasks. For example, we might learn to get a reward by pressing a button on the left with our finger when we see a blue shape, and a button on the right when we see a yellow shape. But we can also generalise rules to different situations: instead of pressing a left or right button, we can make an eye movement to the left or right when we see the right colour…or say the word “left” or “right”…or wiggle the toes on our left and right feet!
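To make the idea concrete, here is a minimal sketch (my illustration, not anything from the paper) of what this kind of generalisation looks like computationally: an abstract colour-to-side rule is represented once, and a separate “effector” stage translates the abstract side into any of several concrete actions. All names here are invented for illustration.

```python
# Illustrative sketch: one abstract rule (colour -> side), many concrete outputs.
RULE = {"blue": "left", "yellow": "right"}

# Each effector turns the abstract side into a specific motor act.
EFFECTORS = {
    "button": lambda side: f"press {side} button",
    "eyes":   lambda side: f"saccade {side}",
    "speech": lambda side: f"say '{side}'",
    "toes":   lambda side: f"wiggle {side} toes",
}

def respond(colour: str, effector: str) -> str:
    """Apply the abstract colour->side rule, then the chosen output mapping."""
    side = RULE[colour]
    return EFFECTORS[effector](side)

print(respond("blue", "eyes"))    # saccade left
print(respond("yellow", "toes"))  # wiggle right toes
```

The point of the dictionary-of-effectors design is that the rule itself is stored only once; the puzzle the paper addresses is how real neurons, which have fixed wiring rather than a lookup table, could achieve anything like this separation.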

This problem might seem trivial at first, but it is actually very difficult to see how neurons in the brain could achieve this, when ultimately they just make connections (“synapses”) linking inputs (a sensory stimulus) with outputs (a specific muscle movement). Imagine a neuron that truly represented the abstract concept of Blue things = Leftness. The only way this clever “concept neuron” could make us contract just our left interosseous muscle (the muscle in your index finger), or just our lateral extraocular muscle, would be to have synapses that change their configuration completely within a fraction of a second so that they sent signals only down the connections to the finger or the eye muscles. Otherwise the signals from the neuron would either do nothing or have to make our whole body (including our toes!) move left. The idea that cells can change their synapses very rapidly (<1 second) is called neural functional pleomorphism, and we don’t think it happens in the human brain.

In our study we got participants to perform a Blue/Yellow colour – Left/Right response rule switching task in the brain scanner. Sometimes they had to make just an eye movement, and other times just a left/right button press. We found areas of the brain that responded just for eyes and just for hands, as you might expect. But other areas in the prefrontal cerebral cortex didn’t care about the response and showed activity more related to the rules. Yet when we zoomed in on these same areas and analysed the pattern of activity at a finer scale during hand and eye “epochs”, there were significant differences between the two. So what’s going on?

The answer, we think, is that there aren’t any “concept neurons” in your brain. Instead, rules and concepts are embedded in the distributed pattern of activity across networks of neurons. Some of these individual cells might prefer eye movements to the left when we see a blue square, others may send signals to our left finger when we see a blue circle, and others make connections relating to all sorts of different versions and combinations of the general rule: blue things = left. When all the neurons representing the various different examples of the general rule work together, that is when we experience ourselves thinking about the general rule concept. But in order to actually do anything useful, only the sub-set of neurons coding one specific “exemplar” of the rule is active.

So there is no single place in our brain where task rules can be said to be represented and exist!

The actual paper is less philosophical, but it is Open Access and you can download it here.

Quality Adjusted Life Years and the Neuroscience of Fairness

This month we published the first ever fMRI brain imaging study of health care rationing decision making.


Health care funders around the world have to make difficult moral decisions when allocating limited money to treat various medical conditions in different groups of patients: for example, whether or not to fund expensive new drug treatments for a group of cancer patients with a low chance of survival, relative to a drug that will benefit a group with a higher chance of survival. A commonly used framework on which these decisions can be based is Quality Adjusted Life Years, or “QALYs”. This system is based on so-called utilitarian decision making principles, which prioritise choices that deliver the maximum benefit to the greatest number of people. The problem with QALYs is that decisions based on this approach are often viewed negatively by members of the general public. People instead believe that everybody has a “right” to receive medical care and that anything violating this principle is unfair and immoral.
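The utilitarian arithmetic behind QALYs is simple enough to sketch. In the standard framework, QALYs gained = number of patients × extra life-years × a quality-of-life weight (0 = dead, 1 = full health). The numbers below are invented purely for illustration; they are not from our study.

```python
# Hedged illustration of QALY arithmetic (all figures invented for the example).
def total_qalys(patients: int, extra_years: float, quality_weight: float) -> float:
    """QALYs gained = patients * extra life-years * quality weight (0..1)."""
    return patients * extra_years * quality_weight

# Two hypothetical funding options competing for the same budget:
low_survival  = total_qalys(patients=100, extra_years=1.0, quality_weight=0.4)  # 40.0
high_survival = total_qalys(patients=100, extra_years=3.0, quality_weight=0.7)  # 210.0

# A strictly utilitarian rule funds whichever option maximises total QALYs,
# even though many people judge denying the other group "unfair".
print(high_survival > low_survival)  # True
```

It is exactly this kind of unequal allocation, defensible on QALY grounds, that participants in our study were asked to judge.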

Consistent with previous work, when asked to judge the “fairness” of various scenarios depicting a split of funding between different social groups, our participants judged unequal division of funding as “unfair”, even when principles of QALY might indicate otherwise. Interestingly, brain regions linked to emotion as well as cognitive processes were active during decision making. Unequal splits of resources for medical care produced activity in the anterior insula, a region often associated with social/moral disgust (see earlier posts on social norms and eating disorders). Further, under conditions where participants were prepared to judge unequal splits as fair, more activity was seen in the inferior frontal cortex, a region activated when humans inhibit a strong response impulse.

The results represent a preliminary first step for cognitive neuroscience into the field of health economics, and the paper is careful to avoid over-interpreting the findings and applying them to real situations outside the scanner. But the findings are consistent with a bigger idea: that humans have two decision making systems, one cognitive and one more emotional/instinctive. Given enough information, people may be more inclined to support healthcare decisions based on QALYs, but this requires cognitive effort to override a more emotion-based bias towards absolute equality and universal rights.

The paper is out in the June edition of the Journal of Neuroscience, Psychology, and Economics. The research was carried out in collaboration with Prof. Paul Anand (Open University and Health Economics Research Centre at the University of Oxford), Lisa Smith (Flinders University, Australia) and the Exeter Magnetic Resonance Research Centre. A pre-print of the paper is available via the Lincoln Repository website.

Lincoln Triathlon for Parkinson’s UK

In a couple of weeks’ time I’m aiming to complete the David Lloyd Lincoln Sprint Triathlon in aid of Parkinson’s UK. You can sponsor me here.

Much of my published research over the years has looked at how eye movements are affected in Parkinson’s during cognitively demanding tasks such as problem solving, rule learning and task switching (see earlier post). In the long term it’s possible my research could help to develop tests to improve earlier detection of the condition. But what research papers can’t get across is what amazingly nice people Parkinson’s patients are, and how positive they are about helping with research.

Because of this I wanted to make at least a token effort to raise awareness and provide a direct benefit by doing my triathlon in aid of Parkinson’s UK.

Eye movements in the real world

We currently have a fully funded studentship opportunity available to examine how we direct attention to social cues in the real world. Supervised by Dr Frouke Hermens and myself, the project will involve using a mobile eye tracking system. The work builds on earlier research by Frouke, myself and my former PhD student Nicola Gregory (now at Bournemouth) looking at how socio-biological cues such as eye gaze direction and pointing finger cues direct attention in an automatic way.

See here for more details of the project, and here for earlier posts on my work with Nicola on socio-biological cueing.