“Functional Pleomorphism” and the Neuroscience of Rules

Last month we published a paper in Frontiers in Human Neuroscience. This was the latest and last functional magnetic resonance imaging (fMRI) study to come out of a Wellcome Trust-funded project run by myself (whilst at the University of Exeter) and Dr Ben Parris (now at Bournemouth), investigating the organisation and function of cognitive control processes in the human frontal cortex.

Humans are perhaps uniquely able to represent information in an abstract way that allows them to generalise rules and knowledge across different situations and tasks. For example, we might learn to get a reward by pressing a button on the left with our finger when we see a blue shape, and a button on the right when we see a yellow shape. But we can also generalise rules to different situations: instead of pressing a left or right button, we can make an eye movement to the left or right when we see the right colour… or say the word “left” or “right”… or wiggle the toes on our left and right feet!
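The idea of one abstract rule being expressed through many different response systems can be sketched in a few lines of code. This is purely illustrative (the rule, effector names and action strings are all hypothetical, not from the study):

```python
# An abstract colour -> side rule, kept separate from how the response is made
RULE = {"blue": "left", "yellow": "right"}

# Effector-specific ways of expressing the same abstract response
EFFECTORS = {
    "hand": lambda side: f"press the {side} button",
    "eye": lambda side: f"make an eye movement to the {side}",
    "voice": lambda side: f"say '{side}'",
    "toes": lambda side: f"wiggle your {side} toes",
}

def respond(colour, effector):
    """Apply the abstract rule, then translate it into a concrete action."""
    side = RULE[colour]
    return EFFECTORS[effector](side)

print(respond("blue", "hand"))    # press the left button
print(respond("blue", "eye"))     # make an eye movement to the left
print(respond("yellow", "voice")) # say 'right'
```

In code this separation is trivial; the puzzle the post describes is how networks of neurons, which only connect specific inputs to specific muscles, could achieve anything like it.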

This problem might seem trivial at first, but it is actually very difficult to see how neurons in the brain could achieve this, when ultimately all they do is make connections (“synapses”) linking inputs (a sensory stimulus) with outputs (a specific muscle movement). Imagine a neuron that truly represented the abstract concept of Blue things = Leftness. The only way this clever “concept neuron” could make us contract just our left interosseous muscle (the muscle that moves your index finger), or just our lateral extraocular muscle, would be to have synapses that change their configuration completely within a fraction of a second, so that signals were sent only down the connections to the finger or the eye muscles. Otherwise the signals from the neuron would either do nothing or have to make our whole body (including our toes!) move left. The idea that cells can change their synapses very rapidly (<1 second) is called neural functional pleomorphism, and we don’t think it happens in the human brain.

In our study we got participants to perform a Blue/Yellow colour – Left/Right response rule-switching task in the brain scanner. Sometimes they had to make just an eye movement and other times just a left/right button press. We found areas of the brain that responded just for eyes and just for hands, as you might expect. But other areas in the prefrontal cerebral cortex didn’t care about the response and showed activity more related to the rules. However, when we zoomed in on these same areas and analysed the pattern of activity at a finer scale during Hand and Eye “epochs”, there were significant differences between the two. So what’s going on?

The answer, we think, is that there aren’t any “concept neurons” in your brain. Instead, rules and concepts are embedded in the distributed pattern of activity across networks of neurons. Some of these individual cells might prefer eye movements to the left when we see a blue square, others may send signals to our left finger when we see a blue circle, and others make connections relating to all sorts of different versions and combinations of the general rule: blue things = left. When all the neurons representing the various different examples of the general rule work together, that is when we experience ourselves thinking about the general rule concept. But in order to actually do anything useful, only the sub-set of neurons coding one specific “exemplar” of the rule is active.
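This exemplar-coding idea can be sketched very crudely in code. The “neurons”, stimuli and actions below are all made up for illustration; the point is only that the general rule exists as the whole population, while behaviour is driven by the one matching subset:

```python
# Each "neuron" codes one specific exemplar of the general rule blue -> left,
# i.e. one particular (stimulus, effector, action) combination.
# There is no single "concept neuron" coding the abstract rule.
neurons = [
    {"stimulus": "blue square", "effector": "eye",  "action": "look left"},
    {"stimulus": "blue circle", "effector": "hand", "action": "press left"},
    {"stimulus": "blue square", "effector": "hand", "action": "press left"},
    {"stimulus": "blue circle", "effector": "eye",  "action": "look left"},
]

def active_subset(stimulus, effector):
    """Only the neurons coding this exact exemplar fire to drive behaviour."""
    return [n for n in neurons
            if n["stimulus"] == stimulus and n["effector"] == effector]

# The whole population embodies "blue = left"; one exemplar drives the response:
print(active_subset("blue square", "eye"))  # one neuron, action "look left"
```

On this sketch, fMRI activity averaged over the whole population would look rule-related and effector-independent, while fine-grained patterns would still differ between Hand and Eye trials, much as the study found.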

So there is no single place in our brain where task rules can be said to be represented and exist!

The actual paper is less philosophical, but it is Open Access and you can download it here.

Quality Adjusted Life Years and the Neuroscience of Fairness

This month we published the first ever fMRI brain imaging study of health care rationing decision making.

[Figure: decision making screens]

Health care funders around the world have to make difficult moral decisions about how to allocate limited money to treat various medical conditions in different groups of patients: for example, whether or not to fund expensive new drug treatments for a group of cancer patients with a low chance of survival relative to a drug that will benefit a group with a higher chance of survival. A commonly used framework on which these decisions can be based is Quality Adjusted Life Years, or “QALYs”. This system is based on so-called utilitarian decision-making principles, which prioritise choices that deliver the maximum benefit to the greatest number of people. The problem with QALYs is that decisions based on this approach are often viewed negatively by members of the general public. People instead believe that everybody has a “right” to receive medical care, and that anything violating this principle is unfair and immoral.
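The basic QALY arithmetic is simple to illustrate. The numbers below are hypothetical (they are not taken from the study); a QALY combines extra life years with a quality-of-life weight running from 0 (dead) to 1 (perfect health):

```python
# Simplified illustration of the QALY framework with made-up numbers.
def qalys_gained(extra_years, quality_weight):
    """QALYs gained = extra life years x quality-of-life weight (0 to 1)."""
    return extra_years * quality_weight

# Two hypothetical treatments competing for the same fixed budget:
drug_a = qalys_gained(extra_years=2.0, quality_weight=0.4)  # lower benefit
drug_b = qalys_gained(extra_years=5.0, quality_weight=0.7)  # higher benefit

# A strictly utilitarian rule funds whichever option delivers more total
# QALYs, even though many people judge denying either group "unfair".
print(round(drug_a, 2), round(drug_b, 2))  # 0.8 3.5 -> fund drug B
```

It is exactly this kind of maximising calculation that participants in the study often rejected as unfair, despite its utilitarian logic.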

Consistent with previous work, when asked to judge the “fairness” of various scenarios depicting a split of funding between different social groups, our participants judged unequal division of funding as “unfair”, even when QALY principles might indicate otherwise. Interestingly, brain regions linked to emotion as well as cognitive processes were active during decision making. Unequal splits of resources for medical care produced activity in the anterior insula, a region often associated with social/moral disgust (see earlier posts on social norms and eating disorders). Further, under conditions where participants were prepared to judge unequal splits as fair, more activity was seen in the inferior frontal cortex, a region activated when humans inhibit a strong response impulse.

The results represent a preliminary first step for cognitive neuroscience into the field of health economics, and the paper is careful to avoid over-interpreting the findings or applying them to real situations outside the scanner. But the findings are consistent with a bigger idea: that humans have two decision-making systems, one cognitive and one more emotional/instinctive. Given enough information, people may be more inclined to support healthcare decisions based on QALYs, but this requires cognitive effort to override a more emotion-based bias towards absolute equality and universal rights.

The paper is out in the June edition of the Journal of Neuroscience, Psychology, and Economics. The research was carried out in collaboration with Prof. Paul Anand (Open University and Health Economics Research Centre at the University of Oxford), Lisa Smith (Flinders University, Australia) and the Exeter Magnetic Resonance Research Centre. A pre-print of the paper is available via the Lincoln Repository website.

Lincoln Triathlon for Parkinson’s UK

In a couple of weeks’ time I’m aiming to complete the David Lloyd Lincoln Sprint Triathlon in aid of Parkinson’s UK. You can sponsor me here.

Much of my published research over the years has looked at how eye movements are affected in Parkinson’s during cognitively demanding tasks such as problem solving, rule learning and task switching (see earlier post). In the long term it’s possible my research could help to develop tests to improve earlier detection of the condition. But what research papers can’t get across is what amazingly nice people Parkinson’s patients are, and how positive they are about helping with research.

Because of this I wanted to make at least a token effort to raise awareness and provide a direct benefit by doing my triathlon in aid of Parkinson’s UK.

Eye movements in the real world


We currently have a fully funded studentship opportunity available to examine how we direct attention to social cues in the real world. Supervised by Dr Frouke Hermens and myself, the project will involve using a mobile eye-tracking system. The work builds on earlier research by Frouke, myself and my former PhD student Nicola Gregory (now at Bournemouth) looking at how socio-biological cues, such as eye gaze direction and pointing fingers, direct attention in an automatic way.

 

See here for more details of the project, and see here for earlier posts on my work with Nicola on socio-biological cueing.

 

“EyeLander” game for children with VI now available!

For the last two years I have been working with the WESC Foundation in Exeter to develop a computer game to improve vision in children and young people with partial visual loss. In EyeLander you play the role of a character (the “EyeLander”) who must escape from an island using her visual skills. You have to make your way through a series of challenges to escape the erupting volcano, including dodging lava, an angry cow and a giant laughing baby! The only way to get past them to your boat is to find variously coloured target shapes hidden amongst distracting items on the screen.
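The core mechanic is a classic visual search trial: one target defined by colour and shape hidden among distractors. A minimal sketch of how such a trial could be generated (hypothetical code, not EyeLander's actual implementation):

```python
import random

# Made-up feature sets for illustration
COLOURS = ["red", "green", "blue", "yellow"]
SHAPES = ["circle", "square", "triangle"]

def make_trial(target, n_items=8, seed=None):
    """Build a search display: one target plus distractors that differ
    from it in colour and/or shape, shuffled into random positions."""
    rng = random.Random(seed)
    distractors = [(c, s) for c in COLOURS for s in SHAPES if (c, s) != target]
    items = rng.sample(distractors, n_items - 1) + [target]
    rng.shuffle(items)
    return items

trial = make_trial(target=("blue", "square"), n_items=8, seed=1)
print(trial.index(("blue", "square")))  # position the player must find
```

In the game, of course, the display positions, rewards and difficulty progression do the real training work; this sketch only shows the target-among-distractors structure.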

Although based closely on visual search training that has been shown to be effective in adults with hemianopia (see earlier post), EyeLander is unique in that it has been developed in collaboration with children at WESC and social computing researchers from Lincoln’s Computer Science department, adopting a user-centred design approach. We believe this will make visual search training more effective and fun for children, and even adults, with visual field loss.

We will be evaluating the effectiveness of the game over the next few months and are interested in hearing from you if you have any form of partial visual loss and would be willing to take part in the evaluation. We are also seeking involvement from children in the South West with normal vision to take part in the research by playing the game and being an EyeLander!

Please contact me (tlhodgson@lincoln.ac.uk) or Jonathan Waddington (JWaddington@wescfoundation.ac.uk, tel: 01392 454200) if you would like to know more or take part in the research.
