Hemi-disconnection project at Great Ormond Street Hospital

This week I have been at Great Ormond Street Hospital, London, setting up some tasks on an EyeLink Duo eye tracker for a project led by Dr Luis Lacerda and Prof. Chris Clark, with Vision Specialist Clinical Scientist Sian Hanley.

The project will examine children’s recovery of visual function following hemisphere disconnection surgery (which can be used to treat severe epilepsy). Luis’s team will evaluate the effectiveness of our Eyelander game for training visual search ability in these children, as well as UCL’s Read Right programme, developed by my old colleague and collaborator Prof. Alex Leff.

It’s been great to visit and a privilege to be involved in such an exciting project with such an outstanding team of researchers and clinicians. Feel free to get in touch with Luis if you would like to know more about the project.

Eyelander game evaluation and Parkinson’s and Spatial Memory studies published

Research in patients, both young and old, can be difficult, time-consuming and stressful to carry out (e.g. due to the ethical approval process, patient recruitment and the practical difficulties of testing patients with physical disabilities). Yet the importance of such research, and its potential benefits to the patients themselves, far outweighs the difficulty entailed in conducting it.

Two of my recently published papers reflect the outcome of patient-based projects. Both studies use tasks which require viewers to search through items on a screen using saccadic eye movements. The first addressed the issue of working memory and oculomotor control in Parkinson’s disease, a topic I have been researching since the late 1990s. The second reports the clinical trial evaluating the effectiveness of the Eyelander video game for children who have had neurological injury leading to partial visual field loss (hemianopia).

In the first study, published in the April 2019 edition of the Journal of Cognitive Neuroscience, we recorded eye movements while participants performed a version of the CANTAB Spatial Working Memory task, which requires patients to search through boxes on a computer screen to find hidden tokens. I first had the idea for this study whilst watching patients perform the task on a touch screen when I was a post-doctoral research fellow at Charing Cross Hospital, London. I could see that patients were using eye movements a lot in this token “foraging” task, but at the time we didn’t have the technology to track their eye movements properly. It was only later that suitable eye-tracking equipment and software became available to carry out the research. Amongst other findings, the paper shows that people with Parkinson’s don’t use eye movements to plan ahead, or to look back at locations they’ve already searched, as effectively as controls, most likely due to an imbalance of the neurotransmitter dopamine in the prefrontal cerebral cortex.


The second paper, published in the December 2018 edition of the Journal of Visual Impairment & Blindness, describes the evaluation of our visual search game for children with partial visual loss following brain injury affecting the visual parts of the cerebral cortex. The results showed that children were able to play the game at home unsupervised, and that it had a positive effect on parallel measures of functional visual ability, similar in magnitude to the effects reported for visual search training in adults with partial visual loss following stroke. The Eyelander game is now available for anyone to play online, so please take a look. We are also starting a collaborative project with Great Ormond Street Hospital to evaluate its effectiveness for treating visual field loss following neurosurgical procedures in children.

“Functional Pleomorphism” and the Neuroscience of Rules

Last month we published a paper in Frontiers in Human Neuroscience. This was the latest, and last, functional magnetic resonance imaging (fMRI) study to come out of a Wellcome Trust funded project run by myself (whilst at the University of Exeter) and Dr Ben Parris (now at Bournemouth), investigating the organisation and function of cognitive control processes in the human frontal cortex.

Humans are perhaps uniquely able to represent information in an abstract way, which allows us to generalise rules and knowledge across different situations and tasks. For example, we might learn to get a reward by pressing a button on the left with our finger when we see a blue shape, and a button on the right when we see a yellow shape. But we can also generalise the rule to different situations: instead of pressing a left or right button, we can make an eye movement to the left or right when we see the right colour… or say the word “left” or “right”… or wiggle the toes on our left or right foot!

This problem might seem trivial at first, but it is actually very difficult to see how neurons in the brain could achieve it, when ultimately they just make connections (“synapses”) linking inputs (a sensory stimulus) with outputs (a specific muscle movement). Imagine a neuron that truly represented the abstract concept of Blue things = Leftness. The only way this clever “concept neuron” could make us contract just our left interosseous muscle (a muscle in your index finger), or just our lateral extraocular muscle, would be to have synapses that change their configuration completely within a fraction of a second, so that they sent signals only down the connections to the finger or the eye muscles. Otherwise the signals from the neuron would either do nothing, or have to make our whole body (including our toes!) move left. The idea that cells can change their synapses very rapidly (<1 second) is called neural functional pleomorphism, and we don’t think it happens in the human brain.

In our study we had participants perform a Blue/Yellow colour – Left/Right response rule-switching task in the brain scanner. Sometimes they had to make just an eye movement, and other times just a left/right button press. We found areas of the brain that responded just for eyes and just for hands, as you might expect. But other areas in the prefrontal cerebral cortex didn’t care about the response and showed activity more related to the rules. Yet when we zoomed in on these same areas and analysed the pattern of activity at a finer scale during hand and eye “epochs”, there were significant differences between the two. So what’s going on?

The answer, we think, is that there aren’t any “concept neurons” in your brain. Instead, rules and concepts are embedded in the distributed pattern of activity across networks of neurons. Some individual cells might prefer eye movements to the left when we see a blue square, others may send signals to our left finger when we see a blue circle, and others make connections relating to all sorts of different versions and combinations of the general rule: blue things = left. When all the neurons representing the various different examples of the general rule work together, that is when we experience ourselves thinking about the general rule as a concept. But in order to actually do anything useful, only the sub-set of neurons coding one specific “exemplar” of the rule is active.
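The logic of this distributed code can be sketched with a toy simulation. This is purely illustrative (the units, their names and the voting readout are all hypothetical inventions, not a model of our data): each simulated unit responds only to one specific colour–effector pairing, yet a simple readout across the whole population recovers the general rule, while no single unit fires for every instance of “blue → left”.

```python
# Toy sketch of distributed rule coding (hypothetical units for illustration).
# Each unit is tuned to one specific (colour, effector) combination.
units = [
    ("blue", "eye"), ("blue", "hand"),
    ("yellow", "eye"), ("yellow", "hand"),
]

def population_response(colour, effector):
    """Activity vector: 1 where a unit's preferences match the trial, else 0."""
    return [int(u == (colour, effector)) for u in units]

def decode_direction(activity):
    """Population readout: blue-preferring units vote 'left',
    yellow-preferring units vote 'right'."""
    left = sum(a for a, (c, _) in zip(activity, units) if c == "blue")
    right = sum(a for a, (c, _) in zip(activity, units) if c == "yellow")
    return "left" if left > right else "right"

# The rule generalises across effectors at the population level...
assert decode_direction(population_response("blue", "eye")) == "left"
assert decode_direction(population_response("blue", "hand")) == "left"
assert decode_direction(population_response("yellow", "eye")) == "right"

# ...yet no single unit is active on every "blue -> left" trial,
# i.e. there is no "concept neuron" for the abstract rule.
blue_trials = [population_response("blue", e) for e in ("eye", "hand")]
always_on = [i for i in range(len(units)) if all(t[i] for t in blue_trials)]
assert always_on == []
```

On any one trial only the exemplar-specific units are active, which is consistent with the fine-scale pattern differences between hand and eye epochs described above.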

So there is no single place in our brain where task rules can be said to exist!

The actual paper is less philosophical, but it is Open Access and you can download it here.