We have a paper published this month in the journal Experimental Brain Research, based on two years of data collected at the Lincoln Summer Scientist event, looking at how children's saccadic eye movements are affected by directional socio-biological cues (see previous posts here and here).
We report results from 137 children who performed a pro-saccade task presented as a computer game, in which they had to keep their eyes on a cartoon bee that jumped unpredictably from the middle to the left or right of a computer screen. The EyeLink II system was used to measure how quickly and accurately the children followed the bee while pictures of arrows, and photographs of pointing hands and eyes, appeared in the middle of the screen just before the "busy bee" character moved (see the YouTube video).
We found that children were distracted by the direction of the pointing pictures: their eyes were quicker to move towards the cartoon bee when she jumped in the same direction as the pointing finger, eyes or arrow (Congruent trials) than when she jumped in the opposite direction (Incongruent trials). Interestingly, for the youngest group of children (3–5 years) this effect was strongest for pointing fingers; only the older children showed the effect for eyes and arrows. The paper makes the case that children have to learn to link what they see in the world around them with the direction of interesting information and events. One of the first "cues" to attention that young children learn may be the direction of an adult's pointing index finger.
Another interesting finding was that, for the youngest group of children, when eye-gaze cues overlapped with the onset of the peripheral target bee, a large proportion of "omission errors" were made: the children missed the target completely and didn't make a saccade at all. I was involved in the testing myself at Summer Scientist, and I found this feature of young children's behaviour particularly fascinating. It seems strikingly similar to the stimulus "extinction" and neglect seen in adult stroke patients. Rather than simply not moving their eyes, I think the 3–4 year olds didn't "see" the bee under these conditions, and this is something I hope to follow up at future Summer Scientist weeks.
The paper is fully open access and available here. See here for another recent Experimental Brain Research study, which I've only recently seen, by a group in Oslo showing ERP responses to finger-pointing cues in babies.
Last month we published a paper in Frontiers in Human Neuroscience. This was the latest, and last, functional magnetic resonance imaging (fMRI) study to come out of a Wellcome Trust funded project run by me (whilst at the University of Exeter) and Dr Ben Parris (now at Bournemouth), investigating the organisation and function of cognitive control processes in the human frontal cortex.
Humans are perhaps uniquely able to represent information in an abstract way, which allows us to generalise rules and knowledge across different situations and tasks. For example, we might learn to get a reward by pressing a button on the left with our finger when we see a blue shape, and a button on the right when we see a yellow shape. But we can also generalise the rule to different situations: instead of pressing a left or right button, we can make an eye movement to the left or right when we see the right colour… or say the word "left" or "right"… or wiggle the toes on our left and right feet!
This problem might seem trivial at first, but it is actually very difficult to see how neurons in the brain could achieve it, when ultimately they just make connections ("synapses") linking inputs (a sensory stimulus) with outputs (a specific muscle movement). Imagine a neuron that truly represented the abstract concept "blue things = leftness". The only way this clever "concept neuron" could make us contract just our left interosseous muscle (a muscle in your index finger), or just our lateral extraocular muscle, would be to have synapses that change their configuration completely within a fraction of a second, so that they only sent signals down the connections to the finger muscles or the eye muscles. Otherwise the signals from the neuron would either do nothing, or have to make our whole body (including our toes!) move left. The idea that cells can change their synapses very rapidly (<1 second) is called neural functional pleomorphism, and we don't think it happens in the human brain.
In our study we had participants perform a blue/yellow colour – left/right response rule-switching task in the brain scanner. Sometimes they had to make just an eye movement, and at other times just a left/right button press. We found areas of the brain that responded only for eyes, and others only for hands, as you might expect. But other areas, in the prefrontal cerebral cortex, didn't care about the response type and showed activity related more to the rules. Yet when we zoomed in on these same areas and analysed the pattern of activity at a finer scale during the hand and eye "epochs", there were significant differences between the two. So what's going on?
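To make the logic of that finer-scale analysis concrete, here is a toy numerical sketch (my own illustration with made-up numbers, not the paper's actual data or analysis pipeline): two conditions can evoke exactly the same average activity in a region, so a coarse analysis sees no difference, while still differing reliably in their fine-scale pattern across voxels.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "region" of 8 voxels. Hand and eye epochs evoke the SAME mean
# activity overall, but DIFFERENT fine-scale patterns across voxels.
hand_template = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0])
eye_template = np.array([0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0])

def simulate_epochs(template, n=50, noise=0.3):
    """Generate n noisy activity patterns around a template."""
    return template + noise * rng.standard_normal((n, template.size))

hand_epochs = simulate_epochs(hand_template)
eye_epochs = simulate_epochs(eye_template)

# Coarse view: average over all voxels -> both conditions look alike
# (each template averages to 0.5, plus near-zero noise).
print(hand_epochs.mean(), eye_epochs.mean())

def classify(epoch):
    """Assign an epoch to whichever template it correlates with more."""
    r_hand = np.corrcoef(epoch, hand_template)[0, 1]
    r_eye = np.corrcoef(epoch, eye_template)[0, 1]
    return "hand" if r_hand > r_eye else "eye"

# Fine view: the voxel-wise pattern separates the conditions almost
# perfectly, even though the region's mean activity does not.
acc = np.mean([classify(e) == "hand" for e in hand_epochs] +
              [classify(e) == "eye" for e in eye_epochs])
print(acc)
```

The region-average numbers come out essentially identical, while the pattern classifier is near ceiling, which is the same dissociation we saw between the coarse and fine-scale analyses.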
The answer, we think, is that there aren't any "concept neurons" in your brain. Instead, rules and concepts are embedded in the distributed pattern of activity across networks of neurons. Some individual cells might prefer eye movements to the left when we see a blue square, others may send signals to our left finger when we see a blue circle, and others make connections relating to all sorts of different versions and combinations of the general rule "blue things = left". When all the neurons representing the various different examples of the general rule work together, that is when we experience ourselves thinking about the general rule concept. But in order to actually do anything useful, only the sub-set of neurons coding one specific "exemplar" of the rule is active.
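As a toy sketch of this idea (my own illustration, not a model from the paper), imagine a population in which each unit codes one concrete stimulus–effector exemplar. The abstract rule "blue = left" exists only as the whole set of units taken together, while behaviour on any given trial is driven by just the effector-appropriate subset, so no synapse ever has to rewire:

```python
# Each "unit" codes one concrete exemplar of the abstract rule,
# binding a stimulus colour and an effector to a specific output.
RULE_UNITS = [
    {"colour": "blue", "effector": "hand", "output": "left button"},
    {"colour": "blue", "effector": "eye", "output": "left saccade"},
    {"colour": "blue", "effector": "voice", "output": "say left"},
    {"colour": "yellow", "effector": "hand", "output": "right button"},
    {"colour": "yellow", "effector": "eye", "output": "right saccade"},
    {"colour": "yellow", "effector": "voice", "output": "say right"},
]

def respond(colour, effector):
    """Only the sub-population matching BOTH the stimulus and the
    currently task-relevant effector drives behaviour."""
    active = [u for u in RULE_UNITS
              if u["colour"] == colour and u["effector"] == effector]
    return active[0]["output"]

print(respond("blue", "eye"))    # the eye-movement exemplar of "blue = left"
print(respond("blue", "hand"))   # the button-press exemplar of the same rule
```

No single unit here "is" the rule blue = left; the rule is only visible across the population, yet each trial still produces exactly one concrete action.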
So there is no single place in our brain where a task rule can be said to be represented and to exist!
The actual paper is less philosophical, but it is Open Access and you can download it here.
In a couple of weeks' time I'm aiming to complete the David Lloyd Lincoln Sprint Triathlon in aid of Parkinson's UK. You can sponsor me here.
Much of my published research over the years has looked at how eye movements are affected in Parkinson's during cognitively demanding tasks such as problem solving, rule learning and task switching (see earlier post). In the long term it's possible my research could help to develop tests for earlier detection of the condition. But what research papers can't get across is what amazingly nice people Parkinson's patients are, and how positive they are about helping with research.
Because of this I wanted to make at least a token effort to raise awareness and provide a direct benefit by doing my triathlon in aid of Parkinson's UK.
We currently have a fully funded studentship opportunity available to examine how we direct attention to social cues in the real world. Supervised by Dr Frouke Hermens and me, the project will involve using a mobile eye-tracking system. It builds on earlier work by Frouke, me and my former PhD student Nicola Gregory (now at Bournemouth) looking at how socio-biological cues, such as eye gaze direction and pointing fingers, direct attention in an automatic way.
See here for more details of the project, and see here for earlier posts on my work with Nicola on socio-biological cueing.