Visual Attention and Palaeolithic Stone Tools

This month we published a study describing how people direct eye movements when viewing ancient stone tools. The work was led by Maria Silva Gago, Emiliano Bruner and other researchers from CENIEH and the University of Burgos, Spain, together with myself and former Lincoln PhD student Flora Ioannidou (now at the University of Aberdeen).

The research used eye tracking and the “mouse click” attention tracking technique to measure which areas of stone tools attracted people’s interest most. Participants were shown high-resolution photographs of Palaeolithic hand axes (example right), along with much more ancient (approximately 2-million-year-old) roughly worked tools (example below). We found that people’s attention was drawn to the “knapped” surfaces of the tools, as well as to areas such as the base where the tool would normally be grasped. This was the case even though participants were only viewing pictures, were not handling the objects, and were not told that the objects were tools.

We also ran the images through a computer model which calculated the “salience” of different regions of the images. Some very influential models of vision suggest that attention is simply attracted to regions that stand out visually from the background (i.e. are more “salient”), but the computer model did not explain our results very well. Instead, we think it was how the different parts of the tool might be grasped and used, rather than just how visually interesting each part was, that was directing the viewers’ eyes.
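For anyone curious what this kind of salience comparison looks like in practice, below is a minimal Python sketch. It is not the analysis pipeline from the paper: it uses OpenCV’s spectral-residual saliency model purely as a stand-in for whichever salience model was used in the study, and the image file name and fixation coordinates are hypothetical.

```python
# Minimal sketch (not the paper's pipeline): compare image salience at
# fixated locations against randomly sampled locations, using OpenCV's
# spectral-residual saliency model (opencv-contrib-python) as a stand-in.
import cv2
import numpy as np

def mean_saliency_at_points(image_path, fixations, n_random=1000, seed=0):
    """fixations: list of (x, y) pixel coordinates of recorded fixations."""
    img = cv2.imread(image_path)
    if img is None:
        raise FileNotFoundError(image_path)

    saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
    ok, sal_map = saliency.computeSaliency(img)  # 2D float map, roughly 0-1
    if not ok:
        raise RuntimeError("saliency computation failed")

    h, w = sal_map.shape
    fix = np.asarray(fixations, dtype=int)
    fix_vals = sal_map[fix[:, 1].clip(0, h - 1), fix[:, 0].clip(0, w - 1)]

    rng = np.random.default_rng(seed)
    rand_vals = sal_map[rng.integers(0, h, n_random), rng.integers(0, w, n_random)]

    return fix_vals.mean(), rand_vals.mean()

# Hypothetical example: the file name and coordinates are for illustration only.
# fix_mean, rand_mean = mean_saliency_at_points("handaxe.jpg", [(412, 310), (398, 521)])
# print(f"salience at fixations: {fix_mean:.3f} vs random points: {rand_mean:.3f}")
```

If fixated locations turn out to be no more salient than randomly sampled ones, visual salience alone is a poor explanation of where people looked, which is essentially the pattern we found.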

The results support the theory of object-action affordances proposed by James J. Gibson in the 1970s. He suggested that the brain very quickly detects features of objects that are relevant for action, priming a tendency for us to carry out the associated action. One possibility is that, through evolution, our ancestors’ brains became increasingly sophisticated at detecting action affordances in stone objects. This led in turn to the manufacture of more sophisticated tools with enhanced features designed to activate action affordances, which may in turn have driven further development of the brain’s object-action affordance network in an ongoing process of coevolution between human cognitive capacities and tool complexity.

The full paper is published in the journal Perception and can be read online here.

Some example eye movement sequences recorded in the study are shown below.

Orienting to social cues in Parkinson’s

This month we published a paper in the journal Experimental Brain Research reporting how eye movements are influenced by social cues in people with Parkinson’s disease.

Previous studies have suggested that Parkinson’s patients have problems in directing visual attention due to loss of the chemical dopamine within the basal ganglia of the brain, but no one has specifically looked at whether they have particular problems with social cues. This question is important because problems with processing social information may lead to difficulties in everyday life for people with Parkinson’s. It is also of scientific interest, as it has been suggested that we have specialised brain pathways for processing social information (the so-called “Social Brain hypothesis”).

We used a task in which pictures of someone else’s eyes, arrows and pointing fingers are shown on a computer screen, whilst people track a spot jumping around the screen with their eyes (see picture; you can see a video of the task here). People often make an eye movement by mistake in the direction indicated by the eyes, arrows or fingers, even when they are meant to look directly at the target (the little black spot) and ignore the pictures. We found that healthy adults made more of these errors with pointing-finger cues than with the other cue types (suggesting pointing fingers provide particularly strong cues to eye movements), whereas people with Parkinson’s made similarly high numbers of errors for all three cue types.

Although they were not specifically better or worse with the socially relevant cues (eyes and fingers), Parkinson’s patients clearly had difficulty in suppressing the distracting influence of all the cues. This suggests the basal ganglia play a role in controlling eye movements in response to both social and non-social cues. It does not follow, however, that people with Parkinson’s have no particular problems with maintaining attention in social situations: social interaction often takes place in busy environments, with lots of sensory information competing for attention. These deficits in the control of eye movements might therefore lead to difficulty in social interaction and might impact on patients’ quality of life.

A link to the paper can be found here.

Assessing premorbid IQ with picture vocabulary tests

An important task in neuropsychology is establishing how a patient’s current state compares with how they were before they suffered a brain injury. Assessing “premorbid” IQ is also of key importance in research studies that compare cognitive abilities in patients with those of a group of healthy participants, as it is important that the two groups are as well matched as possible.

The most common way of assessing premorbid IQ to date is the National Adult Reading Test, or “NART”. This short word-reading test includes words with regular and irregular spellings and is highly correlated with IQ, whilst also “holding” after brain injury (that is, remaining relatively unaffected compared with other tests of cognitive function). The problem with the NART is that it requires patients to read words out loud. Brain injury can also impair speech, meaning that the test cannot be used with patients who have conditions such as aphasia.

We have recently published a paper in the journal Applied Neuropsychology: Adult which examined the effectiveness of the British Picture Vocabulary Scale (BPVS) as an assessment of premorbid ability. In this test, patients point to pictures matching words spoken by the examiner. Although it was originally developed to assess vocabulary in young children, we found that the BPVS appears to be at least as good as the NART at “holding” after brain injury and assessing premorbid IQ.

Along with other similar picture-based tests, we think the BPVS could be a useful tool for assessing premorbid IQ in research as well as in clinical neuropsychology practice.

Hemi-disconnection project at Great Ormond Street Hospital

This week I have been at Great Ormond Street Hospital, London, setting up some tasks on an EyeLink Duo eye tracker for a project led by Dr Luis Lacerda and Prof. Chris Clark, together with Vision Specialist Clinical Scientist Sian Hanley.

The project will examine children’s recovery of visual function following hemisphere disconnection surgery (which can be used to treat severe epilepsy). Luis’ team will evaluate the effectiveness of our Eyelander game for training visual search ability in these children, as well as UCL’s Read Right programme, developed by my old colleague and collaborator Prof. Alex Leff.

It’s been great to visit and a privilege to be involved in such an exciting project with an outstanding team of researchers and clinicians. Feel free to get in touch with Luis if you would like to know more about the project.

Eyelander Goes Mobile

The Eyelander game for children with visual field loss can now be used on mobile devices such as phones and tablets!

The game is based on visual search training that has been shown to be effective in improving functional visual abilities in adults with homonymous hemianopia. Our recent evaluation trial showed that Eyelander delivered improvements in functional visual abilities in children and young adults of a similar magnitude to those from the (rather more boring) adult training programmes. You can play the game, and sign up to give us feedback on it for our research, via the Eyelander website http://www.eyelander.co.uk. It is free to play and is designed to be colourful, fun and engaging for children. Players search for shapes on the screen which help their character escape from a mysterious island.

We have been taking a step-by-step approach to making the game more widely available as we build the evidence base for its effectiveness, and we decided now was the time to release it for tablets and phones. The game is actually more fun to play with a touch screen than with a mouse and cursor, so I am really pleased with the results.


The game was developed in collaboration with The WESC Foundation (Exeter), the School of Computer Science at the University of Lincoln, and Mutant Labs Ltd (Plymouth). See these previous blog posts on the game’s development and evaluation:

Eyelander game evaluation and Parkinsons and Spatial Memory studies published
“EyeLander” game for children with VI now available!
“Game-ifying” visual search training for children