
Details of Grant 

EPSRC Reference: EP/D009898/1
Title: The perceptual template governing saccadic decisions
Principal Investigator: Ludwig, Dr C
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Department: Experimental Psychology
Organisation: University of Bristol
Scheme: Overseas Travel Grants Pre-FEC
Starts: 15 June 2005 Ends: 14 August 2005 Value (£): 5,903
EPSRC Research Topic Classifications:
Vision & Senses - ICT appl.
EPSRC Industrial Sector Classifications:
No relevance to Underpinning Sectors
Related Grants:
Panel History:  
Summary on Grant Application Form
Our eyes move frequently to obtain new information from our surroundings. These eye movements, called saccades, play a critical role in key behaviours such as reading, searching (looking for a pen on a cluttered desk), and orienting to new or surprising events (your cat jumping out from behind a bush). They determine what we can see, and are thus crucial to our everyday functioning and survival.

Humans make about three or four of these saccades every second. Necessarily, each of these movements is preceded by some decision about where to look next. I am interested in how we make these kinds of decisions. What people look at is, to a large extent, determined by what they are looking for. For instance, when you are looking for your pen, your brain must contain some kind of representation of what your pen looks like. As you scan your desk, this representation is matched against the visual information falling on your retina. If there is a good match between the input and the representation, you have found what you are looking for.

In many environments, however, what we are looking for is not as well defined as in this example. In many situations there is not only uncertainty about what exactly the target looks like (e.g. looking for something to write with, as opposed to my red ballpoint pen), but the incoming visual signals are themselves uncertain or noisy (e.g. objects might be partly obscured).

In the proposed studies, we present human observers with a number of patches on a computer screen. At the same time, we record their eye movements using an eye tracker, a machine that uses cameras to record where people are looking at any point in time. The observers are asked to look for the brightest patch on the screen. We create uncertainty in what they are looking for by randomly varying the size of the patches. Even though size is, strictly speaking, irrelevant to the task at hand, uncertainty in this dimension means that observers cannot hold an exact representation of what they are looking for. In addition, we introduce uncertainty in the environment by randomly perturbing the brightness of each pixel in the display. This random perturbation is referred to as visual noise. As a result of the noise, some of the dimmer items in the display may actually end up looking a little brighter.

By testing observers extensively over many trials, we can relate their eye movement decisions to the properties of the random visual noise. That is, we can analyse the noise on all the trials in which the observer looked at one of the dimmer patches rather than the brightest one. Using sophisticated statistical techniques, we can identify the representation against which observers matched the incoming visual signals. With these tools we can ask a number of important questions about this representation; for example, whether observers can hold multiple representations active at any one time.

Characterising how the human brain makes saccadic decisions under conditions of uncertain visual input will improve our understanding of the factors that determine where we look. Knowing how people decide where to look is important in a number of settings including, for example, human-computer interaction and air traffic control. In addition, this knowledge sheds light on how humans make (perceptual) decisions in general.
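The trial-by-trial analysis outlined above is, in essence, a reverse-correlation (classification-image) technique. The Python sketch below illustrates that logic on simulated data: pixel noise is added to a set of luminance patches, a simulated observer picks the patch that best matches an internal template, and the noise on trials where a dimmer patch was fixated is averaged to recover an estimate of that template. Everything in the sketch (patch size, noise level, the circular template, and the simulated observer itself) is a hypothetical stand-in rather than the project's actual stimuli, data, or analysis pipeline, and the size randomisation used to create template uncertainty is omitted for brevity.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical simulation parameters (not the project's actual values).
n_trials, n_patches, patch_px = 5000, 4, 16
target_increment = 0.1      # luminance increment of the brightest patch
noise_sd = 1.0              # standard deviation of pixel-wise luminance noise

# Circular internal template the simulated observer matches against each patch.
yy, xx = np.mgrid[:patch_px, :patch_px]
radius = np.hypot(yy - (patch_px - 1) / 2, xx - (patch_px - 1) / 2)
template = (radius < patch_px / 3).astype(float)
template /= np.linalg.norm(template)

# Stimuli: patch 0 carries the luminance increment; every pixel gets noise.
noise = rng.normal(0.0, noise_sd, size=(n_trials, n_patches, patch_px, patch_px))
stimuli = noise.copy()
stimuli[:, 0] += target_increment

# Simulated saccadic decision: fixate the patch with the largest template match.
match = np.einsum('tpij,ij->tp', stimuli, template)
chosen = match.argmax(axis=1)

# Classification image from error trials: average noise at the patch the
# observer fixated minus the noise at the (missed) brightest patch.
errors = chosen != 0
chosen_noise = noise[np.arange(n_trials), chosen]
classification_image = (chosen_noise[errors].mean(axis=0)
                        - noise[errors, 0].mean(axis=0))

# The recovered image should correlate with the template that drove the decisions.
r = np.corrcoef(classification_image.ravel(), template.ravel())[0, 1]
print(f"correlation between recovered and true template: {r:.2f}")

With enough error trials, the averaged noise image comes to resemble the internal template, which is the basic idea behind relating saccadic choices to the random noise in the display.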
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary:
Date Materialised:
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.bris.ac.uk