EPSRC Reference: EP/D002281/1
Title: The information used to perceive binocular motion in depth
Principal Investigator: Harris, Professor J
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Department: Psychology
Organisation: University of St Andrews
Scheme: Standard Research (Pre-FEC)
Starts: 01 April 2006
Ends: 30 September 2009
Value (£): 257,960
EPSRC Research Topic Classifications:
Biomedical neuroscience
Vision & Senses - ICT appl.

EPSRC Industrial Sector Classifications:

Related Grants:

Panel History:
Summary on Grant Application Form |
Because our eyes are positioned side by side in our heads, there is a large portion of our visual field that can be seen through both eyes at the same time. But because each eye is in a slightly different location, each eye receives a slightly different view of the world, and hence each eye's image is slightly different. The small differences between the two eyes' images are called binocular disparities, and it has been known for about 150 years that our brains use disparity to help us see depth and shape in the world. In fact, our brains are exquisitely sensitive to disparity: we can detect depth differences as small as the thickness of a sheet of paper at twice arm's reach. Binocular disparity is also potentially useful for perceiving how objects move in depth. During object motion in three dimensions (3-D), the binocular disparity of the object will change. However, there will also be differences between the motion signals in the two eyes. For example, if an object moves directly towards the nose, that object's image will move in one direction in one eye and in the opposite direction in the other eye. If we want to understand how the brain sees 3-D object motion, we must first find out whether it uses binocular disparity, motion differences between the eyes, or both. Such knowledge is critically important for furthering our understanding of basic brain processes. It also has potential application to the enhancement of virtual reality (VR) technology: VR systems can be designed to exploit what the visual system is most sensitive to.

Over the last 10 years, several research groups have attempted to test which of these sources of visual information is used by the brain, using simple visual tests that attempt to isolate the two. The results have been controversial. One problem is that complex tricks have to be employed to eliminate one of the sources of information (in the real world, both are always present). These manipulations result in visual stimuli that are noisy and that may not have completely eliminated all the expected information.

In this project we propose not just to test human vision with such stimuli. In addition, we will design models called 'ideal observers': mathematical models designed to use all the information available in a stimulus. For example, an ideal observer for detecting changing disparity will use all the disparity information within a visual stimulus. Indeed, applying the model to a stimulus is a way of testing what information that stimulus contains. By using these models, we can design visual stimuli that contain both sources of information and calculate how well an ideal observer could use that information to see 3-D motion. We then compare human performance against the model, rather than simply testing whether people can see the motion or not. For the first time, we will be able to test the relative importance of binocular disparity and motion information for seeing 3-D object motion.
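The two candidate cues described in the summary (the changing binocular disparity of a moving object, and the difference between the motion signals in the two eyes) can be illustrated with a small simulation. This is a hedged sketch only: the trajectory, sampling rate, and numbers are invented for illustration and are not taken from the project.

```python
# Illustrative sketch: an object moving directly toward the nose projects
# an image that drifts one way in the left eye and the opposite way in the
# right eye. From the two image trajectories we recover the two candidate
# cues named in the summary. All values here are made up for illustration.

DT = 0.01                               # time step (s), assumed
ts = [i * DT for i in range(101)]       # 1 second of samples
x_left = [0.5 * t for t in ts]          # left-eye image position (deg)
x_right = [-0.5 * t for t in ts]        # right-eye image position (deg)

def velocity(xs, dt):
    """Finite-difference velocity between successive samples."""
    return [(b - a) / dt for a, b in zip(xs, xs[1:])]

# Cue 1: changing binocular disparity (disparity = left minus right position)
disparity = [xl - xr for xl, xr in zip(x_left, x_right)]
cd_rate = velocity(disparity, DT)

# Cue 2: difference between the two eyes' motion signals
iovd = [vl - vr for vl, vr in zip(velocity(x_left, DT), velocity(x_right, DT))]

# In natural viewing the two cues always co-vary, because
# d/dt (x_left - x_right) equals v_left - v_right.
assert all(abs(a - b) < 1e-9 for a, b in zip(cd_rate, iovd))
```

The final assertion shows why, as the summary notes, "complex tricks" are needed to study either cue in isolation: for any real 3-D trajectory the two signals are mathematically locked together, so neither is ever absent on its own.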
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary:
Date Materialised:

Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Project URL:
Further Information:
Organisation Website: http://www.st-and.ac.uk