Coordinating the eyes and hand in goal-directed movement sequences

Authors

Bowman, Miles

Date

2009-11-13

Type

thesis

Language

eng

Keyword

eye hand coordination, motor control

Abstract

Coordinated gaze and hand movements predominate in many of our interactions in reachable space, and yet few studies examine the potential contribution of tactile feedback in planning these actions. This thesis was designed to investigate eye and hand coordination during movement sequences when reaching out to interact with objects. We developed a virtual reality paradigm that allowed us to control the visual, tactile, and in some cases auditory feedback provided to participants. Participants reached and touched five objects in succession. We measured the behaviour that resulted from removing one or more of these sources of feedback, focusing on task accuracy and on the timing and dynamics of eye and hand movements. Our principal manipulations were to remove visual feedback of the hand and/or to change the object's response to contact. We also unexpectedly removed the tactile feedback signalling contact. In Experiment 1, we examined gaze and hand movement timing relative to contact events. Gaze remained at each object long enough to capture contact in central vision, but also followed a time course indicating that contact timing was predicted. In Experiment 2, we examined the influence of dynamic object consequences (i.e., motion). Gaze remained to monitor consequences that followed initial contact, especially when the hand was invisible; with longer delays it became difficult to differentiate between predictive and reactive movements. In Experiment 3, we directly tested whether gaze would hold on a site of action during prolonged manipulation. Here, gaze remained past the time of contact, and its departure was instead associated with the completion of the action. Our findings are congruent with the notion that visually guided reaches are controlled to facilitate directing the hand to viewed locations of action: without visual feedback of the hand, accuracy diminished and the hand's approach changed across all experiments. However, we provide consistent evidence that gaze is also controlled to capture planned sensory consequences related to action at its viewed location. Monitoring these sites would facilitate comparing predicted sensory events with those that are actively measured, improving control throughout the movement sequence. Such a process also indicates the importance of considering tactile feedback when examining coordinated eye and hand movements.

Description

Thesis (Ph.D., Neuroscience Studies) -- Queen's University, 2009-11-13

License

This publication is made available by the authority of the copyright owner solely for the purpose of private study and research and may not be copied or reproduced except as permitted by the copyright laws without written authority from the copyright owner.
