Coordinating the eyes and hand in goal-directed movement sequences

dc.contributor.author: Bowman, Miles
dc.contributor.department: Neuroscience Studies
dc.contributor.supervisor: Flanagan, J. Randall
dc.date: 2009-11-13 15:29:46.771
dc.date: 2009-11-13 16:12:30.086
dc.date.accessioned: 2009-11-13T21:16:05Z
dc.date.available: 2009-11-13T21:16:05Z
dc.date.issued: 2009-11-13T21:16:05Z
dc.degree.grantor: Queen's University at Kingston
dc.description: Thesis (Ph.D, Neuroscience Studies) -- Queen's University, 2009-11-13 16:12:30.086
dc.description.abstract: Coordinated gaze and hand movements predominate in many of our interactions in reachable space, and yet few studies examine the potential contribution of tactile feedback in planning these actions. This thesis was designed to investigate eye and hand coordination during movement sequences when reaching out to interact with objects. We developed a virtual reality paradigm that allowed us to control the visual, tactile, and, in some cases, auditory feedback provided to participants. Participants reached out and touched five objects in succession. We measured the behaviour that resulted from removing one or more of the aforementioned sources of feedback, focusing on task accuracy and on the timing and dynamics of eye and hand movements. Our principal manipulations were to remove visual feedback of the hand and/or to change the object's response to contact. We also unexpectedly removed the tactile feedback signaling contact. In Experiment 1, we examined gaze and hand movement timing relative to contact events. Gaze remained long enough to capture contact in central vision, but also followed a time course indicating that contact timing was predicted. In Experiment 2, we examined the influence of dynamic object consequences (i.e., motion). Gaze remained to monitor consequences that followed initial contact, especially when the hand was invisible; with longer delays it became difficult to differentiate between predictive and reactive movements. In Experiment 3, we directly tested whether gaze would hold upon a site of action during prolonged manipulation. Here, gaze remained past contact time, and its departure was instead associated with the completion of the action. Our findings are congruent with the notion that visually guided reaches are controlled to facilitate directing the hand to viewed locations of action: without visual feedback of the hand, accuracy diminished and the hand's approach changed across all experiments. However, we provide consistent evidence that gaze is also controlled to capture planned sensory consequences related to action at its viewed location. Monitoring these sites would facilitate comparing predicted sensory events with those that are actively measured, and would improve control throughout the movement sequence. Such a process also indicates the importance of considering tactile feedback when examining coordinated eye and hand movements.
dc.description.degree: PhD
dc.format.extent: 2072540 bytes
dc.format.mimetype: application/pdf
dc.identifier.uri: http://hdl.handle.net/1974/5317
dc.language.iso: eng
dc.relation.ispartofseries: Canadian theses
dc.rights: This publication is made available by the authority of the copyright owner solely for the purpose of private study and research and may not be copied or reproduced except as permitted by the copyright laws without written authority from the copyright owner.
dc.subject: eye hand coordination
dc.subject: motor control
dc.title: Coordinating the eyes and hand in goal-directed movement sequences
dc.type: thesis
Files
Original bundle
Name: Bowman_Miles_C_200911_PhD.pdf
Size: 1.98 MB
Format: Adobe Portable Document Format