
dc.contributor.author  Buchan, Julie N.
dc.contributor.other  Queen's University (Kingston, Ont.). Theses (Queen's University (Kingston, Ont.))
dc.date  2011-09-30 23:31:07.754
dc.date.accessioned  2011-10-11T18:16:46Z
dc.date.available  2011-10-11T18:16:46Z
dc.date.issued  2011-10-11
dc.identifier.uri  http://hdl.handle.net/1974/6835
dc.description  Thesis (Ph.D., Psychology) -- Queen's University, 2011-09-30 23:31:07.754
dc.description.abstract  Most events that we encounter in everyday life provide our different senses with correlated information, and audiovisual speech perception is a familiar instance of multisensory integration. This thesis uses several approaches to further examine the role of cognitive factors in audiovisual speech perception. Its main focus is the influence of cognitive load and selective attention on audiovisual speech perception, as well as the integration of auditory and visual information in talking distractor faces. The influence of cognitive factors on the temporal integration of auditory and visual speech, and on gaze behaviour during audiovisual speech, is also addressed. Overall, the results of the experiments presented here suggest that the integration of auditory and visual speech information is quite robust to various attempts to modulate it. Adding a cognitive load task produces minimal disruption of the integration of auditory and visual speech information. Instructing subjects to selectively attend to either the auditory or the visual speech information also has a rather modest influence on the observed integration. In general, the integration of temporally offset auditory and visual information seems insensitive to cognitive load and selective attention manipulations. The processing of visual information from distractor faces appears to be limited. The language of the visually articulating distractors does not appear to provide information that helps match the auditory and visual speech streams together. Audiovisual speech distractors are no more distracting than auditory distractor speech paired with a still image, suggesting limited processing or integration of the visual and auditory distractor information.
Gaze behaviour during audiovisual speech perception appears to be relatively unaffected by an increase in cognitive load, but is somewhat influenced by attentional instructions to selectively attend to the auditory or visual information. Additionally, both the congruency of the consonant and the temporal offset of the auditory and visual stimuli have small but rather robust influences on gaze.
dc.language  en
dc.language.iso  en
dc.relation.ispartofseries  Canadian theses
dc.rights  This publication is made available by the authority of the copyright owner solely for the purpose of private study and research and may not be copied or reproduced except as permitted by the copyright laws without written authority from the copyright owner.
dc.subject  attention
dc.subject  cognitive load
dc.subject  audiovisual distractors
dc.subject  audiovisual speech perception
dc.subject  temporal integration
dc.subject  perception
dc.subject  multisensory integration
dc.subject  selective attention
dc.title  Cognitive resources in audiovisual speech perception
dc.type  thesis
dc.description.degree  Ph.D.
dc.contributor.supervisor  Munhall, Kevin G.
dc.contributor.department  Psychology

