
    An Augmented Reality Haptic Training Simulator for Spinal Needle Procedures

    Sutherland_Colin_J_201111_MASC.pdf (6.162Mb)
    Date
    2011-11-29
    Author
    Sutherland, Colin James
    Abstract
    Medical simulators have become a common way to teach new procedures to medical students and clinicians. Their accessibility allows trainees to practice whenever they wish, and their flexibility allows various patient body types and conditions to be simulated. This is in contrast to in vivo training, which requires direct supervision from a trained clinician and access to a live patient or cadaver, both of which have restrictions.

    This thesis proposes a novel prototype system for spinal anesthesia procedures which combines a haptic device for virtual, ultrasound (US)-guided needle simulations with a physical mannequin registered to a patient-specific computed tomography (CT) volume in order to create an augmented reality (AR) overlay. The mannequin provides the user with a sense of spatial awareness that is not present in a purely virtual simulation, as well as physical visual cues for navigating the patient. Another novel aspect is the simulation of US images from CT images deformed via a finite element model (FEM).

    The system is composed of a torso mannequin from Sawbones Inc., a MicronTracker2 optical tracking system from Claron Technology, a Sensable PHANToM Premium 1.5A haptic device, and a graphical user interface (GUI) to display relevant visual feedback. The GUI allows the user to view the AR overlay on the video feed, along with the CT slice and simulated US image based on the position/orientation of a dummy US probe.

    Forces during the insertion are computed via the FEM and sent to the haptic device. These include the cutting force at the needle tip, friction along the length of the needle inside the body, and the restoring force from displacing the needle off its original insertion axis. Input to the system consists of a patient CT volume.
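    The three-component force decomposition described above can be sketched as follows. This is a minimal illustration, not the thesis's actual FEM-based model: the linear spring/friction forms and all constants (`k_tip`, `mu`, `k_lat`) are hypothetical placeholders chosen for clarity.

    ```python
    import numpy as np

    def needle_force(tip_pos, entry_pos, entry_axis, depth, velocity,
                     k_tip=0.5, mu=0.1, k_lat=2.0):
        """Illustrative decomposition of the haptic needle force into tip,
        friction, and lateral-restoring components. Units and constants
        are placeholders, not values from the thesis."""
        axis = entry_axis / np.linalg.norm(entry_axis)

        # 1. Tip cutting force: opposes insertion, here proportional to depth.
        f_tip = -k_tip * depth * axis

        # 2. Shaft friction: opposes axial motion, scales with the length
        #    of needle currently inside the tissue.
        v_axial = np.dot(velocity, axis)
        f_friction = -mu * depth * np.sign(v_axial) * axis

        # 3. Lateral restoring force: spring pushing the needle back toward
        #    its original insertion axis (off-axis offset of the tip).
        rel = tip_pos - entry_pos
        offset = rel - np.dot(rel, axis) * axis
        f_lateral = -k_lat * offset

        return f_tip + f_friction + f_lateral
    ```

    In a haptic loop, a function like this would be evaluated at the device's servo rate (typically ~1 kHz for a PHANToM) and the resulting vector commanded to the device.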

    The system is able to create forces that closely match those reported in the literature. A user study, consisting of subjects with expertise ranging from familiarity with medical imaging to clinical experience with needle insertion procedures, was performed to qualitatively analyze the performance of the system. Three experienced physicians were also consulted for input and improvements. The feedback received from the questionnaire, together with comments from the subjects and physicians, showed that the system simulates a real needle insertion quite well and that the added graphical aids were helpful during the training procedure.
    URI for this record
    http://hdl.handle.net/1974/6889
    Collections
    • Queen's Graduate Theses and Dissertations
    • Department of Electrical and Computer Engineering Graduate Theses
    Request an alternative format
    If you require this document in an alternate, accessible format, please contact the Queen's Adaptive Technology Centre.

    DSpace software copyright © 2002-2015  DuraSpace
    Contact Us
    Theme by 
    Atmire NV
     

     
