Contributors: Rombokas, Eric; Orsborn, Amy
Author: Karrenbach, Maxim Amon
Date available: 2021-10-29
Date issued: 2021
File: Karrenbach_washington_0250O_23420.pdf
URI: http://hdl.handle.net/1773/47876
Description: Thesis (Master's)--University of Washington, 2021

Abstract: Users of prosthetic limbs face many challenges operating their devices in everyday tasks. Tasks such as stair descent and object grasping involve many aspects that a person without limb impairment handles largely subconsciously. A person with a limb amputation or impairment bears the full weight of this cognitive load and is often forced to work around the limitations of the prosthesis with compensation strategies, such as the "overhanging toe" in stair descent and shoulder compensations in grasping tasks. My research aims to ease the burden placed on prosthetic limb users by applying the learning capabilities of data-driven neural network methods, particularly to object grasping. Gaze2Grasp is a machine learning algorithm that uses eye-tracked gaze to predict preferable wrist rotations of a virtual prosthesis. Because it takes vision as input and produces wrist rotations as output, Gaze2Grasp could be implemented in any prosthetic controller with wrist degrees of freedom, and it demonstrates the value of vision in object grasping. The algorithm may also be useful for other applications, such as remote-operated robotic grasping systems.

Format: application/pdf
Language: en-US
Rights: CC BY
Subjects: Assistive Device; Machine Learning; Prosthesis Control; Electrical and computer engineering; Computer science
Title: Gaze2Grasp: Vision-based system for pre-grasp prosthesis control
Type: Thesis
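The abstract describes a vision-in, wrist-rotation-out mapping. The following is a minimal illustrative sketch of that idea only: a small feed-forward network taking a gaze feature vector and emitting wrist rotation angles. The input/output dimensions, layer size, feature choice, and function names are all assumptions for illustration, not the architecture actually used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

GAZE_DIM = 6   # assumed: 3D gaze origin + 3D gaze direction
HIDDEN = 32    # assumed hidden-layer width
WRIST_DOF = 3  # assumed wrist degrees of freedom

# Randomly initialized weights stand in for trained parameters.
W1 = rng.normal(0.0, 0.1, (GAZE_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, WRIST_DOF))
b2 = np.zeros(WRIST_DOF)

def predict_wrist_rotation(gaze: np.ndarray) -> np.ndarray:
    """Map a gaze feature vector to wrist rotation angles in radians."""
    h = np.tanh(gaze @ W1 + b1)          # hidden layer
    return np.pi * np.tanh(h @ W2 + b2)  # bound each angle to (-pi, pi)

gaze_sample = rng.normal(size=GAZE_DIM)
angles = predict_wrist_rotation(gaze_sample)
print(angles.shape)  # one angle per assumed wrist degree of freedom
```

In practice such a model would be trained on paired gaze recordings and preferred grasp orientations; the sketch only shows the shape of the input-to-output mapping that lets it sit in front of any wrist controller.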