Virtual Reality for Intracranial Brain Computer Interface Design and Human Neural Engineering

dc.contributor.advisor: Rao, Rajesh P.N.
dc.contributor.advisor: Ojemann, Jeffrey
dc.contributor.author: Paschall, Courtnie
dc.date.accessioned: 2023-01-21T05:01:25Z
dc.date.available: 2023-01-21T05:01:25Z
dc.date.issued: 2023-01-21
dc.date.submitted: 2022
dc.description: Thesis (Ph.D.)--University of Washington, 2022
dc.description.abstract: Implanted neural devices allow direct recording and stimulation of neural tissue. Immersive virtual reality (VR) offers a powerful new experimental venue, granting researchers unprecedented control over the visual and aural landscape to develop complex tasks with nuanced controls and global variable access that would otherwise be impossible. Modern VR hardware also provides precise, continuous tracking of real-world objects, namely VR controllers, headsets, and trackers, in six degrees of freedom (three translational and three rotational). In my PhD research, I sought to leverage the advantages of VR to advance neuroscience research in a unique and valuable patient population. To do this, I built the first platform integrating immersive VR with awake, in-patient, invasive neural recording, aligning highly resolved neural signals to ongoing VR task variables and human behavior [1]. I then expanded this initial platform to enable direct electrical stimulation (DES) of the brain in response to behaviors and events tracked in the virtual environment. This capability allowed me to explore novel approaches to neurohaptic feedback and demonstrate the first bidirectional brain-computer interface in virtual reality (VR-BCI) [2]. By applying DES to the primary somatosensory cortex (S1), I enabled two human subjects to reach out, grasp, and feel a purely virtual object, and then to discriminate between visually identical virtual objects based only on their neurohaptic S1-DES profiles. This was the first time that humans used a neurohaptic cortical interface responsive to their volitional, active engagement to explore the feel of an immersive virtual environment.
Then, incorporating online digital signal processing, I demonstrated a real-time bidirectional VR-BCI that enabled the first human "cerebronaut" (so coined by our research subject) to use gaze to select an object, a neural trigger to initiate animated grasp of that object [3], and S1-DES to feel the contact between the animated virtual hand and the virtual object.
dc.embargo.terms: Open Access
dc.format.mimetype: application/pdf
dc.identifier.other: Paschall_washington_0250E_24954.pdf
dc.identifier.uri: http://hdl.handle.net/1773/49603
dc.language.iso: en_US
dc.relation.haspart: VRBCI_Clip.mp4 (video)
dc.rights: CC BY-NC-ND
dc.subject: Data science
dc.subject: Intracranial electrodes
dc.subject: Neural engineering
dc.subject: Virtual reality
dc.subject: Bioengineering
dc.subject: Neurosciences
dc.subject.other: Bioengineering
dc.title: Virtual Reality for Intracranial Brain Computer Interface Design and Human Neural Engineering
dc.type: Thesis

Files

Original bundle (2 of 2):

- Paschall_washington_0250E_24954.pdf (6.65 MB, Adobe Portable Document Format)
- VRBCI_Clip.mp4 (17.17 MB, video)