Surgical Scene Understanding Towards Human-Centered Collaboration in Robotic Surgery
| dc.contributor.advisor | Hannaford, Blake | |
| dc.contributor.author | Kalavakonda Chandrasekar, Niveditha | |
| dc.date.accessioned | 2025-10-02T16:08:27Z | |
| dc.date.issued | 2025-10-02 | |
| dc.date.submitted | 2025 | |
| dc.description | Thesis (Ph.D.)--University of Washington, 2025 | |
| dc.description.abstract | Robot-assisted minimally invasive surgery brings together the expertise of highly skilled surgeons with the increased precision and dexterity of assistive robots. Surgical robots currently in deployment provide surgeons with enhanced visualization, filter out hand tremors, and incorporate data integration and analytics to facilitate improved patient care. Most surgical robots, however, are still constrained to being teleoperated by a surgeon, with limited advances towards integrating automation in the operating room. Automation in surgery has the potential to extend the current benefits of robotic surgery by reducing variability and delivering more consistent care. While studies have explored automating tasks performed by a surgeon, the role of the first assistant during surgeries has often been underappreciated in the context of automation. Yet this role is pivotal in ensuring the smooth progression of procedures, maintaining a clear surgical field, and providing critical support to the primary surgeon. To this end, offloading some sub-tasks performed by the first assistant to automated systems can reduce the risk of human error and optimize resource allocation. Building on the viability of camera sensors as a reliable input, the goal of this thesis was to aid the development of an autonomous suction assistance tool by developing the infrastructure for surgical scene understanding. This work developed models for efficient binary, parts, and instrument segmentation of surgical scenes, established approaches for addressing the paucity of labeled data in surgical settings, and developed a framework to benchmark approaches to cooperative autonomy through the lens of an assistive suction task. | |
| dc.embargo.lift | 2026-10-02T16:08:27Z | |
| dc.embargo.terms | Delay release for 1 year -- then make Open Access | |
| dc.format.mimetype | application/pdf | |
| dc.identifier.other | KalavakondaChandrasekar_washington_0250E_28934.pdf | |
| dc.identifier.uri | https://hdl.handle.net/1773/53996 | |
| dc.language.iso | en_US | |
| dc.rights | CC BY-NC | |
| dc.subject | Robotics | |
| dc.subject.other | Electrical and computer engineering | |
| dc.title | Surgical Scene Understanding Towards Human-Centered Collaboration in Robotic Surgery | |
| dc.type | Thesis |
