Enhancing Demonstration-Based Recognition Tools with Interactive Declarative Guidance for Authoring Touch Gestures
The rise of ubiquitous touchscreen devices highlights both the need and the opportunity for touch gesture interaction. However, creating touch gesture support remains difficult, and using a pre-packaged library for a fixed set of touch gestures limits expressiveness. This dissertation addresses this problem through a set of novel touch gesture recognition tools. It examines the implementation challenges that developers face in authoring touch gestures and discusses the design, development, and validation of three touch gesture recognition tools. Specifically, this dissertation presents: (1) Gesture Coder, a purely demonstration-based tool for authoring multi-touch gestures; (2) Gesture Studio, a tool for authoring multi-touch gestures that enhances a demonstration-based approach with declarative composition through a timeline visualization; and (3) Gesture Script, a tool for authoring symbolic gestures that enhances a demonstration-based approach by allowing explicit declarative specification of gesture structures through rendering scripts. These systems illustrate new strategies for designing interactive declarative guidance that enhances demonstration-based recognition tools. Together, they demonstrate the thesis of this dissertation: enhancing demonstration-based recognition tools with interactive declarative guidance can provide developers with new forms of control over their learning systems, thereby preserving the low threshold of demonstration-based systems while raising the ceiling on their capabilities.