Extending the capabilities of smartphone sensors for applications in interaction and health
Computing devices continually evolve and, in a relatively short time, have morphed from room-sized machines into something much smaller. Today, a mobile computer can be kept in our pocket or worn as a wristwatch. These devices are sufficiently powerful for the majority of tasks a typical user performs on a computer. While they are slowly catching up with desktops in raw computational power, they already have much richer sensing capabilities. On-device sensors give mobile devices an unprecedented opportunity not only to provide richer interactions, but also to enrich quality of life. Acknowledging that mobile devices cannot be loaded with every possible sensor, I believe a large share of the sensing responsibility will fall to the generic sensors already on our devices, such as the camera, microphone, motion sensors, and touchscreen. In this dissertation, I provide support for my thesis statement: "The generic sensors on mobile devices can be used as substitutes for dedicated sensors in interaction and health applications. In presence of noise and uncertainty, multiple generic sensors can contribute to enable robust and deployable user-facing sensing systems." In this thesis, I: 1. show how to extend the capabilities of on-device sensors, 2. identify some of the challenges developers and users may face while using these generic sensors, 3. discuss how some of these challenges can be countered by combining multiple generic sensors, and 4. provide some direction for how these sensors should evolve in the future. In terms of application area, there are many domains where smartphone sensors can prove useful, but this thesis focuses on two: interaction and health sensing.