On-Body Interaction
Project Description
On-body interaction, which employs the user's own body as an interactive surface, offers several advantages over existing touchscreen devices: always-available control, an expanded input space, and additional proprioceptive and tactile cues that support non-visual use. While past work has explored a variety of approaches, such as wearable depth cameras, bio-acoustics, and infrared reflectance (IR) sensors, these systems do not instrument the gesturing finger, do not easily support multiple body locations, and have not been evaluated with visually impaired users (our target population). In our work, we have been exploring computer-vision-based approaches using a suite of sensors mounted on the finger. In early work, we established the feasibility of localizing small image patches of skin on the user's body (ICPR 2016). Later, we built real-time prototypes and evaluated them with sighted and blind users (ASSETS'17 and IMWUT'17).
Publications
Investigating Microinteractions for People With Visual Impairments and the Potential Role of On-Body Interaction
Proceedings of ASSETS 2017 | Acceptance Rate: 26.2% (33 / 126)
Localization of Skin Features on the Hand and Wrist From Small Image Patches
Proceedings of ICPR 2016
Talks
TouchCam: Realtime Recognition of Location-Specific On-Body Gestures to Support Users With Visual Impairments
Oct 08, 2018 | UbiComp 2018
Singapore