Project Description

2014–2018
On-body interaction, which employs the user's own body as an interactive surface, offers several advantages over existing touchscreen devices: always-available control, an expanded input space, and additional proprioceptive and tactile cues that support non-visual use. While past work has explored a variety of approaches, such as wearable depth cameras, bio-acoustic sensing, and infrared (IR) reflectance sensors, these systems do not instrument the gesturing finger, do not easily support multiple body locations, and have not been evaluated with visually impaired users (our target population).

In our work, we have been exploring computer vision-based approaches using a finger-mounted camera. In particular, we introduce TouchCam, a finger wearable that supports location-specific, on-body interaction. TouchCam combines data from infrared reflectance sensors, inertial measurement units (IMUs), and a small camera to classify body locations and gestures using supervised learning. We empirically evaluate TouchCam's performance through a series of offline experiments followed by a real-time interactive user study with 12 blind and visually impaired participants. In our offline experiments, we achieve high accuracy (>96%) at recognizing coarse-grained touch locations (e.g., palm, fingers) and location-specific gestures (e.g., tap on wrist, left swipe on thigh). The follow-up user study validated our real-time system and revealed trade-offs among on-body interface designs in terms of accuracy, convenience, and social acceptability. Our findings also highlight challenges to robust input sensing for visually impaired users and suggest directions for the design of future on-body interaction systems.
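To make the sensor-fusion and classification idea concrete, the sketch below shows one minimal, hypothetical pipeline: per-sensor features are concatenated into a single vector and fed to an off-the-shelf supervised classifier (here scikit-learn's SVC). The feature dimensions, class labels, and choice of classifier are illustrative assumptions, not TouchCam's actual implementation.

    # Hypothetical sketch: fusing IR, IMU, and camera features for
    # touch-location classification with a generic supervised classifier.
    # Feature contents and dimensions are placeholder assumptions.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def fuse_features(ir, imu, camera):
        """Concatenate per-sample features from each sensor into one vector."""
        return np.concatenate([ir, imu, camera])

    # X: one fused feature vector per touch event; y: body-location labels.
    rng = np.random.default_rng(0)
    X = np.stack([
        fuse_features(rng.normal(size=4),    # e.g., IR reflectance readings
                      rng.normal(size=12),   # e.g., IMU motion/orientation stats
                      rng.normal(size=59))   # e.g., camera texture descriptors
        for _ in range(200)
    ])
    y = rng.integers(0, 3, size=200)         # e.g., 0=palm, 1=fingers, 2=wrist

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X, y)
    print(clf.predict(X[:5]))                # predicted touch locations

In a real system, the same trained classifier would be applied to fused features computed from live sensor streams, with gesture recognition handled by an analogous location-specific model.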

Publications

TouchCam: Realtime Recognition of Location-Specific On-Body Gestures to Support Users with Visual Impairments

Lee Stearns, Uran Oh, Leah Findlater, Jon E. Froehlich

Proceedings of the ACM IMWUT, December 2017

Investigating Microinteractions for People with Visual Impairments and the Potential Role of On-Body Interaction

Uran Oh, Lee Stearns, Alisha Pradhan, Jon E. Froehlich, Leah Findlater

Proceedings of ASSETS 2017 | Acceptance Rate: 26.2% (33 / 126)