Project Description

2013–2018
On-body interaction, which employs the user's own body as an interactive surface, offers several advantages over existing touchscreen devices: always-available control, an expanded input space, and additional proprioceptive and tactile cues that support non-visual use. While past work has explored a variety of approaches, such as wearable depth cameras, bio-acoustics, and infrared (IR) reflectance sensors, these systems do not instrument the gesturing finger itself, do not easily support multiple body locations, and have not been evaluated with visually impaired users (our target population). In our work, we have been exploring computer vision-based approaches using a suite of sensors mounted on the finger. In early work, we established the feasibility of identifying where on the user's body a small, close-up image of the skin was captured (ICPR 2016). We then built real-time prototypes and evaluated them with sighted and blind users (ASSETS 2017 and IMWUT 2017).
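To give a flavor of the patch-based localization idea above, the sketch below shows one minimal way to classify body location from a small skin-image patch: compute a texture descriptor per patch and train a classifier over labeled locations. The feature choice (uniform local binary pattern histograms), the helper names, and the example labels are illustrative assumptions, not the exact pipeline from our papers.

```python
# Illustrative sketch (not the published pipeline): classify which body
# location a small grayscale skin patch came from, using uniform LBP
# texture histograms and a linear SVM. Labels and function names are
# hypothetical, for illustration only.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import LinearSVC

def lbp_histogram(patch, radius=2, n_points=16):
    """Uniform LBP histogram: a compact texture descriptor for a 2D patch."""
    lbp = local_binary_pattern(patch, n_points, radius, method="uniform")
    n_bins = n_points + 2  # uniform patterns plus one non-uniform bin
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist

def train_locator(train_patches, train_labels):
    # train_labels: body-location ids, e.g., 0 = palm, 1 = wrist (hypothetical)
    X = np.stack([lbp_histogram(p) for p in train_patches])
    return LinearSVC().fit(X, train_labels)

def predict_location(clf, patch):
    return clf.predict(lbp_histogram(patch)[None, :])[0]
```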

Publications

TouchCam: Realtime Recognition of Location-Specific On-Body Gestures to Support Users with Visual Impairments

Lee Stearns, Uran Oh, Leah Findlater, Jon E. Froehlich

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), December 2017

Investigating Microinteractions for People with Visual Impairments and the Potential Role of On-Body Interaction

Uran Oh, Lee Stearns, Alisha Pradhan, Jon E. Froehlich, Leah Findlater

Proceedings of ASSETS 2017 | Acceptance Rate: 26.2% (33 / 126)

Localization of Skin Features on the Hand and Wrist from Small Image Patches

Lee Stearns, Uran Oh, Bridget Cheng, Leah Findlater, David Ross, Rama Chellappa, Jon E. Froehlich

Proceedings of ICPR 2016