TouchCam

Quick Info

Project Dates: Sept. 23, 2014 - Oct. 11, 2018
Sponsors: DoD
Keywords: accessibility, computer vision, blind, low-vision, on-body interaction

About

TouchCam uses sensor fusion and computer vision to support location-specific, on-body interaction.

One of our blind participants using TouchCam

On-body interaction, which employs the user’s own body as an interactive surface, offers several advantages over existing touchscreen devices: always-available control, an expanded input space, and additional proprioceptive and tactile cues that support non-visual use. While past work has explored a variety of approaches such as wearable depth cameras, bio-acoustics, and infrared reflectance (IR) sensors, these systems do not instrument the gesturing finger, do not easily support multiple body locations, and have not been evaluated with visually impaired users (our target).

In our work, we have been exploring computer-vision-based approaches using a finger-mounted camera. In particular, here we introduce TouchCam, a finger wearable that supports location-specific, on-body interaction. TouchCam combines data from infrared sensors, inertial measurement units, and a small camera to classify body locations and gestures using supervised learning. We empirically evaluate TouchCam's performance through a series of offline experiments followed by a real-time interactive user study with 12 blind and visually impaired participants. In our offline experiments, we achieve high accuracy (>96%) at recognizing coarse-grained touch locations (e.g., palm, fingers) and location-specific gestures (e.g., tap on wrist, left swipe on thigh). The follow-up user study validated our real-time system and helped reveal tradeoffs between various on-body interface designs (e.g., accuracy, convenience, social acceptability). Our findings also highlight challenges to robust input sensing for visually impaired users and suggest directions for the design of future on-body interaction systems.
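To make the recognition pipeline more concrete, the sketch below shows one way the supervised-learning step could be structured: simple features from the infrared sensors, the inertial measurement unit, and the camera image are concatenated into a single fused vector and passed to an off-the-shelf classifier. This is an illustrative sketch rather than the actual TouchCam implementation; the feature choices, the window handling, and the use of a support-vector machine are assumptions.

# Hypothetical sketch of TouchCam-style sensor fusion + supervised classification.
# The per-modality features, window length, and SVM classifier are assumptions,
# not the published TouchCam pipeline.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def extract_features(ir, imu, image_patch):
    """Concatenate simple per-modality features into one fused vector.

    ir          : (n_samples, n_ir_channels) infrared reflectance readings
    imu         : (n_samples, 6) accelerometer + gyroscope readings
    image_patch : (H, W) grayscale skin-texture patch from the finger camera
    """
    ir_feats = np.concatenate([ir.mean(axis=0), ir.std(axis=0)])
    imu_feats = np.concatenate([imu.mean(axis=0), imu.std(axis=0)])
    # Coarse texture descriptor: a normalized intensity histogram of the patch.
    hist, _ = np.histogram(image_patch, bins=32, range=(0, 255), density=True)
    return np.concatenate([ir_feats, imu_feats, hist])

def train_location_classifier(X, y):
    """X: one fused feature vector per touch event.
    y: body-location labels, e.g. "palm", "fingers", "wrist", "thigh"."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    clf.fit(X, y)
    return clf

In the same spirit, a second classifier of this form could map the motion trace recorded at a recognized location to a location-specific gesture (e.g., distinguishing a tap on the wrist from a left swipe on the thigh).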

Publications

TouchCam: Realtime Recognition of Location-Specific On-Body Gestures to Support Users with Visual Impairments

Lee Stearns, Uran Oh, Leah Findlater, Jon E. Froehlich

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), December 2017

Keywords: wearables, accessibility, computer vision, low-vision, blind users, on-body interaction, skin texture classification, gesture recognition


Investigating Microinteractions for People with Visual Impairments and the Potential Role of On-Body Interaction

Uran Oh, Lee Stearns, Alisha Pradhan, Jon E. Froehlich, Leah Findlater

Proceedings of ASSETS 2017 | Acceptance Rate: 26.2% (33 / 126)

Keywords: wearables, visual impairments, mobile, on-body interaction, microinteraction


Talks

TouchCam: Realtime Recognition of Location-Specific On-Body Gestures to Support Users with Visual Impairments

Oct. 8, 2018 | UbiComp 2018

Singapore

Uran Oh



