Makeability Lab
On-Body Interaction

Quick Info

Project Date: Oct. 17, 2013 – Oct. 11, 2018
Sponsors: DoD
Keywords: computer vision, on-body interaction

About

[Figure: On-Body Interaction Examples] We have designed and explored three on-body interaction techniques, including: (a) location-independent taps and swipes that can be performed anywhere on the body, (b) location-specific input that allows users to directly access a specific set of applications…

On-body interaction, which employs the user’s own body as an interactive surface, offers several advantages over existing touchscreen devices: always-available control, an expanded input space, and additional proprioceptive and tactile cues that support non-visual use. While past work has explored a variety of approaches such as wearable depth cameras, bio-acoustics, and infrared reflectance (IR) sensing, these systems do not instrument the gesturing finger, do not easily support multiple body locations, and have not been evaluated with visually impaired users (our target population). In our work, we have been exploring computer vision-based approaches using a suite of sensors mounted on the finger. In early work, we established the feasibility of localizing touch input on the body from small images of the skin surface (ICPR 2016). More recently, we built real-time prototypes and evaluated them with both sighted and blind users (ASSETS 2017 and IMWUT 2017).
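
The ICPR 2016 work treats touch localization as texture classification over small skin-image patches. The sketch below illustrates that general idea only; the feature choice (uniform local binary pattern histograms), LBP parameters, and classifier (an SVM) are assumptions made for this example, not the paper's actual pipeline.

    # Illustrative sketch (not the paper's pipeline): classify which body
    # location a small grayscale skin patch came from, using uniform local
    # binary pattern (LBP) histograms and an SVM. Feature choice, LBP
    # parameters, and classifier are assumptions for this example.
    import numpy as np
    from skimage.feature import local_binary_pattern
    from sklearn.svm import SVC

    P, R = 8, 1  # LBP neighborhood size and radius (assumed values)

    def lbp_histogram(patch):
        # Uniform LBP yields P + 2 distinct codes; histogram them and
        # normalize so patches of different sizes are comparable.
        codes = local_binary_pattern(patch, P, R, method="uniform")
        n_bins = P + 2
        hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins))
        return hist / max(hist.sum(), 1)

    def train_location_classifier(patches, labels):
        # patches: list of 2D grayscale arrays; labels: body-location IDs
        # (e.g., "palm", "wrist", "thumb").
        features = np.stack([lbp_histogram(p) for p in patches])
        clf = SVC(kernel="rbf")
        clf.fit(features, labels)
        return clf

    def predict_location(clf, patch):
        return clf.predict(lbp_histogram(patch)[None, :])[0]

At runtime, the predicted body location could then be mapped to a location-specific command, with gestures such as taps and swipes handled by a separate classifier.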

Publications

TouchCam: Realtime Recognition of Location-Specific On-Body Gestures to Support Users with Visual Impairments

Lee Stearns, Uran Oh, Leah Findlater, Jon E. Froehlich

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), December 2017

keywords: wearables, accessibility, computer vision, low-vision, blind users, on-body interaction, skin texture classification, gesture recognition


Investigating Microinteractions for People with Visual Impairments and the Potential Role of On-Body Interaction

Uran Oh, Lee Stearns, Alisha Pradhan, Jon E. Froehlich, Leah Findlater

Proceedings of ASSETS 2017 | Acceptance Rate: 26.2% (33 / 126)

keywords: wearables, visual impairments, mobile, on-body interaction, microinteraction


Localization of Skin Features on the Hand and Wrist from Small Image Patches

Lee Stearns, Uran Oh, Bridget Cheng, Leah Findlater, David Ross, Rama Chellappa, Jon E. Froehlich

Proceedings of ICPR 2016

keywords: handsight, computer vision, machine learning, on-body interaction, skin texture classification, biometrics, skin-based localization


Talks

TouchCam: Realtime Recognition of Location-Specific On-Body Gestures to Support Users with Visual Impairments

Oct. 8, 2018 | UbiComp 2018

Singapore

Uran Oh

