HandSight
HandSight augments the sense of touch to help people with visual impairments more easily access the physical and digital information they encounter in daily life. The project is still at an early stage, but the envisioned system will consist of tiny CMOS cameras and micro-haptic actuators mounted on one or more fingers, computer vision and machine learning algorithms to support fingertip-based sensing, and a smartwatch for processing, power, and speech output. Potential use cases include reading or exploring the layout of a newspaper article or other physical document, identifying colors and visual textures when getting dressed in the morning, or even performing taps or gestures on the palm or other surfaces to control a mobile phone.
Publications
TouchCam: Realtime Recognition of Location-Specific On-Body Gestures to Support Users with Visual Impairments
Lee Stearns, Uran Oh, Leah Findlater, Jon E. Froehlich
ACM IMWUT, December 2017
Keywords: wearables, accessibility, computer vision, low-vision, blind users, on-body interaction, skin texture classification, gesture recognition
Projects: HandSight, On-Body Interaction, TouchCam
Evaluating Wrist-Based Haptic Feedback for Non-Visual Target Finding and Path Tracing on a 2D Surface
Jonggi Hong, Alisha Pradhan, Jon E. Froehlich, Leah Findlater
Proceedings of ASSETS 2017 | Acceptance Rate: 26.2% (33 / 126)
Keywords: wearables, haptics, accessibility, blind, blind users, haptic feedback, haptic wristband, directional guidance
Projects: Touchscreen Accessibility, HandSight, Haptic Hand Guidance
Localization of Skin Features on the Hand and Wrist From Small Image Patches
Lee Stearns, Uran Oh, Bridget Cheng, Leah Findlater, David Ross, Rama Chellappa, Jon E. Froehlich
Proceedings of ICPR 2016
Keywords: handsight, computer vision, machine learning, on-body interaction, skin texture classification, biometrics, skin-based localization
Projects: HandSight, On-Body Interaction
Evaluating Haptic and Auditory Directional Guidance to Assist Blind People in Reading Printed Text Using Finger-Mounted Cameras
Lee Stearns, Ruofei Du, Uran Oh, Catherine Jou, Leah Findlater, David Ross, Jon E. Froehlich
ACM Transactions on Accessible Computing (TACCESS), 2016
Keywords: wearables, handsight, accessibility, visual impairments, real-time ocr, blind reading
Projects: HandSight, Haptic Hand Guidance
Supporting Everyday Activities for Persons with Visual Impairments Through Computer Vision-Augmented Touch
Leah Findlater, Lee Stearns, Ruofei Du, Uran Oh, David Ross, Rama Chellappa, Jon E. Froehlich
Extended Abstract Proceedings of ASSETS 2015
Keywords: computer vision, blind, visual impairments, finger camera, touch vision
Projects: HandSight
Talks
Oct 08, 2018 | UbiComp 2018, Singapore (Uran Oh)
Aug 01, 2018 | PhD Defense, Computer Science, University of Maryland, College Park (Lee Stearns)
Jun 07, 2017 | UMD CS Staff Talk, University of Maryland, College Park (Jon E. Froehlich)
Apr 13, 2017 | HCDE Invited Talk, University of Washington, Seattle (Jon E. Froehlich)
Apr 11, 2017 | UW CSE Colloquium, University of Washington, Seattle (Jon E. Froehlich)
Apr 06, 2017 | Lecture Series at the Laboratory for Telecommunication Sciences, LTS Auditorium, College Park, MD (Jon E. Froehlich)
Jun 01, 2016 | Proceedings of GI 2016, Victoria, British Columbia, CA (Jonggi Hong)