Congrats Ladan Najafizadeh for Passing Her MS Thesis Defense!

Jon E. Froehlich Dec 02, 2016

Congrats to Ladan Najafizadeh for passing her MS thesis defense, entitled "Temporal Tracking Urban Areas Using Google Street View." The abstract for her MS thesis is below.

Tracking the evolution of built environments is a challenging problem in computer vision due to the intrinsic complexity of urban scenes, as well as the dearth of temporal visual information from urban areas. Emerging technologies such as street view cars provide massive amounts of high-quality imagery of urban environments at street level (e.g., sidewalks, buildings, and aesthetics of streets). Such datasets are consistent with respect to space and time; hence, they could be a potential source for exploring the temporal changes transpiring in built environments. However, using street view images to detect temporal changes in urban scenes induces new challenges such as variation in illumination, camera pose, and appearance/disappearance of objects. In this thesis, we leverage Google Street View's new feature, "time machine", to track and label the temporal changes of built environments, specifically accessibility features (e.g., existence of curb ramps, condition of sidewalks). The main contributions of this thesis are: (i) an initial proof-of-concept automated method for tracking accessibility features through panorama images across time, (ii) a framework for processing and analyzing time-series panoramas at scale, (iii) a set of proposed new HCI labeling tools that leverage the time-series properties, and (iv) a geo-temporal dataset including different types of accessibility features for the task of detection. [read more]

Congratulations to Ladan Najafizadeh for Passing her MS Thesis Defense!

Jon E. Froehlich Dec 02, 2016

Congratulations to Ladan Najafizadeh for passing her MS thesis defense today on "Temporal Tracking Urban Areas Using Google Street View." This work extends Project Sidewalk by using Google Street View's time machine feature to look back at historical images of locations and semi-automatically track accessibility features over time. [read more]

Congratulations to Lee Stearns for Passing His PhD Proposal Today

Jon E. Froehlich Nov 17, 2016

Congratulations to CS PhD student Lee Stearns for passing his PhD proposal today entitled "HandSight: A Touch-Based Wearable System to Increase Information Accessibility for People with Visual Impairments." This is exciting and important work that has recently received media attention from New Scientist, PC Magazine, and other venues. We look forward to seeing the proposed work come to fruition! [read more]

Project Sidewalk featured in the local news!

Manaswi Saha Nov 17, 2016

Project Sidewalk was covered by a DC TV station, WUSA9. Check out the video coverage and the news report now! This is exciting! [read more]

HandSight Featured in PC Magazine

Lee Stearns Nov 14, 2016

The HandSight project was recently featured in PC Magazine. The article described our work in using a finger-mounted camera to read printed text.

A group of scientists at the University of Maryland have come up with a novel solution to the problem of allowing the visually impaired to read.

The team, led by assistant professor of computer science Jon Froehlich, developed a device that allows blind people to read text without the aid of braille, which isn't always available.

PC Magazine: Fingertip Camera Reads to the Blind
[read more]

Project Sidewalk Presented at the GroupSight Workshop

Kotaro Hara Nov 10, 2016

Kotaro Hara gave an invited presentation about Project Sidewalk at the GroupSight workshop, held in Austin, Texas during the HCOMP conference this year. In the presentation, Kotaro introduced our work on locating sidewalk accessibility features by combining crowdsourcing, computer vision, machine learning, and Google Street View. [read more]

HandSight Featured in New Scientist

Lee Stearns Nov 09, 2016

The HandSight project was recently featured in New Scientist, a British science and technology magazine. The article described our work in using a finger-mounted camera to read printed text.

New Scientist: Tiny fingertip camera helps blind people read without braille
[read more]

Accessibility Research Team at the Diversity in Computing Summit

Manaswi Saha Nov 08, 2016

We were honored to participate in the inaugural Diversity in Computing Summit. Makeability Lab and Inclusive Design Lab members presented a joint session on their accessibility research. The session, entitled "Interactive Computational Tools for Accessibility," covered work on Project Sidewalk (by Manaswi Saha), Temporal Tracking of Accessibility Features in Cities (by Ladan Najafizadeh), Health and Fitness for the Mobility Impaired (by Meethu Malu), Accessible On-body Interaction for the Visually Impaired (by Uran Oh), and HandSight (by Lee Stearns). Talk slides are available here. [read more]

Technica: Tech+Design talk

Majeed Kazemitabaar Nov 03, 2016

We were happy to be invited to speak at the Tech+Design workshop at Technica, the largest all-women hackathon in the US. Four Makeability Lab members (Soheil, Manaswi, Liang, and Majeed) spoke about their research, their design process, and designing for social impact. [read more]

HandSight Presentation at ASSETS 2016

Lee Stearns Oct 26, 2016

Lee Stearns presented the HandSight team's work at the ASSETS 2016 conference, held in Reno, Nevada this year. The presentation covered our recently published TACCESS journal article titled "Evaluating Haptic and Auditory Directional Guidance to Assist Blind People in Reading Printed Text Using Finger-Mounted Cameras." [read more]