Two new smartphone apps developed by UC Santa Cruz professor Roberto Manduchi aim to help blind individuals navigate indoor spaces safely. One app supports indoor wayfinding with spoken directions; the other supports backtracking, helping users return safely along a route they have already walked. Neither requires the user to hold the phone out in front of them. The technology uses the phone's inertial sensors (accelerometers and gyroscopes) to track movement and location, providing audio cues and directions as the user walks. Manduchi's research focuses on scalable, safer technology that helps visually impaired individuals navigate unfamiliar spaces independently. Future developments include integrating AI for scene descriptions and making indoor maps more widely accessible.
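To make the sensing idea concrete, here is a minimal, hypothetical sketch of how a phone's inertial sensors could support backtracking: record heading and distance segments while the user walks, then announce the segments in reverse with headings flipped by 180 degrees. It is not Manduchi's implementation, and the names (PathRecorder, the half-meter segment threshold) are illustrative assumptions only.

```swift
import CoreMotion
import AVFoundation

// Hypothetical sketch: inertial path recording plus spoken return guidance.
// Class and property names are illustrative, not taken from the UCSC apps.
final class PathRecorder {
    private let pedometer = CMPedometer()
    private let motion = CMMotionManager()
    private let speech = AVSpeechSynthesizer()

    /// Recorded path as (heading in radians, distance in meters) segments.
    private(set) var segments: [(heading: Double, distance: Double)] = []
    private var lastDistance: Double = 0
    private var currentHeading: Double = 0

    func startRecording() {
        // Heading from the device-motion attitude (gyroscope + accelerometer fusion).
        motion.deviceMotionUpdateInterval = 0.1
        motion.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main) { [weak self] dm, _ in
            guard let dm = dm else { return }
            self?.currentHeading = dm.attitude.yaw
        }
        // Distance walked from the step counter.
        pedometer.startUpdates(from: Date()) { [weak self] data, _ in
            guard let self = self, let meters = data?.distance?.doubleValue else { return }
            let delta = meters - self.lastDistance
            if delta > 0.5 {                      // commit a segment roughly every half meter (assumed threshold)
                self.segments.append((self.currentHeading, delta))
                self.lastDistance = meters
            }
        }
    }

    func stopRecording() {
        pedometer.stopUpdates()
        motion.stopDeviceMotionUpdates()
    }

    /// Speak the recorded path in reverse order, with each heading rotated 180 degrees,
    /// so the user can retrace their steps back to the starting point.
    func announceReturnRoute() {
        for segment in segments.reversed() {
            let backHeading = (segment.heading + .pi).truncatingRemainder(dividingBy: 2 * .pi)
            let text = String(format: "Walk %.0f meters heading %.0f degrees",
                              segment.distance, backHeading * 180 / .pi)
            speech.speak(AVSpeechUtterance(string: text))
        }
    }
}
```

A real system would also need to correct for drift, since purely inertial estimates accumulate error over time; this sketch omits any such correction.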