Wearable devices for the visually impaired. Lessons learnt from autonomous robotics

Place: Large Lecture Room - CVC

Affiliation: Robot Vision Group, Computer Science Department, University of Alicante, Spain

Computer-vision-based wearable devices (including phones, lightweight cameras and mini-laptops) play a valuable role in way-finding and indoor localization for the visually impaired. SLAM (Simultaneous Localization and Mapping) is a promising robotics tool that uses real-time video to obtain both the instantaneous location of the user and a map of where he/she has traveled so far, but it becomes a very hard task once real-time constraints are taken into account. We describe improvements to SLAM that allow processing at an acceptable frame rate (matched to the normal walking pace of a pedestrian), as well as extensions that help the user detect and avoid aerial obstacles (such as overhanging tree limbs). All these developments are the natural evolution of our research in autonomous robotics. We will present applications related to indoor and outdoor navigation, underwater robots and quadcopters, using different camera systems such as omnidirectional cameras and combining them with other sensors such as 3D lasers. Finally, we will sketch our future research in the field of wearable devices.
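
For readers unfamiliar with how video alone can yield the user's motion, the sketch below shows a minimal monocular visual-odometry front end, the core building block of feature-based SLAM. It is purely illustrative and not the speaker's system: the camera intrinsics `K` are hypothetical placeholders, and it only estimates frame-to-frame motion (no mapping, loop closure, or metric scale), using standard OpenCV calls.

```python
# Minimal sketch of a monocular visual-odometry loop (assumed setup, not the
# presented system): track ORB features between consecutive frames and
# accumulate the relative camera motion.
import cv2
import numpy as np

# Hypothetical camera intrinsics; replace with the wearable camera's calibration.
K = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])

orb = cv2.ORB_create(nfeatures=1000)                        # fast binary features
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

cap = cv2.VideoCapture(0)                                   # wearable camera stream
prev_kp, prev_des = None, None
pose = np.eye(4)                                            # accumulated camera pose

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    kp, des = orb.detectAndCompute(gray, None)

    if prev_des is not None and des is not None:
        matches = matcher.match(prev_des, des)
        if len(matches) >= 8:
            pts1 = np.float32([prev_kp[m.queryIdx].pt for m in matches])
            pts2 = np.float32([kp[m.trainIdx].pt for m in matches])
            # Estimate relative motion between consecutive frames with RANSAC.
            E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                           prob=0.999, threshold=1.0)
            if E is not None and E.shape == (3, 3):
                _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
                T = np.eye(4)
                T[:3, :3], T[:3, 3] = R, t.ravel()          # translation is up to scale
                pose = pose @ np.linalg.inv(T)              # accumulate trajectory

    prev_kp, prev_des = kp, des
```

A full SLAM system would add keyframe selection, map building, and loop closure on top of this loop; keeping each iteration cheap is what makes processing at a pedestrian's walking pace feasible.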