FRR: Towards Robust and Perceptual Inclusive Mobile Robots
Sponsor: National Science Foundation (NSF)
Award Number: 2152077
PI: Eshed Ohn-Bar
Abstract: As prototypical intelligent mobile systems, from autonomous vehicles to delivery robots, move from their controlled development labs into the real world, their impact on individuals with disabilities becomes discernible. An intelligent system that fails to account for diverse reactions and mobility characteristics among individuals can have dire consequences. For example, a delivery robot may inadvertently cause a safety-critical scenario by stopping in a manner that blocks a wheelchair user in traffic. Collision-free navigation in complex urban scenarios alongside blind individuals depends on the system's ability to consider factors related to non-visual reasoning and cane mobility strategies in order to make precise future predictions. Yet, current autonomous systems cannot effectively differentiate among mobility aids and needs, nor can they reason over the accessibility implications of various situations, from stairs to road layouts and ambient conditions. To teach systems to navigate while safely interacting with individuals with disabilities, the goal of this project is to develop comprehensive perception, understanding, and decision-making capabilities for accessibility- and needs-aware interactive mobile systems. Moreover, through fundamental advancements in inclusive mobile systems that can robustly understand the diverse needs of the humans in their surroundings, this project has far-reaching implications for assisting and empowering millions of people to achieve greater quality of life. By enabling mobile platforms to interactively adapt to meet inherently diverse needs, project outcomes will facilitate broad usability, trust, and the reduction of social and physical barriers that prevent individuals with disabilities from integrating into society.
This project will address fundamental challenges in realizing generalized, accessibility-driven intelligent systems that can understand the diverse needs of individuals with disabilities. The work pursues foundational advances in benchmarks, models, and techniques for closing the perception-to-action loop in the context of inclusive navigation and mobility systems. Project outcomes include novel multi-task learning frameworks for robust, fine-grained, and expressive vision-based navigation and decision-making policies that can be efficiently adapted to optimize for safety and accessibility constraints. Given underlying issues of data scarcity, annotation, and sharing, a main goal lies in realizing a standardized interactive development framework with detailed recognition tasks and customization scenarios relevant to accessibility. Framework development and model design will be extensively informed by real-world user studies and collaboration with orientation and mobility instructors. In addition, the introduced framework will broadly engage individuals with disabilities, students, developers, and educators to produce shared tools for teaching the next generation of engineers the concepts needed to tackle multifaceted problems at the intersection of machine learning, perception, and accessibility. Thus, this research facilitates the future deployment of generalized autonomous and assistive navigation systems that can seamlessly interact with and empower all people in their environment.
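To make the multi-task framing above concrete, the sketch below shows one possible way a vision-based policy could couple navigation with accessibility-relevant recognition. It is a minimal illustration, not the project's implementation; the encoder layout, mobility-aid classes, action space, and loss weighting are assumptions made only for the example.

```python
# Illustrative sketch (hypothetical, not the project's code): a shared visual
# encoder with two heads -- one predicting navigation actions, one recognizing
# mobility aids (e.g., wheelchair, white cane) -- trained with a weighted
# multi-task loss so accessibility cues shape the learned navigation policy.
import torch
import torch.nn as nn


class MultiTaskNavigationPolicy(nn.Module):
    def __init__(self, num_actions: int = 4, num_aid_classes: int = 3):
        super().__init__()
        # Shared convolutional encoder over RGB observations.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Task-specific heads: navigation actions and mobility-aid recognition.
        self.action_head = nn.Linear(64, num_actions)
        self.aid_head = nn.Linear(64, num_aid_classes)

    def forward(self, image: torch.Tensor):
        features = self.encoder(image)
        return self.action_head(features), self.aid_head(features)


def multi_task_loss(action_logits, aid_logits, action_targets, aid_targets,
                    aid_weight: float = 0.5):
    """Combine navigation and recognition objectives with a tunable weight."""
    nav_loss = nn.functional.cross_entropy(action_logits, action_targets)
    aid_loss = nn.functional.cross_entropy(aid_logits, aid_targets)
    return nav_loss + aid_weight * aid_loss


if __name__ == "__main__":
    model = MultiTaskNavigationPolicy()
    images = torch.randn(8, 3, 96, 96)          # batch of RGB observations
    action_targets = torch.randint(0, 4, (8,))  # e.g., expert navigation actions
    aid_targets = torch.randint(0, 3, (8,))     # e.g., mobility-aid labels
    action_logits, aid_logits = model(images)
    loss = multi_task_loss(action_logits, aid_logits, action_targets, aid_targets)
    loss.backward()
    print(f"combined loss: {loss.item():.3f}")
```

In such a setup, the shared encoder is what allows accessibility-related supervision to influence the navigation head, which is the general idea behind coupling recognition and decision-making in a single policy.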