Autonomous Navigation of a Socially Assistive Robot for Telepresence Rehabilitation in Hospital Environment

Figure: a visual example of a map constructed by the autonomously navigating robot.


Engineering and Applied Sciences


Assistant Professor of Physical Medicine and Rehabilitation

Project Summary

Lil’Flo is a low-cost socially assistive robot under development in the UPenn Rehabilitation Robotics Lab, designed to interact with patients with cerebral palsy (CP). The goal of our project this summer is twofold: to create a functional navigation demo with the Robot Operating System (ROS) framework on an iClebo Kobuki base, and to propose a more sophisticated system based on how the demo performs in hospital settings.

For the navigation demo, we want the robot to move autonomously given its current position and a destination. The process consists of four parts: mapping, localization, path planning, and obstacle avoidance. We used packages from the ROS Navigation Stack, including gmapping (to create a 2D occupancy grid map), amcl (adaptive Monte Carlo localization), and move_base (which combines a global and a local path planner).

Based on the characteristics of hospital rooms and problems we encountered during the demo, we designed a more sophisticated navigation system. It will use a multi-layer map that combines a probabilistic occupancy layer with a topological graph; the robot will localize by recognizing room numbers in the corridors with deep learning techniques; and a graphical interface will make it easier for doctors to label rooms and give instructions. Once these components are implemented, Lil’Flo will be able to navigate the hospital freely with little human assistance, making patient interactions much more efficient.
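The occupancy-grid idea behind gmapping can be illustrated without ROS: each cell stores the log-odds of being occupied, raised when a laser beam ends there and lowered for the cells the beam passes through. The sketch below is a minimal, assumed version of that update (the grid representation, ray tracing, and the log-odds increments are illustrative choices, not gmapping's actual parameters).

```python
import math

# Log-odds increments for "hit" and "pass-through" cells (assumed tuning values).
L_OCC, L_FREE = 0.85, -0.4

def bresenham(x0, y0, x1, y1):
    """Integer grid cells on the line from (x0, y0) to (x1, y1)."""
    cells, dx, dy = [], abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x1 > x0 else -1), (1 if y1 > y0 else -1)
    err = dx - dy
    while True:
        cells.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy; x0 += sx
        if e2 < dx:
            err += dx; y0 += sy
    return cells

def update_grid(grid, robot, beams):
    """Apply one scan: mark free space along each beam, occupied at its endpoint."""
    rx, ry = robot
    for angle, rng in beams:
        hx = rx + int(round(rng * math.cos(angle)))
        hy = ry + int(round(rng * math.sin(angle)))
        ray = bresenham(rx, ry, hx, hy)
        for (cx, cy) in ray[:-1]:                       # cells the beam crossed
            grid[(cx, cy)] = grid.get((cx, cy), 0.0) + L_FREE
        grid[(hx, hy)] = grid.get((hx, hy), 0.0) + L_OCC  # obstacle hit

grid = {}
# Robot at the origin sees walls 5 cells away along +x and +y.
update_grid(grid, (0, 0), [(0.0, 5), (math.pi / 2, 5)])
print(grid[(5, 0)] > 0)   # wall cell: log-odds pushed toward "occupied"
print(grid[(2, 0)] < 0)   # traversed cell: pushed toward "free"
```

Cells never touched by a beam stay absent from the dictionary, which corresponds to the "unknown" state in a real occupancy grid.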
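The Monte Carlo localization that amcl implements can likewise be sketched in miniature. Below is a toy one-dimensional version, not amcl itself: particles are pose hypotheses along a corridor, propagated with noisy motion, weighted by how well a simulated sensor reading (distance to the nearest door) matches a known landmark map, then resampled. The door positions, noise levels, and likelihood model are all illustrative assumptions.

```python
import math
import random

random.seed(0)
DOORS = [2.0, 5.0, 9.0]   # assumed landmark positions along the corridor

def nearest_door(x):
    return min(abs(x - d) for d in DOORS)

def mcl_step(particles, motion, measured_dist, noise=0.1):
    # 1. Motion update: move every particle, adding Gaussian noise.
    moved = [p + motion + random.gauss(0, noise) for p in particles]
    # 2. Measurement update: weight by agreement with the sensed
    #    distance-to-nearest-door (a stand-in for a laser likelihood model).
    weights = [math.exp(-((nearest_door(p) - measured_dist) ** 2) / 0.05)
               for p in moved]
    total = sum(weights) or 1e-12
    weights = [w / total for w in weights]
    # 3. Resample particles in proportion to their weights.
    return random.choices(moved, weights=weights, k=len(moved))

# Start with no pose knowledge: particles spread uniformly over the corridor.
particles = [random.uniform(0, 10) for _ in range(500)]
# The robot moves +1.0 per step; the readings below are only consistent with
# a trajectory that ends near x = 4, so the particle cloud collapses there.
for sensed in [0.0, 1.0, 1.0]:
    particles = mcl_step(particles, 1.0, sensed)
estimate = sum(particles) / len(particles)
print(round(estimate, 2))   # particle mean should cluster near 4.0
```

The first reading alone is ambiguous (the robot could be at any door), which is exactly the kind of ambiguity that motivates the proposed room-number recognition: a recognized room number collapses the hypotheses to a single corridor location in one observation.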

Over the 10-week research program, I gained a great deal of experience in the field of robot navigation. I encountered many interesting problems, such as SLAM (simultaneous localization and mapping), and learned how to draw new ideas from research papers and develop solutions that address our unique problem. The Lil’Flo project greatly enhanced my educational experience: I have always been interested in robotics, and this was my first chance to get my hands dirty in a robotics lab. I was also exposed to new and exciting branches of computer science, such as deep learning and computer vision. The experience confirmed my love for robotics, and I will continue working in the Rehab Robotics Lab through the school year, exploring more of navigation, rehabilitation technologies, and robotics.