In today’s world, robots are mostly confined to basic, scripted tasks, stuck on factory floors and lacking a meaningful connection with their environment. The DAIR lab’s goal is to develop algorithms that allow robots to confidently interact with objects while navigating complex contact scenarios in a safe and dynamic way. One barrier to effective manipulation is the complicated nature of frictional contact, where sharp impulses and discontinuous states wreak havoc on carefully crafted controllers.
My project, mentored by Professor Michael Posa, focused on incorporating machine learning into existing rigid body dynamics models to learn these phenomena. I started with a simple 1-dimensional model of a frictional block to understand how initial parameter estimates affected learning. We then moved to more complicated scenarios involving tumbling polygons, experimenting with how various contact models in the literature respond to training. Currently, we’re setting up a real-world experiment using a Kuka arm to verify our methodology.
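To give a flavor of the 1-dimensional block study, here is a minimal sketch of the kind of parameter-learning experiment involved: a learnable friction coefficient is fit to observed velocity trajectories by differentiating through a simple time-stepping model in PyTorch. The smoothed Coulomb friction law, the constants, and the synthetic data below are illustrative assumptions, not the lab’s actual contact formulation.

```python
import torch

g, m, dt = 9.81, 1.0, 0.01          # gravity, mass, time step (illustrative values)

def step(v, u, mu, eps=1e-2):
    """One integration step: applied force u minus a smoothed Coulomb friction force."""
    friction = mu * m * g * torch.tanh(v / eps)   # smooth stand-in for mu*m*g*sign(v)
    return v + dt * (u - friction) / m

# Synthetic "ground truth" rollout with the true friction coefficient.
mu_true = torch.tensor(0.4)
u_seq = torch.ones(200) * 5.0                     # constant push on the block
with torch.no_grad():
    v, vs_true = torch.tensor(0.0), []
    for u in u_seq:
        v = step(v, u, mu_true)
        vs_true.append(v)
    vs_true = torch.stack(vs_true)

# Learn mu by gradient descent on trajectory prediction error,
# starting from a deliberately poor initial estimate.
mu = torch.tensor(1.0, requires_grad=True)
opt = torch.optim.Adam([mu], lr=0.05)
for it in range(300):
    v, vs = torch.tensor(0.0), []
    for u in u_seq:
        v = step(v, u, mu)
        vs.append(v)
    loss = torch.mean((torch.stack(vs) - vs_true) ** 2)
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"learned mu ~ {mu.item():.3f} (true {mu_true.item():.2f})")
```

Even in a toy setting like this, the choice of initial estimate and the non-smooth nature of friction shape how well gradient-based training behaves, which is exactly the kind of question the project examined with more realistic contact models.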
While working on these projects, I became familiar with many different technologies used in the lab. On the learning side, I used PyTorch extensively for its convenient automatic differentiation and deep learning capabilities. I also gained hands-on hardware experience, working with the Kuka LBR iiwa arm and setting up an AprilTag-based computer vision system for tracking manipulated objects. Finally, I gained invaluable experience in thinking critically about systems and overall could not have asked for a more rewarding research experience.
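For readers curious about the tracking side, a minimal sketch of AprilTag pose detection might look like the snippet below. It uses the pupil-apriltags and OpenCV Python packages; the camera intrinsics, tag size, and image path are placeholders, and the lab’s actual tracking pipeline may differ.

```python
import cv2
from pupil_apriltags import Detector

# Placeholder camera intrinsics (pixels) and tag edge length (meters).
fx, fy, cx, cy = 600.0, 600.0, 320.0, 240.0
TAG_SIZE = 0.05

detector = Detector(families="tag36h11")
frame = cv2.imread("frame.png")                    # placeholder image path
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

detections = detector.detect(
    gray,
    estimate_tag_pose=True,
    camera_params=(fx, fy, cx, cy),
    tag_size=TAG_SIZE,
)
for det in detections:
    # det.pose_R (3x3 rotation) and det.pose_t (3x1 translation) give the tag's
    # pose in the camera frame; det.tag_id identifies which object is being tracked.
    print(det.tag_id, det.pose_t.ravel())
```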