Abstract

Physical guidance is a natural interaction capability that would be beneficial for mobile robots. However, placing force sensors at specific locations on the robot limits where physical interaction can occur. This paper presents an approach that uses torque data from the four compliant steerable wheels of an omnidirectional non-holonomic mobile platform to respond to physical commands given by a human. The use of backdrivable and torque-controlled elastic actuators for active steering of this platform intrinsically provides the capability of perceiving applied forces directly from its locomotion mechanism. In this paper, we integrate this capability into a control architecture that allows users to force-guide the platform with shared-control ability, i.e., having the platform guided by the user while avoiding obstacles and collisions. Results using a real platform demonstrate that the user's intent can be estimated from the compliant steerable wheels and used to guide the platform while taking nearby obstacles into consideration.
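The shared-control idea described above can be sketched roughly as follows: torques measured at the compliant steering joints are mapped to an estimate of the wrench applied by the user, which is turned into a commanded velocity and blended with an obstacle-repulsion term. The Python sketch below is a minimal, hypothetical illustration of this flow; all function names, gains, wheel positions, and the torque-to-force mapping are assumptions for illustration and are not taken from the paper's controller.

```python
import numpy as np

# Hypothetical geometry: four steerable wheels at the platform corners (meters).
# All names and gains in this sketch are illustrative, not from the paper.
WHEEL_POSITIONS = np.array([
    [ 0.3,  0.25],
    [ 0.3, -0.25],
    [-0.3,  0.25],
    [-0.3, -0.25],
])

def estimate_user_wrench(steer_torques, steer_angles):
    """Rough estimate of the planar force/torque applied by a user,
    inferred from the measured torques of the compliant steering joints.
    A torque about a steering axis is treated as evidence of a tangential
    force at that wheel, perpendicular to its current rolling direction."""
    force = np.zeros(2)
    torque_z = 0.0
    for tau, angle, pos in zip(steer_torques, steer_angles, WHEEL_POSITIONS):
        # Direction perpendicular to the wheel's rolling direction.
        perp = np.array([-np.sin(angle), np.cos(angle)])
        f_i = tau * perp  # illustrative mapping, ignores lever arms and friction
        force += f_i
        torque_z += pos[0] * f_i[1] - pos[1] * f_i[0]
    return force, torque_z

def shared_control_velocity(force, torque_z, obstacle_dirs,
                            admittance=0.05, rot_admittance=0.1,
                            repulsion_gain=0.05):
    """Blend the user's estimated intent (admittance-style) with a simple
    repulsive term built from unit vectors pointing from nearby obstacles
    toward the robot."""
    v_user = admittance * force
    w_user = rot_admittance * torque_z
    v_avoid = repulsion_gain * np.sum(obstacle_dirs, axis=0) if len(obstacle_dirs) else 0.0
    return v_user + v_avoid, w_user

# Example: an even push toward +x, with one obstacle directly ahead.
f, tz = estimate_user_wrench([0.4, 0.4, 0.4, 0.4], [-np.pi / 2] * 4)
v, w = shared_control_velocity(f, tz, [np.array([-1.0, 0.0])])
print(v, w)  # forward velocity reduced by the obstacle's repulsive term
```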
