Human Robot Interaction

Research Vision

Our focus is on the interaction of humans and autonomous systems that move in shared physical spaces. The particular scenarios that instantiate this general theme address autonomous transport vehicles in industrial environments and shared autonomy for telemanipulation. The mutual recognition of actions and intentions between humans and the autonomous systems they interact with is crucial to ensure both safety and public acceptance of such technologies. Robots must be able to communicate their intentions in a way that humans clearly understand in order to become reliable, agile, and adaptable co-workers.

Notable Results

Spatial Augmented Reality (SAR)

One point of interest is intention communication by mobile robots operating in industrial scenarios. Freely navigating autonomous vehicles can appear unpredictable to human workers, causing stress and rendering joint use of the available space inefficient. We address this issue via on-board projection of the robot's intentions onto the shared floor space, enabling communication from robot to human. To this end, we equipped a research prototype of a robotic forklift with an LED projector to visualize internal state information and intentions. The robot's ability to communicate its intentions was evaluated in realistic situations in which test subjects encounter the robotic forklift. Our results show that even simple information, such as the robot's planned trajectory and the floor space it will occupy in the near future, effectively improves human responses to the robot.
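The projected content described above, the planned trajectory and the floor space the vehicle is about to occupy, can be sketched as the sequence of footprint rectangles along the motion plan. The following is a minimal illustrative sketch, not the deployed system; the function names, the sampled-pose trajectory format, and the footprint dimensions are assumptions made for the example.

```python
import math

def footprint_corners(x, y, theta, half_len, half_wid):
    """Corners of a rectangular robot footprint at pose (x, y, theta).

    The footprint is a half_len-by-half_wid rectangle (half extents)
    rotated by the heading theta and translated to (x, y).
    """
    corners = []
    for dx, dy in [(half_len, half_wid), (half_len, -half_wid),
                   (-half_len, -half_wid), (-half_len, half_wid)]:
        cx = x + dx * math.cos(theta) - dy * math.sin(theta)
        cy = y + dx * math.sin(theta) + dy * math.cos(theta)
        corners.append((cx, cy))
    return corners

def projection_outline(trajectory, half_len=1.2, half_wid=0.6):
    """Footprints along the planned trajectory for the next few seconds.

    `trajectory` is a list of (x, y, theta) poses sampled from the
    motion plan; the union of the returned rectangles approximates the
    floor area the vehicle is about to occupy, which an on-board
    projector can then render on the shared floor space.
    """
    return [footprint_corners(x, y, th, half_len, half_wid)
            for (x, y, th) in trajectory]

# Straight-line plan: three poses, 1 m apart, heading along +x.
plan = [(i * 1.0, 0.0, 0.0) for i in range(3)]
outline = projection_outline(plan)
```

In practice the resulting polygons would still have to be warped into the projector's image plane; the sketch only covers the geometric content of the visualization.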

Eye Tracking

Our research showed that the SAR projection pattern influences the gaze of humans in the vicinity of the robot. Interestingly, this does not seem to affect the trajectories these humans choose. Combining eye tracking with path-data analysis makes it evident that the projection influences human attention, but not enough to change walking behavior in the chosen scenario.

Shared Autonomy in Telemanipulation

Emerging factory scenarios that require effective collaboration with human workers pose a new class of problems that goes beyond simple robot co-workers towards rich collaborative settings. Such scenarios require robots capable of supporting humans not only to increase the efficiency and reliability of processes, but also to enable flexible task configurations with high levels of safety. In the MRO lab we developed a framework for real-time robot motion synthesis and control that allows behaviors and tasks to be defined at different levels of specificity. The framework implements a transparent mapping of high-level user-space desiderata (e.g., desired motions, stiffness behavior, or exerted forces) to low-level actuator commands. This research direction enables a new mode of robot control from the user's perspective: operators can specify desired robot behaviors at different levels of abstraction. We exploit these capabilities to reduce the cognitive load on the teleoperator by delegating some responsibilities (e.g., obstacle avoidance, gripper alignment, …) to the motion controller.
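A common way to map user-space desiderata such as a desired pose and a stiffness behavior to actuator commands is Cartesian impedance control, where joint torques are computed as tau = J^T K (x_des - x). The sketch below illustrates this idea on a hypothetical 2-link planar arm; it is not the MRO framework itself, and all names, link lengths, and gains are assumptions made for the example (damping and gravity compensation are omitted for brevity).

```python
import math

L1, L2 = 0.5, 0.4  # link lengths (m) of an illustrative 2-link planar arm

def forward_kinematics(q1, q2):
    """End-effector position (x, y) of the planar arm."""
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y

def jacobian(q1, q2):
    """2x2 geometric Jacobian mapping joint velocities to (x, y) velocity."""
    j11 = -L1 * math.sin(q1) - L2 * math.sin(q1 + q2)
    j12 = -L2 * math.sin(q1 + q2)
    j21 = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    j22 = L2 * math.cos(q1 + q2)
    return [[j11, j12], [j21, j22]]

def impedance_torques(q, x_des, k_xy):
    """Joint torques tau = J^T * K * (x_des - x): a Cartesian spring of
    stiffness k_xy (N/m) pulling the end effector towards x_des.

    This is the low-level command produced from the high-level
    desiderata (target pose x_des, stiffness k_xy).
    """
    x, y = forward_kinematics(*q)
    fx = k_xy * (x_des[0] - x)          # desired Cartesian force
    fy = k_xy * (x_des[1] - y)
    J = jacobian(*q)
    tau1 = J[0][0] * fx + J[1][0] * fy  # tau = J^T f
    tau2 = J[0][1] * fx + J[1][1] * fy
    return tau1, tau2

# Arm stretched along +x; target slightly above the end effector:
# the virtual spring yields positive (counter-clockwise) joint torques.
tau = impedance_torques((0.0, 0.0), (0.9, 0.1), k_xy=300.0)
```

Lowering `k_xy` makes the arm more compliant around the commanded pose, which is one way an operator-specified stiffness behavior translates into a different low-level command for the same target motion.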

Active Projects

Key People
