Human-Robot Interaction (HRI)

The employment of robots in private and public environments is increasing; sales figures for industrial robots in particular are rising. Robots are not only becoming faster, cheaper, and more capable, but also more flexible in taking on different tasks and increasingly interactive with humans. Especially when robots and humans execute work tasks cooperatively as a peer-to-peer team, a socially interactive robot is required.

An Emotion Model for a Social Robot

[Figure: SocialRobot]

For the effective execution of cooperative tasks, the robot is required to understand people's specific knowledge, strengths, and weaknesses, and to estimate and react to people's intentions and needs as well as to the dynamics of object interaction with other robots and humans. The ability to understand an intention is intimately linked to an emotional understanding. The important role emotions play in human-robot interaction, and the fact that many learning tasks of social robots rely on reinforcement learning algorithms, led us to apply the developed emotion model as a component of a social robot.
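As a rough illustration of this coupling, the emotion model's output could enter the reinforcement learner as a shaping term on the reward, so the robot prefers actions predicted to keep its human partner at ease. The following Python sketch is purely hypothetical; the function name, the weight, and the linear penalty are assumptions, not the method described in the publications below.

```python
# Hypothetical sketch: shaping an RL reward with the emotion model's output.
def shaped_reward(task_reward: float, emotional_strain: float,
                  weight: float = 0.5) -> float:
    """Penalize actions the emotion model predicts to strain the human.

    emotional_strain is assumed to lie in [0, 1] (see the sketch further
    below); weight trades task performance against the partner's comfort.
    """
    return task_reward - weight * emotional_strain
```

Whether the penalty should be linear, and how the weight is chosen, would depend on the task; the sketch only shows where the emotion signal could enter the learning loop.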

[Figure: EmotionModel]

The emotion model is shown in the figure above. It requires two types of numerical input. First, the reinforcement learning algorithm for motion generation suggests a planned action for the work task; this action influences the emotion model in three ways, depending on previously established key factors about the human counterpart, e.g. experience and physiology. Second, an ergonomic assessment categorizes the work task as feasible, precarious, or alarming. This allows us to account for physiological constraints and to infer possible reactions to upcoming movements by comparing body postures and recognizing intentions such as "carrying" or "walking". This assessment, combined with the ergonomic evaluation of how difficult the intended movement is going to be, serves as the foundation of the emotional manifestation.
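To make the described data flow concrete, the following Python sketch mirrors its structure: a planned action load from the motion generator, a three-level ergonomic rating, and key factors about the worker are combined into a scalar emotional manifestation. All names, value ranges, and the combination formula are illustrative assumptions, not the model's actual equations.

```python
# A minimal sketch of the described data flow; all names, ranges, and the
# combination formula are illustrative assumptions, not the published model.
from dataclasses import dataclass
from enum import Enum


class ErgonomicRating(Enum):
    """The three categories of the ergonomic assessment."""
    FEASIBLE = 0
    PRECARIOUS = 1
    ALARMING = 2


@dataclass
class WorkerProfile:
    """Previously established key factors about the human counterpart."""
    experience: float  # assumed range 0.0 (novice) .. 1.0 (expert)
    physiology: float  # assumed range 0.0 (frail) .. 1.0 (robust)


def emotional_manifestation(planned_action_load: float,
                            rating: ErgonomicRating,
                            profile: WorkerProfile) -> float:
    """Combine the planned action suggested by the motion generator with
    the ergonomic rating into a scalar strain value in [0, 1]."""
    # The ergonomic category scales the base load of the suggested work task.
    severity = {ErgonomicRating.FEASIBLE: 0.3,
                ErgonomicRating.PRECARIOUS: 0.6,
                ErgonomicRating.ALARMING: 1.0}[rating]
    # Assumption: experienced, physically robust workers are strained less.
    resilience = 0.5 * (profile.experience + profile.physiology)
    strain = planned_action_load * severity * (1.0 - 0.5 * resilience)
    return max(0.0, min(1.0, strain))


if __name__ == "__main__":
    worker = WorkerProfile(experience=0.8, physiology=0.6)
    print(emotional_manifestation(0.7, ErgonomicRating.PRECARIOUS, worker))
```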

Publications:

Truschzinski, M., Müller, N., Dinkelbach, H., Protzel, P., Hamker, F., Ohler, P. (2014). Deducing human emotions by robots: Computing basic non-verbal expressions of performed actions during a work task. Proc. of IEEE International Symposium on Intelligent Control (ISIC), Antibes, France. To appear.

Müller, N., Truschzinski, M. (2014). An Emotional Framework for a Real-Life Worker Simulation. Proc. of Human-Computer Interaction (HCI), Crete, Greece.

Truschzinski, M., Müller, N. (2014). An emotional model for social robots: late-breaking report. Proc. of the 2014 ACM/IEEE International Conference on Human-Robot Interaction (HRI '14), Bielefeld, Germany.

Teaching the Robot by Human Gestures

Research projects:

  • MULI

Human-Robot Communication

Research projects:

  • MULI