Human Resources Department

Open Research Position

Doctoral Researcher in Robotics (m/f/d)
(Salary grade E 13 TV-L, 100%)

Chemnitz University of Technology invites applications for a doctoral position in the Collaborative Research Centre (CRC 1410) Hybrid Societies – Humans Interacting with Embodied Technologies. Employment starts as soon as possible and is offered until December 31, 2023.

Research Environment and Tasks: Research in the CRC 1410 Hybrid Societies pertains to interaction with Embodied Digital Technologies (EDTs) as artificial bodies moving in real or virtual environments. Interdisciplinary research in the CRC is guided by two leading questions, with the ultimate goal of ensuring smooth and efficient encounters between humans and embodied technologies in hybrid societies: What is required so that humans can coordinate with EDTs as smoothly as with conspecifics? How should EDTs be designed to meet these requirements? Cooperating researchers from psychology, engineering, computer science, human movement science, linguistics and gesture studies, sociology, law, physics, and mathematics scientifically investigate the interaction of humans with EDTs, from the level of single actions up to the attribution of intentionality.

Project description: The aim is to make robots move like humans in shared environments. When humans cooperate with robots, they prefer robot movements that resemble their own motion. The project therefore seeks solutions for whole-body motion planning in dynamic environments. Reinforcement learning and deep reinforcement learning have recently become popular methods in motion planning; in this project, we will thoroughly investigate modern machine learning techniques for generating smooth robot motions in shared environments.

Researchers in the CRC Hybrid Societies contribute to the CRC's joint research activities and actively participate, for instance, in research colloquia, lecture series, and workshops. The doctoral position includes enrollment in the CRC's doctoral program. We expect you to publish your scientific results in well-ranked international journals and at highly recognized international conferences.


Requirements:

  • University degree (Master/Diploma or equivalent) in robotics, computer science, electrical engineering, physics, or mechanical engineering
  • Basic knowledge of robotics and robot control; experience with machine learning algorithms and tools is a plus
  • Implementation skills, preferably in C/C++, Java, Python, or MATLAB
  • Very good teamwork and communication skills
  • Fluent in English (both written and spoken); German language skills are helpful but not required
Application procedure:

Applicants should send their complete application documents, including a motivation letter (1 page) with a brief description of personal qualifications and research interests, a tabular curriculum vitae, copies of degree certificates and academic transcripts, a publication list if applicable, and abstracts of the Bachelor and Master/diploma theses (1–2 pages), preferably as a single PDF file via email (stating: “CRC1410_C03”) to . The closing date for applications is March 3, 2020. Please do not include links in electronic applications. Please send copies only; original documents will not be returned.

Chemnitz University of Technology
Faculty of Electrical Engineering and Information Technology – Institute for Automation
Professorship of Robotics and Human-Machine Interaction
Prof. Dr.-Ing. Ulrike Thomas
09107 Chemnitz

Chemnitz University of Technology is committed to ensuring an environment that provides equal opportunities and promotes diversity. To increase the number of women working in science and teaching, applications by women with the required qualifications are explicitly desired. Persons with disabilities are encouraged to apply. They will be given preference if equally qualified.

Employment will be governed by the provisions of the German law on fixed-term contracts in academia (Wissenschaftszeitvertragsgesetz).

Information on the collection and processing of personal data is provided at