For better human-machine understanding
Interdisciplinary research team investigates human-machine interaction through recognition of gestures and facial expressions
Smartphones, smart-home systems, or self-service checkouts in supermarkets: rapid technological development constantly delivers ever more intelligent solutions, which are now found in almost all areas of life. But fast, sweeping changes can be overwhelming, especially for older people, who often have difficulty understanding and using new technologies. As a result, the support and assistance that intelligent systems could offer often goes unused. Demographic change further amplifies this effect.
The interdisciplinary research team comprises researchers from the Professorships of Artificial Intelligence, Computational Physics, Cognitive and Engineering Psychology, and Computer Graphics and Visualization at Chemnitz University of Technology. The team aims to develop a “cognitive system” that allows machines to interpret human gestures and facial expressions adequately. The project “Sozial agierende, kognitive Systeme zur Feststellung von Hilfsbedürftigkeit” (socially acting cognitive systems for detecting the need for help) is funded by the European Social Fund (ESF), administered by the Sächsische Aufbaubank (SAB), for a period of three years.
Insight into human-computer interaction through a laboratory study
“Our project aims to develop a cognitive system that can identify a user's need for help on the basis of observed parameters such as gestures or facial expressions,” says Fred Hamker, head of the Professorship of Artificial Intelligence and leader of the project. With the aid of such a cognitive system, help can be offered whenever it is really needed. To put these plans into practice, researchers from psychology, computer engineering, and physics are working together to identify the most important indicators of a need for help.
The team conducted a first laboratory study to determine the potential of facial expressions, posture, user input, and spontaneous verbal remarks as sources of information about the emotional and cognitive state of a user operating a computer system. For this purpose, students and employees of Chemnitz University worked on tasks of varying difficulty in a statistics program familiar to them, while several cameras recorded their facial expressions and posture. Mouse movements and spontaneous verbal remarks were also recorded. This makes it possible to determine how the different parameters change with the level of frustration, and thus with the need for help.
A system developed specifically by the Professorship of Artificial Intelligence analyses the faces of the test persons. Prominent facial markers, such as the corners of the mouth, provide information on the emotional state. The posture of the test persons likewise provides information on how well they are coping with the task. “Based on the acquired data, the machine is taught a so-called Memory Hidden Markov Model in order to connect the information, recognize recurring behavioral patterns, and categorize the emotional state,” explains Kim Schmidt, research associate at the Professorship of Computational Physics and project spokesperson. If the project succeeds, it would be a first and important step toward technological systems that can identify and interpret human expressions.
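To illustrate the underlying idea, the following is a minimal sketch of a plain hidden Markov model, not the project's “Memory Hidden Markov Model” itself: hidden emotional states (here assumed to be “calm” and “frustrated”) are inferred from a sequence of observable cues (here three illustrative, discretized cues). All state names, cue labels, and probability values are hypothetical and chosen only for demonstration.

```python
import numpy as np

# Hypothetical labels and probabilities for illustration only
states = ["calm", "frustrated"]                       # hidden emotional states
cues = ["neutral", "frown", "lean_forward"]           # discretized observable cues

# Transition probabilities P(state_t | state_{t-1}), rows sum to 1
A = np.array([[0.8, 0.2],
              [0.3, 0.7]])
# Emission probabilities P(cue | state), rows sum to 1
B = np.array([[0.7, 0.1, 0.2],
              [0.1, 0.6, 0.3]])
pi = np.array([0.6, 0.4])                             # initial state distribution

def viterbi(obs_seq):
    """Return the most likely hidden state sequence for a cue index sequence."""
    T, n = len(obs_seq), len(states)
    delta = np.zeros((T, n))          # best path probability ending in each state
    psi = np.zeros((T, n), dtype=int) # backpointers to the previous state
    delta[0] = pi * B[:, obs_seq[0]]
    for t in range(1, T):
        for j in range(n):
            scores = delta[t - 1] * A[:, j]
            psi[t, j] = np.argmax(scores)
            delta[t, j] = scores.max() * B[j, obs_seq[t]]
    # Backtrack from the most likely final state
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return [states[s] for s in reversed(path)]

# A user who starts neutral and then repeatedly frowns:
print(viterbi([0, 1, 1, 1]))  # → ['calm', 'frustrated', 'frustrated', 'frustrated']
```

In this toy setting, a run of “frown” observations shifts the decoded hidden state from “calm” to “frustrated”; the actual system would learn such parameters from the recorded study data rather than using hand-set values.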
Chemnitz University of Technology promotes its core competency “Humans and Technology” within the framework of numerous research projects. As part of the “Excellence Strategy” initiated by the federal and state governments, Chemnitz University applied for the cluster of excellence “Human factors in Technology: Mind, Movement, Embodiment”. Scientists from five Chemnitz University departments as well as from other universities and non-university research institutions participate in the project.
Further information: M.Sc. Kim Schmidt, Professorship of Computational Physics at Chemnitz University, phone +49 371 531 39337, email email@example.com