
How Science Becomes Even More Transparent

Chemnitz Professor of Computer Science Janet Siegmund presents new approaches to Open Science - Distinguished Paper Award at one of the most important software engineering conferences

A challenge for science and researchers that is hardly visible to the public is the so-called replication crisis. It describes the lack of replicability and transparency in the methods and results of scientific research. The Open Science discussion offers several ways to overcome this crisis; one of them is to make all essential elements of a research process publicly available. But which elements are essential to the research process? And what exactly does "replicability" mean? Is this even the right term when it comes to repeating and thereby verifying scientific findings?

Starting from this problem, Prof. Dr. Janet Siegmund, Head of the Professorship of Software Engineering, and her colleagues Dr. Ben Hermann, interim professor of IT security at the University of Paderborn, and Dr. Stefan Winter, postdoc in the field of software systems at TU Darmstadt, presented a paper on how to deal with the replicability of findings in computer science. The project was led by Dr. Hermann in Paderborn. The paper, titled "Community Expectations for Research Artifacts and Evaluation Processes," received an ACM SIGSOFT Distinguished Paper Award in a highly competitive process at ESEC/FSE 2020, one of the most important international conferences for software engineering. Of 360 submissions, the program committee accepted 101 papers for publication, and only a small number of these ultimately received the award, which the Association for Computing Machinery (ACM) presents to the best papers of each year's conference.

Relevance of replication

"Replicability is essential for good, successful science. That's why the Open Science movement exists, to make the path of knowledge transparent and comprehensive," Prof. Janet Siegmund points out. Among other things, this requires the availability of artifacts. Artifacts include, for example, all data, scripts, code examples or evidence created or collected to arrive at a finding. "However, there are no uniform guidelines as to what these artifacts should look like," says Siegmund. Although there are efforts to make artifacts available in order to promote Open Science and ensure replicability, it is currently not clear how. "Artifact evaluation committees should help to ensure that artifacts are available for a long time, that is, that they do not disappear even if a researcher moves to a different university, and that data and findings always remain verifiable," summarizes Siegmund. To develop a starting point for possible standards for evaluation committees, Siegmund and her colleagues sent a questionnaire by e-mail.

All members of the artifact evaluation committees of important conferences and journals in computer science were surveyed, more than 1,000 people in total. Of these, the team received more than 250 completed questionnaires back, a response rate of roughly 25 percent. This constitutes a very strong evidence base, as typical response rates lie between 10 and 20 percent.

"We were very surprised by the high number of responses. We didn't expect this and it showed us that the scientific community is very interested in the topic," explains Dr. Hermann.

Expectations of the evaluation committees

"In our survey, we found that there are different expectations of what artifacts should look like. Some of these expectations are very individual and often only implicit, for example, in the minds of the members of the artifact evaluation committees, the chairs of the committees, the authors and other groups," says Siegmund. What makes things more difficult is that terms, such as "replicability," "replication," and "reusability" are interpreted differently. This makes it more difficult to define uniform guidelines. "But that's exactly where we need to go," says Janet Siegmund. "These guidelines and expectations must be communicated explicitly and clearly."

"We also have to discuss what our goal in publishing artifacts is in computer science. Is it just about increasing transparency or do we expect other research groups to be able to build on these artifacts? There is currently no uniform opinion in this area, but the goal changes a lot in the creation and also the evaluation of artifacts," explains Ben Hermann.

As next steps, the team now wants to develop and evaluate uniform quality criteria. This can take the form of a "Registered Reports Track," for example, in which researchers publish a study plan before the actual data collection. In line with the Open Science Foundation, this also includes a protocol of what happens to the artifacts and how it is ensured that they remain available and usable in the long term. These and other approaches should contribute to making computer science research permanently more transparent and comprehensible.

Original Publication (Preprint): Ben Hermann, Stefan Winter, Janet Siegmund: "Community Expectations for Research Artifacts and Evaluation Processes," accepted at the ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE) 2020. The material used in and produced during the study, including additional material, is available at: https://bhermann.github.io/artifact-survey/

Multimedia: In the podcast TUCpersönlich (German version only), Siegmund talks about this particular research interest, how she experiences the move from Passau, her former place of work, to Chemnitz, and what experiences she has had abroad.

Additional information is available from Prof. Dr. Janet Siegmund, Chair of Software Engineering at Chemnitz University of Technology, Tel. +49 371 531 34310, E-Mail janet.siegmund@informatik.tu-chemnitz.de.

(Author: Matthias Fejes/Translation: Chelsea Burris)

Matthias Fejes
25.09.2020
