Women in Data Science Chemnitz
Speakers

Featured Speakers


Bianca Scheffler-Loth

(Swiss Re)

"The importance of AI, data and technology in large companies - a personal travelog"
tba

Josefine Umlauft

(Universität Leipzig, ScaDS.AI)

"Machine Learning for Earth System Sciences - Potential and Challenges"

The most pressing questions of our time involve a profound understanding and description of the impacts of climate change risks, including extreme weather events, biodiversity loss, and hazardous near-surface environmental processes. Machine Learning (ML) and big data analytics are becoming increasingly important in this context, as most parts of the Earth system are continuously monitored by sensors, and ML is able to cope with both the volume of data and the heterogeneous data characteristics. For instance, satellites, drones, and sensor networks monitor the atmosphere, land, and ocean surfaces, including air, water, soils, rocks, and biodiversity, or even the deep Earth’s interior, with unprecedented accuracy. Furthermore, citizen science projects collect data with smartphone apps and enrich our data archives. All in all, studying the Earth system and its challenges has become a data-intensive research problem.

However, Earth System Science data is highly heterogeneous, with fundamental differences in resolution, scale, and data type, making common analysis a multimodal and multiscale problem from a computer science perspective. To enable Machine Learning that respects the characteristics of geo-data, our community pushes towards interoperable tools and workflows that go beyond traditional methods and allow for the integration and common analysis of, e.g., sensor data, in-situ measurements, remotely sensed images, or large atmospheric models.
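As a small, purely illustrative example of one such integration step, the following sketch interpolates scattered in-situ measurements onto the regular grid of a remotely sensed image using SciPy, so that both modalities share a common raster; all array names, sizes, and the "soil moisture" interpretation are made up and not taken from the talk.

# Sketch: aligning scattered in-situ point measurements with a raster grid
# so that both data sources can enter a common analysis or an ML model.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Remotely sensed raster: a regular 100 x 100 grid over a unit domain.
lon = np.linspace(0.0, 1.0, 100)
lat = np.linspace(0.0, 1.0, 100)
lon_grid, lat_grid = np.meshgrid(lon, lat)

# Scattered in-situ stations with, e.g., soil moisture measurements.
station_coords = rng.uniform(0.0, 1.0, size=(200, 2))   # (lon, lat) pairs
station_values = rng.normal(0.3, 0.05, size=200)

# Interpolate station data onto the raster grid (NaN outside the convex hull).
in_situ_on_grid = griddata(station_coords, station_values,
                           (lon_grid, lat_grid), method='linear')
print(in_situ_on_grid.shape)   # (100, 100), now aligned with the image pixels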

In this talk, I will provide an overview of the current developments at the cross-section of Machine Learning and Earth System Sciences, including potential and challenges.


Andrea Walther

(Humboldt-Universität zu Berlin)

"Nonsmooth optimization for machine learning"
tba

Carolin Penke

(Forschungszentrum Jülich)

"An Introduction to Large Language Models"
Large Language Models (LLMs) have revolutionized the field of artificial intelligence, enabling advanced text generation and understanding. This talk provides a concise overview of LLMs, focusing on their development, architecture, and implementation. We explain key concepts and give details on the backbone of modern LLMs: the transformer architecture and its innovative attention mechanism. Training these models on supercomputers requires advanced parallelization techniques. Recent advancements and promising trends are identified. Through the lens of the OpenGPT-X project, this presentation will highlight the collaborative efforts in developing multilingual, open-source LLMs.
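To make the attention mechanism mentioned above concrete, here is a minimal NumPy sketch of single-head scaled dot-product attention; the variable names and toy dimensions are illustrative and not taken from the talk.

# Minimal sketch of scaled dot-product attention, the core operation of the
# transformer architecture (single head, no masking, no learned projections).
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Q, K, V: arrays of shape (sequence_length, d_model)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise token similarities
    weights = softmax(scores, axis=-1)   # attention weights per query token
    return weights @ V                   # weighted mix of value vectors

# Toy usage: self-attention on 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
out = attention(X, X, X)
print(out.shape)                         # (4, 8)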

Josephine Thomas

(Universität Kassel)

"The Power of Graphs - Graph Neural Networks for an Effective Power Grid and the Optimization of Printed Circuit Boards"

Graphs are ubiquitous in nature and can therefore serve as models for many practical but also theoretical problems. Social networks, the power grid, or the network of internet providers, for example, are graphs relevant to society. Thus, graphs are used increasingly in machine learning and data science. For example, data in graph form is used to predict the estimated arrival time in Google Maps and to predict interactions between compounds to find new drugs. The method most commonly used to deal with graph data is graph neural networks. In this talk, I would like to give you a brief introduction to the concept of graphs, different graph types, and graph neural networks, and then dive into two applications. First, the modeling of problems on the power grid. This is a topic highly relevant to decarbonization, since it is not trivial to integrate increasingly many decentralized and weather-dependent sources of renewable energy into an effective power grid. And second, the automatic optimization of printed circuit boards. Printed circuit boards (PCBs) are used in all our electronic devices, and so far their optimization requires a lot of time from an experienced engineer, while unoptimized PCBs lead to ineffective devices with a short life cycle. Since engineers are costly for companies and there is a shortage of engineers in the labor market, automation is necessary.
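As a rough illustration of the message-passing idea behind graph neural networks, the following sketch implements a single GCN-style graph convolution layer in NumPy on a toy four-node graph; the power-grid reading of the node features is purely illustrative and not the method presented in the talk.

# Sketch of one graph convolution layer: each node aggregates (normalized)
# neighbor features and applies a learned linear map plus a nonlinearity.
import numpy as np

def gcn_layer(A, X, W):
    """A: (n, n) adjacency matrix, X: (n, d_in) node features, W: (d_in, d_out)."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    deg = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))    # symmetric degree normalization
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(A_norm @ X @ W, 0.0)      # aggregate, transform, ReLU

# Toy graph: 4 buses of a small power grid connected in a ring.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(1)
X = rng.standard_normal((4, 3))   # e.g. load, generation, voltage per bus
W = rng.standard_normal((3, 2))
H = gcn_layer(A, X, W)            # new node embeddings, shape (4, 2)
print(H.shape)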


Career Panel

Bianca Scheffler-Loth

(Swiss Re)

Andrea Walther

(Humboldt-Universität zu Berlin)


Poster Sessions

Marika Kaden

(HS Mittweida, SICIM)

"Biologically-informed shallow classification learning and interpretation"

We suggest employing a shallow neural network with a biologically-inspired design as an alternative to the commonly used deep neural network architecture in the context of biomedical classification learning. Our focus is on the Generalized Matrix Learning Vector Quantization (GMLVQ) model, which serves as a robust and interpretable shallow neural classifier. This model relies on class-dependent prototype learning and adapts matrices for effective data mapping. To infuse biological knowledge, we tailor the matrix structure in GMLVQ based on pathway knowledge relevant to the specific problem at hand. Throughout model training, both the mapping matrix and the class prototypes undergo optimization. Due to its inherent interpretability, GMLVQ facilitates straightforward model interpretation, explicitly considering pathway knowledge. Additionally, the model's robustness is ensured through implicit separation margin optimization achieved via stochastic gradient descent learning. To illustrate the efficacy and interpretability of the shallow network, we showcase its performance on a prostate cancer research dataset. This dataset has previously been examined using a biologically-informed deep neural network.
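For readers unfamiliar with GMLVQ, the sketch below shows only its classification rule: nearest-prototype assignment under the learned quadratic distance d(x, w) = (x - w)^T Omega^T Omega (x - w). The stochastic gradient descent training and the pathway-structured choice of Omega described in the abstract are not reproduced, and all arrays here are random placeholders.

# Sketch of the GMLVQ classification step with an adaptive mapping matrix.
import numpy as np

def gmlvq_predict(X, prototypes, proto_labels, Omega):
    """X: (n, d) samples; prototypes: (p, d); Omega: (m, d) mapping matrix."""
    # Map samples and prototypes into the learned space and compare there.
    X_m = X @ Omega.T
    P_m = prototypes @ Omega.T
    d = ((X_m[:, None, :] - P_m[None, :, :]) ** 2).sum(axis=-1)  # (n, p) distances
    return proto_labels[d.argmin(axis=1)]                        # class of nearest prototype

# Toy usage with two classes and one prototype each.
rng = np.random.default_rng(2)
prototypes = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
proto_labels = np.array([0, 1])
Omega = rng.standard_normal((2, 3))   # in the talk, its structure encodes pathway knowledge
X = rng.standard_normal((5, 3))
print(gmlvq_predict(X, prototypes, proto_labels, Omega))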

Melanie Kircheis

(TU Chemnitz)

"Fast and direct inverse nonequispaced Fourier transforms"

The well-known discrete Fourier transform (DFT) can easily be generalized to arbitrary nodes in the spatial domain. The fast procedure for this generalization is referred to as the nonequispaced fast Fourier transform (NFFT). Various applications, such as MRI or the solution of PDEs, require the solution of the inverse problem, i.e., computing Fourier coefficients from given nonequispaced data. In contrast to iterative procedures, where multiple iteration steps are needed to compute a solution, we focus especially on so-called direct inversion methods. We review density compensation techniques and introduce a new scheme that leads to an exact reconstruction for trigonometric polynomials. In addition, we consider a matrix optimization approach using Frobenius norm minimization to obtain an inverse NFFT.

Joint work with Daniel Potts (Chemnitz University of Technology, Germany).
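As a rough illustration of the density compensation idea, the following 1D NumPy sketch approximates Fourier coefficients by a weighted adjoint nonequispaced transform with simple Voronoi-style weights; the exact reconstruction scheme and the matrix optimization approach of the poster are not reproduced here.

# Density compensation sketch: approximate Fourier coefficients via
#   f_hat[k] ~ sum_j w_j * f(x_j) * exp(-2*pi*i*k*x_j),
# where the weights w_j compensate for the nonuniform sampling density.
import numpy as np

def weighted_adjoint_ndft(x, f, N):
    """x: nonequispaced nodes in [-1/2, 1/2); f: samples; N: bandwidth."""
    order = np.argsort(x)
    xs, fs = x[order], f[order]
    gaps = np.diff(np.concatenate([xs, [xs[0] + 1.0]]))      # periodic gaps to the right
    w = 0.5 * (gaps + np.roll(gaps, 1))                      # Voronoi-style quadrature weights
    k = np.arange(-N // 2, N // 2)
    return (w * fs) @ np.exp(-2j * np.pi * np.outer(xs, k))  # (N,) coefficient estimates

# Toy check: samples of a trigonometric polynomial at random nodes.
rng = np.random.default_rng(3)
N = 8
c_true = rng.standard_normal(N) + 1j * rng.standard_normal(N)
x = np.sort(rng.uniform(-0.5, 0.5, 64))
k = np.arange(-N // 2, N // 2)
f = np.exp(2j * np.pi * np.outer(x, k)) @ c_true
print(np.linalg.norm(weighted_adjoint_ndft(x, f, N) - c_true))  # reconstruction error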


Angela Thränhardt

(TU Chemnitz)

"Simulations in Theoretical Physics at TU Chemnitz"

An overview of the activities of the group "Theoretical Physics - Simulation of new materials" is given, including the analysis of a Chemnitz-based work of art, contributions to the development of optical cochlear implants, and the study of transport processes in organic solar cells.

Josie König

(University of Potsdam)

"Efficient training of Gaussian processes with tensor product structure"

Gaussian processes with the covariance matrix given as the sum of possibly multiple Kronecker products appear in spatio-temporal magnetoencephalography or climate data sets. This structure allows the covariance matrix to be identified as a tensor, which is used to represent this operator and the training data in the tensor train format. Determining the optimal set of hyperparameters of a Gaussian process based on a large amount of training data requires both linear system solves and trace estimation. In particular, solving a linear system with the covariance tensor is a major bottleneck and requires appropriate preconditioning.

Joint work with Max Pfeffer (University of Göttingen, Germany) and Martin Stoll (Chemnitz University of Technology, Germany).
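To illustrate why the Kronecker (tensor) structure matters computationally, the sketch below performs a matrix-vector product with K = K_space ⊗ K_time without ever forming the large covariance matrix; such matvecs underlie the linear solves and trace estimates mentioned above, while the tensor train representation itself is not shown. The kernel matrices here are random placeholders.

# Kronecker matvec via the identity (A ⊗ B) vec(X) = vec(B X A^T),
# using column-major vectorization.
import numpy as np

def kron_matvec(A, B, v):
    """Compute (A ⊗ B) v without forming the Kronecker product."""
    n, m = A.shape[0], B.shape[0]
    X = v.reshape(n, m).T            # recover X with vec(X) = v (column-major)
    return (B @ X @ A.T).T.reshape(-1)

# Toy check against the explicitly assembled Kronecker product.
rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3)); A = A @ A.T + 3 * np.eye(3)   # "spatial" kernel block
B = rng.standard_normal((5, 5)); B = B @ B.T + 5 * np.eye(5)   # "temporal" kernel block
v = rng.standard_normal(15)
print(np.allclose(kron_matvec(A, B, v), np.kron(A, B) @ v))    # True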


Amel Alhassan

(RWTH Aachen University)

"Extracting the motif of a periodic image using variational methods"

Images of periodic structures are common in various fields, ranging from ultra-high-resolution images of crystallographic materials to images of archaeological patterns, as well as periodically repetitive wallpaper and textile patterns. In this poster, we introduce a variational approach to automatically extract the average motif of a repetitive pattern image. It also determines the unit cell vectors. The objective function uses a projection operator that projects each position in the image to its correspondence in the primitive unit cell (the motif). It then finds the average motif by minimizing the difference between the value of the image at each position and its projection. We give examples of the application of this approach to images of crystalline materials as well as some periodic wallpaper images.
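As a much simplified illustration of the projection idea, the following sketch recovers the average motif of a noisy 1D periodic signal whose period is already known: each position is projected onto its representative in the unit cell, and the least-squares motif is the per-cell average. Determining the unit cell vectors themselves, as done variationally in the poster, is not shown.

# Average-motif extraction for a 1D signal with known period.
import numpy as np

def average_motif_1d(signal, period):
    """signal: 1D samples of a periodic pattern; period: motif length in samples."""
    positions = np.arange(len(signal))
    cell_index = positions % period                # projection onto the unit cell
    motif = np.zeros(period)
    for k in range(period):
        motif[k] = signal[cell_index == k].mean()  # least-squares average per cell position
    return motif

# Toy usage: noisy repetition of a length-6 motif.
rng = np.random.default_rng(5)
true_motif = np.array([0.0, 1.0, 3.0, 2.0, 1.0, 0.5])
signal = np.tile(true_motif, 20) + 0.1 * rng.standard_normal(120)
print(np.round(average_motif_1d(signal, 6), 2))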


Fatima Antarou Ba

(Technical University Berlin)

"Sparse additive function decompositions facing basis transforms"
High-dimensional real-world systems can often be well characterized by a small number of simultaneous low-complexity interactions. The analysis of variance (ANOVA) decomposition and the anchored decomposition are typical techniques to find sparse additive decompositions of functions. In this paper, we are interested in a setting where these decompositions are not directly sparse, but become so after an appropriate basis transform. Noting that the sparsity of such an additive function decomposition is equivalent to the fact that most of the function's mixed partial derivatives vanish, we can exploit a connection to the underlying function graphs to determine an orthogonal transform that realizes the appropriate basis change. This is done in three steps: we apply a singular value decomposition to minimize the number of vertices of the function graph, then joint block diagonalization techniques for families of matrices, followed by sparse minimization based on relaxations of the zero "norm" to minimize the number of edges. For the latter, we propose and analyze minimization techniques over the manifold of special orthogonal matrices. Various numerical examples illustrate the reliability of our approach for functions having, after a basis transform, a sparse additive decomposition into summands with at most two variables.
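To illustrate the link between additive structure and mixed partial derivatives used in the abstract, the sketch below estimates mixed second derivatives by finite differences at random points and collects the nonvanishing ones as edges of the function graph; the basis transform, joint block diagonalization, and manifold optimization of the poster are not reproduced.

# Variables i and j interact (an edge in the function graph) only if
# d^2 f / dx_i dx_j does not vanish identically.
import numpy as np

def mixed_partial(f, x, i, j, h=1e-4):
    """Central finite-difference estimate of d^2 f / dx_i dx_j at x."""
    e_i, e_j = np.zeros_like(x), np.zeros_like(x)
    e_i[i], e_j[j] = h, h
    return (f(x + e_i + e_j) - f(x + e_i - e_j)
            - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * h * h)

def function_graph_edges(f, dim, n_samples=50, tol=1e-3, seed=0):
    """Edges (i, j) whose estimated mixed partials are nonzero at some sample point."""
    rng = np.random.default_rng(seed)
    edges = set()
    for _ in range(n_samples):
        x = rng.uniform(-1, 1, dim)
        for i in range(dim):
            for j in range(i + 1, dim):
                if abs(mixed_partial(f, x, i, j)) > tol:
                    edges.add((i, j))
    return sorted(edges)

# Toy example: f(x) = x0*x1 + sin(x2), so the only interaction is (0, 1).
f = lambda x: x[0] * x[1] + np.sin(x[2])
print(function_graph_edges(f, dim=3))   # [(0, 1)]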