Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.

Pages

Page Not Found

Page not found. Your pixels are in another canvas.

About

CV

Publications

Sitemap

Talks and presentations

Teaching

Posts

projects

Machine Learning in pulmonary function diagnostics

Analyzed a medical dataset of COPD patients in a team of four students (bachelor level) and developed explainable machine learning models using Python and scikit-learn to distinguish between healthy patients and pre-stage patients.
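
The sketch below is purely illustrative and not the project's actual code: it assumes hypothetical spirometry-style tabular features and synthetic labels, and shows one interpretable scikit-learn baseline of the kind described above.

```python
# Illustrative sketch only (assumed feature names and synthetic data, not the
# project's code): an interpretable scikit-learn baseline separating two
# patient groups from tabular lung-function measurements.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical spirometry-style features (e.g. FEV1, FVC, FEV1/FVC ratio);
# label 0 = healthy, 1 = pre-stage.
X = rng.normal(size=(200, 3))
y = (X[:, 2] + 0.5 * rng.normal(size=200) < 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# With standardized inputs, the coefficients give a simple, explainable
# ranking of which measurements drive the decision.
print("test accuracy:", model.score(X_test, y_test))
print("feature weights:", model[-1].coef_)
```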

publications

Memory Efficient Kernel Approximation for Non-Stationary and Indefinite Kernels

Published in 2022 International Joint Conference on Neural Networks (IJCNN), 2022

This paper originated from my bachelor thesis and is about large-scale kernel methods. We explored memory-efficient approximations of kernel matrices that lead to invalid (indefinite) kernels. We proposed to correct the eigenspectrum with a shift computed via the Lanczos iteration. This correction stabilizes downstream tasks and, despite a large approximation error, yields competitive scores on those tasks. Download paper here
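
A minimal sketch of the underlying idea, under my own assumptions rather than the paper's implementation: estimate the smallest eigenvalue of an indefinite kernel matrix with a Lanczos solver (SciPy's ARPACK-backed eigsh) and shift the spectrum so the matrix becomes positive semi-definite.

```python
# Minimal sketch (my assumptions, not the paper's implementation): repair an
# indefinite kernel matrix by shifting its eigenspectrum with the smallest
# eigenvalue, estimated via Lanczos iteration.
import numpy as np
from scipy.sparse.linalg import eigsh

def shift_to_psd(K: np.ndarray) -> np.ndarray:
    """Shift a symmetric matrix so that its spectrum becomes non-negative."""
    # 'SA' = smallest algebraic eigenvalue, found by Lanczos iteration.
    lam_min = eigsh(K, k=1, which="SA", return_eigenvectors=False)[0]
    if lam_min < 0:
        K = K + (-lam_min) * np.eye(K.shape[0])  # move the spectrum into [0, inf)
    return K

# Toy indefinite "kernel": a random symmetric matrix.
rng = np.random.default_rng(0)
B = rng.normal(size=(100, 100))
K = (B + B.T) / 2
K_psd = shift_to_psd(K)
print(np.linalg.eigvalsh(K_psd).min())  # >= 0 up to numerical tolerance
```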

Port-Hamiltonian Architectural Bias for Long-Range Propagation in Deep Graph Networks

Published in 2025 International Conference on Learning Representations (ICLR), 2025

This work tackles the challenge of long-range information propagation in graph neural networks. We proposed port-Hamiltonian Deep Graph Networks, a novel framework inspired by physics that models neural information flow using principles from Hamiltonian systems. By combining conservative and dissipative dynamics, we control how information spreads or fades across the graph. Despite its theoretical grounding, the method integrates easily into standard message-passing networks and shows strong empirical performance, especially in tasks that require effective long-range reasoning. Download paper here
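
As a rough, purely illustrative sketch of the general idea (my own simplification, not the architecture from the paper): a single graph update step that combines a conservative term, built from an antisymmetric weight matrix, with an explicit dissipative damping term. All names, shapes, and hyperparameters here are assumptions.

```python
# Rough illustrative sketch (my own simplification, not the paper's model):
# a graph update step mixing a conservative term, built from an antisymmetric
# weight matrix, with an explicit dissipative (damping) term.
import torch

class ConservativeDissipativeStep(torch.nn.Module):
    def __init__(self, dim: int, step: float = 0.1, damping: float = 0.1):
        super().__init__()
        self.W = torch.nn.Parameter(0.1 * torch.randn(dim, dim))
        self.step = step
        self.damping = damping

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Antisymmetric weight matrix => rotation-like, energy-preserving dynamics.
        W_anti = self.W - self.W.T
        agg = adj @ x                          # simple dense neighborhood aggregation
        conservative = torch.tanh(x @ W_anti.T + agg)
        dissipative = -self.damping * x        # damping lets information fade
        return x + self.step * (conservative + dissipative)

# Toy usage: 4 nodes on a ring, 8 features each.
adj = torch.tensor([[0, 1, 0, 1],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [1, 0, 1, 0]], dtype=torch.float32)
x = torch.randn(4, 8)
x_next = ConservativeDissipativeStep(dim=8)(x, adj)
```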

talks

A Primer on Over-Squashing and Over-Smoothing Phenomena in Graph Neural Networks

In recent years, graph neural networks (GNNs) have gained a lot of traction in research. Thanks to their ability to embed graph structures, numerous downstream tasks can now be successfully addressed at the level of nodes, edges, and graphs. Despite their success, increasing the number of layers to capture long-term dependencies leads to over-squashing and over-smoothing of feature representations. After building intuition about why and where these problems occur, this talk focuses on strategies to overcome over-squashing. Inspired by the formulation of deep neural architectures as stable and non-dissipative ODEs, long-term dependencies in the information flow can be preserved. Finally, an outlook on an alternative formulation in terms of Hamiltonian dynamics is given.

Rediscovering Chaos? Analysis of GPU Computing Effects in Graph-coupled NeuralODEs

Download Poster.pdf
One of the fundamental guidelines in research is to publish reproducible results. Without this core assumption, conclusions and decisions cannot be trusted. In recent years, GPU computing has also become the de facto choice of computing platform. This paper studies the confounding effects of utilizing GPU acceleration, computing with finite precision, and modeling chaotic systems. Especially for end-to-end learned NeuralODE models, the forward as well as the backward pass dynamics need to be stable and reliably reproducible. In this work, we investigate the phenomenon of non-reproducible computations despite fixing random seeds and data batches, and reason about how to retain the sweet spot of highly efficient computing, modeling complex dynamics, and deterministic reproducibility.
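
For context, one common reproducibility recipe in PyTorch is sketched below. It is illustrative and not taken from the paper: fixing seeds alone is not enough on GPUs, because several CUDA kernels are non-deterministic by default.

```python
# Common PyTorch reproducibility settings (illustrative, not from the paper).
import os
import random

import numpy as np
import torch

# Must be set before the first cuBLAS call for deterministic matmuls on GPU.
os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"

random.seed(0)
np.random.seed(0)
torch.manual_seed(0)  # seeds CPU and all CUDA RNGs

torch.backends.cudnn.benchmark = False    # disable non-deterministic autotuning
torch.use_deterministic_algorithms(True)  # raise an error on non-deterministic kernels
```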

Injecting Hamiltonian Architectural Bias into Deep Graph Networks for Long-Range Propagation

Link to Slides
I was very happy to be invited by the Machine Learning for Simulation Science lab of the University of Stuttgart and NEC Labs Europe GmbH. Abstract: The dynamics of information diffusion within graphs is a critical open issue that heavily influences graph representation learning, especially when considering long-range propagation. This calls for principled approaches that control and regulate the degree of propagation and dissipation of information throughout the neural flow. Motivated by this, we introduce (port-)Hamiltonian Deep Graph Networks, a novel framework that models neural information flow in graphs by building on the laws of conservation of Hamiltonian dynamical systems. We reconcile under a single theoretical and practical framework both non-dissipative long-range propagation and non-conservative behaviors, introducing tools from mechanical systems to gauge the equilibrium between the two components. Our approach can be applied to general message-passing architectures, and it provides theoretical guarantees on information conservation in time. Empirical results prove the effectiveness of our port-Hamiltonian scheme in pushing simple graph convolutional architectures to state-of-the-art performance in long-range benchmarks.

teaching

Fall 2019 and Fall 2020, Student Teaching Assistant: Applied Numerics

Undergraduate course, THWS, Department of Computer Science and Business Informatics, 2019

Gave tutorial sessions for the bachelor computer science course on applied numerics and machine learning.

Fall 2022, Student Teaching Assistant: Stochastic Modelling

Undergraduate course, FAU, Department of Mathematics, 2022

Gave tutorial sessions and graded homework and exams for the bachelor mathematics course on stochastic modelling.