Research Interests
Our group studies algorithms for machine learning, with a particular focus on optimization. Optimization is at the heart of machine learning: most learning problems, classical as well as deep, are solved by minimizing some objective function. We design efficient algorithms, prove their correctness, implement them (usually in Python), and make them publicly available.
We also have a broader interest in topics that connect optimization to related subjects, such as algorithmic fairness, explainable artificial intelligence, and their contribution to digital responsibility.
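To make the connection between learning and optimization concrete, here is a minimal sketch (toy data and plain gradient descent, purely illustrative): training a logistic-regression classifier is nothing more than minimizing the logistic loss over the weight vector.

```python
import numpy as np

# Toy data: two Gaussian blobs with labels in {0, 1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

def loss_grad(w):
    """Logistic loss and its gradient, in a numerically stable form."""
    z = X @ w
    loss = np.mean((1 - y) * z + np.logaddexp(0, -z))
    p = 1.0 / (1.0 + np.exp(-z))
    grad = X.T @ (p - y) / len(y)
    return loss, grad

w = np.zeros(2)
for _ in range(500):            # plain gradient descent
    loss, g = loss_grad(w)
    w -= 0.5 * g
```

Any other first-order method (and any differentiable model) slots into the same loop; only the loss-and-gradient oracle changes.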
Projects

MatrixCalculus.org: A tool for computing derivatives of matrix and tensor expressions.
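As an illustration of the kind of derivative such a tool produces: for the quadratic form f(x) = xᵀAx, the gradient is (A + Aᵀ)x. The NumPy sketch below (random data, purely illustrative) verifies this symbolic result against central finite differences.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))
x = rng.normal(size=4)

f = lambda v: v @ A @ v          # scalar-valued matrix expression
grad = (A + A.T) @ x             # symbolic derivative of x'Ax w.r.t. x

# Central finite-difference check of the symbolic gradient.
eps = 1e-6
num = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                for e in np.eye(4)])
print(np.allclose(grad, num, atol=1e-4))  # True
```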

genoproject.org: A modeling-language tool for easily solving constrained optimization problems. It automatically generates highly efficient Python code.

genosolver: A highly efficient solver for constrained optimization problems, written entirely in Python with optional GPU support and no dependencies on external libraries (except NumPy).
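To illustrate the class of problems these tools target (this is a generic NumPy sketch, not genosolver's actual API): a box-constrained least-squares problem solved by projected gradient descent, one of the simplest first-order methods for constrained optimization.

```python
import numpy as np

# Box-constrained least squares: min ||Ax - b||^2  s.t.  0 <= x <= 1.
rng = np.random.default_rng(2)
A = rng.normal(size=(20, 5))
b = rng.normal(size=20)

x = np.zeros(5)
step = 0.5 / np.linalg.norm(A, 2) ** 2   # 1/L for the smooth part
for _ in range(2000):
    grad = 2 * A.T @ (A @ x - b)         # gradient of the objective
    x = np.clip(x - step * grad, 0.0, 1.0)  # gradient step + projection
```

A modeling-language front end like genoproject.org lets the user state only the objective and constraints; the generated solver takes care of gradients, step sizes, and projections.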
Group Members

Sören Laue (head of the group)

Anne Awizen (administration)

Michaela Regneri (Senior Researcher)

Matthias Mitterreiter (PhD student)

Tomislav Prusina (PhD student)

Alhassan Abdelhalim (student assistant)

Merle Mosch (student assistant)
Publications
A full list can be found here.

M. Blacher, J. Giesen, J. Klaus, Ch. Staudt, S. Laue, V. Leis.
Efficient and Portable Einstein Summation in SQL, SIGMOD 2023. 
M. Mitterreiter, M. Koch, J. Giesen, S. Laue.
Why Capsule Neural Networks Do Not Scale: Challenging the Dynamic Parse-Tree Assumption, AAAI 2023.
J. Giesen, J. Klaus, S. Laue, N. Merk, and K. Wiedom.
Convexity Certificates from Hessians, NeurIPS 2022. 
S. Laue, M. Blacher, and J. Giesen.
Optimization for Classical Machine Learning Problems on the GPU, AAAI 2022. 
S. Laue, M. Mitterreiter, and J. Giesen.
A Simple and Efficient Tensor Calculus, AAAI 2020. 
S. Laue, M. Mitterreiter, and J. Giesen.
GENO – GENeric Optimization for Classical Machine Learning, NeurIPS 2019.
J. Giesen, S. Laue, A. Loehne, and Ch. Schneider.
Using Benson’s Algorithm for Regularization Parameter Tracking, AAAI 2019. 
S. Laue, M. Mitterreiter, and J. Giesen.
Computing Higher Order Derivatives for Matrix and Tensor Expressions, NeurIPS 2018. 
K. Blechschmidt, J. Giesen, and S. Laue.
Tracking of Approximate Solutions of Parameterized Optimization Problems over Multi-Dimensional (Hyper-)Parameter Domains, ICML 2015.
J. Giesen, S. Laue, and P. Wieschollek.
Robust and Efficient Kernel Hyperparameter Paths with Guarantees, ICML 2014. 
J. Giesen, S. Laue, J. Mueller, and S. Swiercy.
Approximating Concavely Parameterized Optimization Problems, NIPS 2012. 
S. Laue.
A Hybrid Algorithm for Convex Semidefinite Optimization, ICML 2012.
Teaching
Machine Learning (SS23)
Optimization for Machine Learning (WS22/23)
Contact
Universität Hamburg
Dept. of Computer Science
Vogt-Kölln-Str. 30, G-233
22527 Hamburg, Germany