Research Interests
Our group is interested in algorithms for machine learning, especially optimization algorithms. Optimization is at the heart of machine learning: most machine learning problems entail solving some kind of optimization problem, for classical approaches as well as for deep learning. We design efficient algorithms, prove their correctness, implement them (usually in Python), and make them publicly available.
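As a concrete illustration of machine learning as optimization (a generic NumPy sketch, not code from the group's tools), training a logistic regression classifier amounts to minimizing the logistic loss, here by plain gradient descent on synthetic data:

```python
import numpy as np

# Synthetic, nearly separable binary classification data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = (X @ w_true + rng.normal(scale=0.1, size=100) > 0).astype(float)

def loss(w):
    # Mean logistic loss; logaddexp(0, z) = log(1 + exp(z)) is numerically stable.
    z = X @ w
    return np.mean(np.logaddexp(0.0, z) - y * z)

def grad(w):
    # Gradient of the mean logistic loss.
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return X.T @ (p - y) / len(y)

# Plain gradient descent with a fixed step size.
w = np.zeros(3)
for _ in range(500):
    w -= 0.5 * grad(w)
```

After training, `w` points roughly in the direction of `w_true`, and the loss is far below its value at the zero vector; any of the group's solvers would tackle the same objective, just far more efficiently.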
Projects
- MatrixCalculus.org: a tool for computing derivatives of matrix and tensor expressions.
- geno-project.org: a modeling-language tool for easily specifying and solving constrained optimization problems; it automatically generates highly efficient Python code.
- genosolver: a highly efficient solver for constrained optimization problems, written entirely in Python with optional GPU support and no dependencies on external libraries (except NumPy).
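To show the kind of constrained problem these tools address, here is a minimal pure-NumPy sketch (deliberately not the GENO-generated code or the genosolver API) solving non-negative least squares by projected gradient descent:

```python
import numpy as np

# Non-negative least squares: minimize 0.5 * ||A x - b||^2 subject to x >= 0.
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 5))
x_true = np.array([0.0, 2.0, 0.0, 1.0, 3.0])
b = A @ x_true

def solve_nnls(A, b, steps=2000):
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = A.T @ (A @ x - b)           # gradient of 0.5 * ||A x - b||^2
        x = np.maximum(x - g / L, 0.0)  # gradient step, then project onto x >= 0
    return x

x = solve_nnls(A, b)
```

The projection step `np.maximum(..., 0.0)` is what enforces the constraint; a GENO model would instead be written declaratively and compiled into specialized solver code.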
Group Members
- Sören Laue (head of the group)
- Matthias Mitterreiter (PhD student)
- Antonio Jovanović (PhD student)
- Silvana Marmeggi (PhD student)
- Latofat Bobojonova (student assistant)
Publications
A full list can be found here.
- M. Mitterreiter, M. Koch, J. Giesen, and S. Laue. Why Capsule Neural Networks Do Not Scale: Challenging the Dynamic Parse-Tree Assumption, AAAI 2023.
- J. Giesen, J. Klaus, S. Laue, N. Merk, and K. Wiedom. Convexity Certificates from Hessians, NeurIPS 2022.
- S. Laue, M. Blacher, and J. Giesen. Optimization for Classical Machine Learning Problems on the GPU, AAAI 2022.
- S. Laue, M. Mitterreiter, and J. Giesen. A Simple and Efficient Tensor Calculus, AAAI 2020.
- S. Laue, M. Mitterreiter, and J. Giesen. GENO - GENeric Optimization for Classical Machine Learning, NeurIPS 2019.
- J. Giesen, S. Laue, A. Loehne, and Ch. Schneider. Using Benson's Algorithm for Regularization Parameter Tracking, AAAI 2019.
- S. Laue, M. Mitterreiter, and J. Giesen. Computing Higher Order Derivatives for Matrix and Tensor Expressions, NeurIPS 2018.
- K. Blechschmidt, J. Giesen, and S. Laue. Tracking of Approximate Solutions of Parameterized Optimization Problems over Multi-Dimensional (Hyper-)Parameter Domains, ICML 2015.
- J. Giesen, S. Laue, and P. Wieschollek. Robust and Efficient Kernel Hyperparameter Paths with Guarantees, ICML 2014.
- J. Giesen, S. Laue, J. Mueller, and S. Swiercy. Approximating Concavely Parameterized Optimization Problems, NIPS 2012.
- S. Laue. A Hybrid Algorithm for Convex Semidefinite Optimization, ICML 2012.
Teaching
- Optimization for Machine Learning (winter term 2022/23)
Contact
Phone: +49 631 205 2509
TU Kaiserslautern
Dept. of Computer Science
Building 48, Room 655
Kaiserslautern, Germany