
Solving Kernel Ridge Regression with Gradient Descent for a Non-Constant Kernel

Date and time: 14 May 2024, 11:00-12:00 CEST
Speaker: Oskar Allerbo, KTH
Title: Solving Kernel Ridge Regression with Gradient Descent for a Non-Constant Kernel

Where: Room B21, floor 3, Brinellvägen 23, KTH main campus, Stockholm

Meeting ID: 695 6088 7455

Moderator: Alexandre Proutiere
Administrator: Bastien Dubail

Abstract: Kernel ridge regression (KRR) is a generalization of linear ridge regression that is non-linear in the data but linear in the parameters. Solving this convex problem iteratively with gradient descent opens up the possibility of changing the kernel during training, which is investigated in this talk.
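To make the setup concrete, the following is a minimal, self-contained sketch of solving KRR by gradient descent on the dual coefficients with a Gaussian (RBF) kernel. It is an illustration of the base algorithm only, not the speaker's proposed method; the bandwidth, ridge penalty, learning rate, and step count are arbitrary illustrative choices. The comment inside the loop marks where a kernel update during training would slot in.

```python
import numpy as np

def gaussian_kernel(X, Z, bandwidth):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * bandwidth ** 2))

def krr_objective(alpha, K, y, lam):
    # Dual KRR objective: ||y - K a||^2 + lam * a^T K a (convex in a).
    r = y - K @ alpha
    return r @ r + lam * alpha @ K @ alpha

def krr_gradient_descent(K, y, lam=0.1, lr=5e-4, steps=2000):
    # Gradient of the objective above: 2 K ((K + lam I) a - y).
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(steps):
        # With a non-constant kernel, K would be recomputed here with
        # an updated hyper-parameter (e.g. bandwidth) before each step.
        grad = 2 * K @ ((K + lam * np.eye(n)) @ alpha - y)
        alpha -= lr * grad
    return alpha

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)

K = gaussian_kernel(X, X, bandwidth=1.0)
alpha_gd = krr_gradient_descent(K, y)
loss_gd = krr_objective(alpha_gd, K, y, 0.1)
loss_zero = krr_objective(np.zeros(30), K, y, 0.1)
print(loss_gd < loss_zero)  # gradient descent reduces the objective
```

Because the objective is convex and quadratic in alpha, gradient descent with a small enough learning rate converges toward the closed-form solution (K + lam I)^{-1} y; running the iteration instead of inverting is what makes a step-by-step kernel change possible.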

Based on our theoretical analysis, we propose an update scheme for the kernel during training and thus obtain a method that circumvents hyper-parameter selection and improves performance on new data. Like neural networks, this method can achieve zero training error, good generalization (benign overfitting), and double descent behaviour. We also apply our insights from KRR to neural networks through the neural tangent kernel (NTK) interpretation.

By appropriately modifying the time-dependent NTK, we can improve both the training speed and the generalization of the final model. No prior knowledge of kernel methods is assumed.
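For reference, the time-dependent NTK mentioned above is, in its standard definition (this is the usual notation from the NTK literature, not necessarily the speaker's), the inner product of network gradients at the current parameters:

```latex
\Theta_t(x, x') = \nabla_\theta f(x;\, \theta_t)^\top \, \nabla_\theta f(x';\, \theta_t)
```

Here f(·; θ_t) is the network function at training time t; during gradient-descent training the network evolves approximately like KRR with this (changing) kernel, which is what connects the two parts of the talk.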

Bio: Oskar Allerbo received his MSc in physics from Chalmers University of Technology in 2010. After seven years in industry at Volvo and Ericsson, he began his PhD studies in mathematical statistics at the University of Gothenburg in 2017, focusing mainly on different aspects of non-linear regression, including neural networks, kernel methods, sparsity and early stopping. Since 2023, he has been a postdoctoral researcher in mathematical statistics at KTH, working to extract information about blood glucose levels from the electrical signals in the vagus nerve as part of MedTechLabs.

Oskar Allerbo on LinkedIn