Oskar Allerbo

Postdoctoral Researcher, Mathematical Statistics

Department of Mathematics
KTH Royal Institute of Technology
100 44 STOCKHOLM
SWEDEN

email: oallerbo(at)kth.se

I am interested in many different aspects of statistics and machine learning, including generalization, implicit and explicit regularization, artificial neural networks, kernel methods, and sparsity.

Publications and Preprints

Allerbo, O., and Schön, T. B. (2025). Is Supervised Learning Really That Different from Unsupervised? arXiv:2505.11006. Code.
Allerbo, O. (2025). Changing the Kernel During Training Leads to Double Descent in Kernel Regression. arXiv:2311.01762. Code.
Allerbo, O. (2025). Fast Robust Kernel Regression through Sign Gradient Descent with Early Stopping. Electronic Journal of Statistics, 19(1), 1231-1285. Code.
Allerbo, O., Jonasson, J., and Jörnsten, R. (2023). Elastic Gradient Descent, an Iterative Optimization Method Approximating the Solution Paths of the Elastic Net. Journal of Machine Learning Research, 24(277), 1-53. Code.
Allerbo, O., and Jörnsten, R. (2022). Bandwidth Selection for Gaussian Kernel Ridge Regression via Jacobian Control. arXiv:2205.11956. Code.
Allerbo, O., and Jörnsten, R. (2021). Non-linear, Sparse Dimensionality Reduction via Path Lasso Penalized Autoencoders. Journal of Machine Learning Research, 22(283), 1-28. Code.
Allerbo, O., and Jörnsten, R. (2022). Flexible, Non-parametric Modeling Using Regularized Neural Networks. Computational Statistics, 37(4), 2029-2047. Code.