Dominik Schröder


I am a postdoctoral researcher and lecturer at ETH Zurich. My research interests are probability theory (in particular random matrices), statistical physics, and statistical learning theory.


  1. BSc Mathematics

    ETH Zürich

  2. BSc Mathematics

    LMU Munich

  3. MSc Theoretical and Mathematical Physics

    LMU Munich

  4. MASt Mathematics

    University of Cambridge

  5. PhD Mathematics

    IST Austria

  6. Industry Sabbatical

    Bosch Center for Artificial Intelligence

  7. Junior Fellow

    ETH Institute for Theoretical Studies

  8. SNF Ambizione Fellow

    ETH Zurich

Download CV

Selected research

  1. Phase Transition in the Density of States of Quantum Spin Glasses

    László Erdős, Dominik Schröder

Math. Phys. Anal. Geom. Vol. 17 (2014), Issue 3-4

We demonstrate a transition between Gaussian and semicircular laws using q-Hermite polynomials. This work has inspired a large body of follow-up research on the SYK model for quantum gravity.

  2. Random matrices with slow correlation decay

    László Erdős, Dominik Schröder

Forum Math. Sigma Vol. 7 (2017)

We prove universality for a large class of random matrices with correlated entries. This very general result has been used numerous times, including in more applied research.

  3. Central limit theorem for linear eigenvalue statistics of non-Hermitian random matrices

    Giorgio Cipolloni, László Erdős, Dominik Schröder

Comm. Pure Appl. Math. Vol. 76 (2023)

We show that the linear statistics of random matrices with i.i.d. entries are asymptotically given by a rank-one perturbation of the Gaussian free field on the unit disc.

  4. Deterministic equivalent and error universality of deep random features learning

    Dominik Schröder, Hugo Cui, Daniil Dmitriev, Bruno Loureiro


    We show that the generalization error of deep random feature models is the same as the generalization error of Gaussian features with matched covariance, and derive an explicit expression for the generalization error.

  5. Asymptotics of Learning with Deep Structured (Random) Features

    Dominik Schröder, Hugo Cui, Daniil Dmitriev, Bruno Loureiro


We derive an approximate formula for the generalization error of deep neural networks with structured (random) features, confirming a widely believed conjecture. We also show that our results can capture feature maps learned by deep, finite-width neural networks trained under gradient descent.

List of all publications

Selected projects

List of all projects


ETH Zurich
Rämistrasse 101
HG E 66.1
8092 Zurich