
Mondelli Group

Data Science, Machine Learning, and Information Theory

We are at the center of a revolution in information technology, with data being the most valuable commodity. Exploiting this rapidly growing wealth of data requires addressing complex inference problems, and the Mondelli group works to develop mathematically principled solutions.

These inference problems span several fields and arise in a variety of applications in engineering and the natural sciences. In particular, the Mondelli group focuses on wireless communications and machine learning. In wireless communications, given a transmission channel, the goal is to send information encoded as a message while optimizing certain metrics, such as complexity, reliability, latency, throughput, or bandwidth. In machine learning, given a model for the observations, the goal is to understand how many samples convey sufficient information to perform a certain task and how to utilize those samples optimally. Both the vision and the toolkit adopted by the Mondelli group are inspired by information theory, which leads to the investigation of the following fundamental questions: What is the minimal amount of information necessary to solve a given inference problem? Given this minimal amount of information, is it possible to design a low-complexity algorithm? What are the fundamental trade-offs between the parameters at play (e.g., problem dimensionality, sample size, computational complexity)?
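As a canonical illustration of the first question in the channel-coding setting, Shannon's noisy-channel coding theorem characterizes the largest rate at which information can be transmitted reliably: the channel capacity, given by the mutual information between input and output, maximized over input distributions. (This formula is standard information theory, included here only as an illustrative example.)

```latex
C = \max_{p(x)} I(X;Y)
```

Reliable communication is possible at any rate below \(C\) and impossible above it, so \(C\) quantifies exactly the "minimal amount of information" trade-off in this setting.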




Team


Al Depope

PhD Student


Clementine Domine

Postdoc



Yegor Gorodzha

Scientific Intern


Emilie Gregoire

Academic Visitor


Filip Kovacevic

PhD Student



Kevin Kögler

PhD Student


Valentin Roth

Scientific Intern


Peter Sukenik

PhD Student



Current Projects

Fundamental limits and efficient algorithms for deep learning | Non-convex optimization in high dimensions | Optimal code design for short block lengths


Publications

Gozeten HA, Ildiz ME, Zhang X, Soltanolkotabi M, Mondelli M, Oymak S. 2025. Test-time training provably improves transformers as in-context learners. Proceedings of the 42nd International Conference on Machine Learning. ICML: International Conference on Machine Learning, PMLR, vol. 267, 20266–20295.

Zhang Y, Ji HC, Venkataramanan R, Mondelli M. 2025. Spectral estimators for structured generalized linear models via approximate message passing. Mathematical Statistics and Learning. 8(3–4), 193–304.

Bombari S, Mondelli M. 2025. Spurious correlations in high dimensional regression: The roles of regularization, simplicity bias and over-parameterization. Proceedings of the 42nd International Conference on Machine Learning. ICML: International Conference on Machine Learning, PMLR, vol. 267, 4839–4873.

Wu D, Mondelli M. 2025. Neural collapse beyond the unconstrained features model: Landscape, dynamics, and generalization in the mean-field regime. Proceedings of the 42nd International Conference on Machine Learning. ICML: International Conference on Machine Learning, PMLR, vol. 267, 67499–67536.

Kovačević F, Zhang Y, Mondelli M. 2025. Spectral estimators for multi-index models: Precise asymptotics and optimal weak recovery. Proceedings of the 38th Conference on Learning Theory. COLT: Conference on Learning Theory, PMLR, vol. 291, 3354–3404.

View All Publications

ReX-Link: Marco Mondelli


Career

Since 2025 Professor, Institute of Science and Technology Austria (ISTA)
2019 – 2025 Assistant Professor, Institute of Science and Technology Austria (ISTA)
2017 – 2019 Postdoc, Stanford University, Stanford, USA
2018 Research Fellow, Simons Institute for the Theory of Computing, Berkeley, USA
2016 PhD, EPFL, Lausanne, Switzerland 


Selected Distinctions

2019 Lopez-Loreta Prize
2018 Simons-Berkeley Research Fellowship
2018 EPFL Doctorate Award
2017 Early Postdoc Mobility Fellowship, Swiss National Science Foundation
2016 Best Paper Award, ACM Symposium on Theory of Computing (STOC)
2015 Best Student Paper Award, IEEE International Symposium on Information Theory (ISIT)
2015 Dan David Prize Scholarship


Additional Information

Download CV
View Marco Mondelli’s website


