Tommaso d'Orsi
Email: tommaso (dot) dorsi (at) unibocconi (dot) it
Office: 2-C1-17
I am an Assistant Professor in the Department of Computing Sciences at Bocconi.
I am broadly interested in computer science, from both a theoretical and applied perspective. My research focuses on algorithm design, computational complexity, learning theory and privacy.
I received my PhD from ETH Zurich, where I was fortunate to have David Steurer as my advisor. After that,
I spent a year as a Research Fellow at Bocconi hosted by Luca Trevisan.
I am a frequent visitor to Google Research, where I spent part of my PhD and postdoc.
Teaching:
Winter 2024: (0874) Algorithms for Optimization and Inference [Blackboard]
Office hours: by appointment.
Publications:
Max-Cut with ε-Accurate Predictions
[arXiv]
with
Vincent Cohen-Addad,
Anupam Gupta,
Euiwoong Lee and
Debmalya Panigrahi,
NeurIPS 2024.
A Near-Linear Time Approximation Algorithm for Beyond-Worst-Case Graph Clustering
[arXiv]
with
Vincent Cohen-Addad
and Aida Mousavifar,
ICML 2024.
Multi-View Stochastic Block Models
[arXiv]
with
Vincent Cohen-Addad,
Silvio Lattanzi,
and Rajai Nasser,
ICML 2024.
Perturb-and-Project: Differentially Private Similarities and Marginals
[arXiv]
with
Vincent Cohen-Addad,
Alessandro Epasto,
Vahab Mirrokni
and Peilin Zhong,
ICML 2024 (spotlight), TPDP 2024.
Private graphon estimation via sum-of-squares
[arXiv]
with
Hongjie Chen,
Jingqiu Ding, Yiding Hua, Chih-Hung Liu
and David Steurer,
STOC 2024.
Private estimation algorithms for stochastic block models and mixture models
[arXiv]
with
Hongjie Chen,
Vincent Cohen-Addad,
Alessandro Epasto,
Jacob Imola,
David Steurer,
and Stefan Tiegel,
NeurIPS 2023 (spotlight).
Information-computation gaps in robust statistics
[pdf]
Dissertation. Recipient of the ETH Medal 2023.
Reaching the Kesten-Stigum Threshold in the Stochastic Block Model under Node Corruptions
[arXiv]
with
Jingqiu Ding, Yiding Hua,
and David Steurer,
COLT 2023.
A Ihara-Bass formula for non-boolean matrices and strong refutations of random CSPs
[arXiv]
with
Luca Trevisan, CCC 2023.
Higher degree sum-of-squares relaxations robust against oblivious outliers
[arXiv]
with
Rajai Nasser, Gleb Novikov and David Steurer,
SODA 2023.
On the well-spread property and its relation to linear regression
[arXiv]
with
Hongjie Chen, COLT 2022.
Fast algorithm for overcomplete order-3 tensor decomposition
[arXiv]
with
Jingqiu Ding, Chih-Hung Liu, David Steurer and Stefan Tiegel, COLT 2022.
Robust Recovery for Stochastic Block Models
[arXiv]
with
Jingqiu Ding, Rajai Nasser and David Steurer,
FOCS 2021.
Consistent Estimation for PCA and Sparse Regression with Oblivious Outliers
[arXiv]
with
Chih-Hung Liu, Rajai Nasser, Gleb Novikov, David Steurer and Stefan Tiegel,
NeurIPS 2021.
The Complexity of Sparse Tensor PCA
[arXiv]
with
Davin Choo,
NeurIPS 2021.
Consistent regression when oblivious outliers overwhelm
[arXiv]
with
Gleb Novikov and David Steurer,
ICML 2021.
Sparse PCA: Algorithms, Adversarial Perturbations and Certificates
[arXiv]
with
Pravesh Kothari, Gleb Novikov and David Steurer,
FOCS 2020.
Coloring graphs with no clique immersion
with
Paul Wollan,
DM 2018.