Tommaso d'Orsi

Email: tommaso (dot) dorsi (at) unibocconi (dot) it
Office: 4-h-02

I am a Research Fellow at Bocconi, hosted by Luca Trevisan. I am also a Visiting Researcher at Google.

I recently received my PhD from ETH Zurich, where I was fortunate to have David Steurer as my advisor. During my PhD, I spent several months at Google Research, where Vincent Cohen-Addad was my host.

I am broadly interested in computer science, from both a theoretical and an applied perspective. The bulk of my research lies at the intersection of algorithm design, learning theory, privacy, and computational complexity.

Dissertation: Information-computation gaps in robust statistics [pdf]
Recipient of the ETH Medal, 2023.


Publications:

Max-Cut with ε-Accurate Predictions [arXiv]
with Vincent Cohen-Addad, Anupam Gupta, Euiwoong Lee and Debmalya Panigrahi, in submission.

A Near-Linear Time Approximation Algorithm for Beyond-Worst-Case Graph Clustering
with Vincent Cohen-Addad and Aida Mousavifar, ICML 2024.

Multi-View Stochastic Block Models
with Vincent Cohen-Addad, Silvio Lattanzi and Rajai Nasser, ICML 2024.

Perturb-and-Project: Differentially Private Similarities and Marginals
with Vincent Cohen-Addad, Alessandro Epasto, Vahab Mirrokni and Peilin Zhong, ICML 2024.

Private graphon estimation via sum-of-squares [arXiv]
with Hongjie Chen, Jingqiu Ding, Yiding Hua, Chih-Hung Liu and David Steurer, STOC 2024.

Private estimation algorithms for stochastic block models and mixture models [arXiv]
with Hongjie Chen, Vincent Cohen-Addad, Alessandro Epasto, Jacob Imola, David Steurer and Stefan Tiegel, NeurIPS 2023 (spotlight).

Reaching the Kesten-Stigum Threshold in the Stochastic Block Model under Node Corruptions [arXiv]
with Jingqiu Ding, Yiding Hua and David Steurer, COLT 2023.

A Ihara-Bass formula for non-boolean matrices and strong refutations of random CSPs [arXiv]
with Luca Trevisan, CCC 2023.

Higher degree sum-of-squares relaxations robust against oblivious outliers [arXiv]
with Rajai Nasser, Gleb Novikov and David Steurer, SODA 2023.

On the well-spread property and its relation to linear regression [arXiv]
with Hongjie Chen, COLT 2022.

Fast algorithm for overcomplete order-3 tensor decomposition [arXiv]
with Jingqiu Ding, Chih-Hung Liu, David Steurer and Stefan Tiegel, COLT 2022.

Robust Recovery for Stochastic Block Models [arXiv]
with Jingqiu Ding, Rajai Nasser and David Steurer, FOCS 2021.

Consistent Estimation for PCA and Sparse Regression with Oblivious Outliers [arXiv]
with Chih-Hung Liu, Rajai Nasser, Gleb Novikov, David Steurer and Stefan Tiegel, NeurIPS 2021.

The Complexity of Sparse Tensor PCA [arXiv]
with Davin Choo, NeurIPS 2021.

Consistent regression when oblivious outliers overwhelm [arXiv]
with Gleb Novikov and David Steurer, ICML 2021.

Sparse PCA: Algorithms, Adversarial Perturbations and Certificates [arXiv]
with Pravesh Kothari, Gleb Novikov and David Steurer, FOCS 2020.

Coloring graphs with no clique immersion
with Paul Wollan, DM 2018.


Teaching:

2022:
Algorithms and Data Structures (Head TA)
2021:
Algorithms and Data Structures (Head TA)
Optimization for Data Science (TA)
2020:
Algorithms and Data Structures (TA)
Optimization for Data Science (TA)
Presenting Theoretical Computer Science (TA)
2019:
Algorithms and Data Structures (TA)
Optimization for Data Science (TA)
2018:
Algorithms and Data Structures (TA)