Ilias Zadik

Yale University,
Department of Statistics and Data Science.

Contact: ilias.zadik at yale.edu
Links: CV (last updated 10/23/22), Google Scholar, arXiv.

About me

I am an Assistant Professor at Yale University, in the Department of Statistics and Data Science.

I am searching for highly motivated PhD students. If you are interested in working with me, apply to our PhD program and mention my name.

Currently teaching (Fall 2024): S&DS 688 01: Computational and statistical trade-offs in high dimensional statistics

Research Interests
My research lies broadly at the interface of high dimensional statistics, the theory of machine learning, the theory of computation, and probability theory. Much of my work aims to build and use mathematical tools that bring insight into the computational and statistical challenges of modern machine learning tasks. Four directions I have recently been focusing on are:

  • Computational-statistical trade-offs in inference (see papers 4, 12, 16, 20, 22, 24, 27, 28, 33, 34, 36 below).
  • Phase transition phenomena (e.g. the "All-or-Nothing phenomenon") (see papers 9, 11, 13, 17, 19, 31, 32, 33 below).
  • Connections between statistics and cryptographic methods (see papers 7, 18, 20, 29, 35 below).
  • The cost of (differential) privacy in statistics (see papers 6, 14, 21 below).


Short Bio (prior to current position)
From September 2021 to August 2023 I was a postdoctoral researcher at the Massachusetts Institute of Technology (MIT), Mathematics Department, where I worked under the wonderful mentorship of Elchanan Mossel and Nike Sun, as a member of the NSF/Simons Collaboration on the Theoretical Foundations of Deep Learning.
Prior to this, from September 2019 to August 2021 I was a postdoctoral fellow at the Center for Data Science of New York University and a member of its Math and Data (MaD) group.
I received my PhD in September 2019 from the Operations Research Center of MIT, where I was very fortunate to be advised by Prof. David Gamarnik. A copy of my PhD thesis can be found here.
From June 2017 to August 2017 I was an intern at the Microsoft Research Lab in New England, mentored by Jennifer Chayes and Christian Borgs. Prior to joining MIT, I completed a Master of Advanced Studies in Mathematics (Part III of the Mathematical Tripos) at the University of Cambridge and a BA in Mathematics from the Mathematics Department of the University of Athens.
Some recent recorded talks

Research papers (published or submitted)
(Note: author order is alphabetical, unless denoted by (*).)
    2024+
  36. On The MCMC Performance In Bernoulli Group Testing And The Random Max Set Cover Problem
    ArXiv preprint
    Max Lovig, Ilias Zadik.
  35. On the hardness of learning one hidden layer neural networks
    ArXiv preprint
    Shuchen Li, Ilias Zadik, Manolis Zampetakis.
  34. On the Low-Temperature MCMC threshold: the cases of sparse tensor PCA, sparse regression, and a geometric rule
    ArXiv preprint
    Zongchen Chen, Conor Sheehan, Ilias Zadik.
  33. Sharp Thresholds Imply Circuit Lower Bounds: from random 2-SAT to Planted Clique
    ArXiv preprint
    David Gamarnik, Elchanan Mossel, Ilias Zadik.
  32. A Second Moment Proof of the Spread Lemma
    ArXiv preprint
    Elchanan Mossel, Jonathan Niles-Weed, Nike Sun, Ilias Zadik.
  31. On the Second Kahn-Kalai Conjecture (45mins video - Simons workshop)
    ArXiv preprint
    Elchanan Mossel, Jonathan Niles-Weed, Nike Sun, Ilias Zadik.
  30. Transfer Learning Beyond Bounded Density Ratios
    ArXiv preprint
    Alkis Kalavasis, Ilias Zadik, Manolis Zampetakis.
  29. Low-degree Security of the Planted Random Subgraph Problem
    Proceedings of the Theory of Cryptography Conference (TCC), 2024
    Andrej Bogdanov, Chris Jones, Alon Rosen, Ilias Zadik.
  28. Counting Stars is Constant-Degree Optimal For Detecting Any Planted Subgraph
    Proceedings of the Conference on Learning Theory (COLT), 2024
    Xifan Yu, Ilias Zadik, Peiyuan Zhang.
  27. The Landscape of the Planted Clique Problem: Dense Subgraphs and the Overlap Gap Property (1hr video - NYU Probability Seminar)
    Annals of Applied Probability, 2024
    David Gamarnik, Ilias Zadik.
  26. Stationary Points of Shallow Neural Networks with Quadratic Activation Function (30mins video by Eren - MIT MLTea)
    Mathematics of Operations Research, 2024
    David Gamarnik, Eren C. Kızıldağ, Ilias Zadik.
    2023
  25. Sharp thresholds in inference of planted subgraphs
    Annals of Applied Probability, 2024+
    Conference version in Proceedings of the Conference on Learning Theory (COLT), 2023
    Elchanan Mossel, Jonathan Niles-Weed, Youngtak Sohn, Nike Sun, Ilias Zadik.
  24. Almost-Linear Planted Cliques Elude the Metropolis Process (1h video - BIRS workshop)
    Random Structures and Algorithms, 2024+
    Conference version in Proceedings of the Symposium on Discrete Algorithms (SODA), 2023
    Zongchen Chen, Elchanan Mossel, Ilias Zadik.
  23. Shapes and recession cones in mixed-integer convex representability
    Mathematical Programming, 2023
    Ilias Zadik, Miles Lubin, Juan Pablo Vielma. (*)
    2022
  22. The Franz-Parisi Criterion and Computational Trade-offs in High Dimensional Statistics (5mins video - NeurIPS)
    Advances in Neural Information Processing Systems, (NeurIPS), 2022
    Selected for an Oral Presentation.
    Afonso Bandeira, Ahmed El Alaoui, Sam Hopkins, Tselil Schramm, Alex Wein, Ilias Zadik.
  21. Archimedes Meets Privacy: On Privately Estimating Quantiles in High Dimensions Under Minimal Assumptions (5mins video by Dan - NeurIPS)
    Advances in Neural Information Processing Systems, (NeurIPS), 2022
    Omri Ben-Eliezer, Dan Mikulincer, Ilias Zadik.
  20. Lattice-Based Methods Surpass Sum-of-Squares in Clustering (20mins video by Min Jae - COLT)
    Proceedings of the Conference on Learning Theory (COLT), 2022
    Ilias Zadik, Min Jae Song, Alex Wein, Joan Bruna. (*)
  19. Statistical and Computational Phase Transitions in Group Testing (20mins video - COLT)
    Proceedings of the Conference on Learning Theory (COLT), 2022
    Amin Coja-Oghlan, Oliver Gebhard, Max Hahn-Klimroth, Alex Wein, Ilias Zadik.
    2021
  18. On the Cryptographic Hardness of Learning Single Periodic Neurons (20mins video - Deep Learning Theory Symposium, Simons)
    Advances in Neural Information Processing Systems, (NeurIPS), 2021
    Min Jae Song, Ilias Zadik, Joan Bruna. (*)
  17. It was “all” for “nothing”: sharp phase transitions for noiseless discrete channels (18mins video - COLT)
    IEEE Transactions on Information Theory, 2023
    Conference version in Proceedings of the Conference on Learning Theory (COLT), 2021
    Jonathan Niles-Weed, Ilias Zadik.
  16. Group testing and local search: is there a computational-statistical gap? (2hrs video by Fotis - IAS)
    Proceedings of the Conference on Learning Theory (COLT), 2021
    Fotis Iliopoulos, Ilias Zadik.
  15. Self-Regularity of Non-Negative Output Weights for Overparameterized Two-Layer Neural Networks
    IEEE Transactions on Signal Processing, 2021
    Conference version in Proceedings of the International Symposium on Information Theory (ISIT), 2021
    David Gamarnik, Eren C. Kızıldağ, Ilias Zadik.
    2020
  14. Optimal Private Median Estimation under Minimal Distributional Assumptions (10mins video by Manolis - NeurIPS spotlight)
    Advances in Neural Information Processing Systems, (NeurIPS), 2020
    Selected for a Spotlight Presentation (~5% of submitted papers).
    Christos Tzamos, Manolis Vlatakis, Ilias Zadik.
  13. The All-or-Nothing Phenomenon in Sparse Tensor PCA (Poster, 25mins video - BIRS workshop)
    Advances in Neural Information Processing Systems, (NeurIPS), 2020
    Jonathan Niles-Weed, Ilias Zadik.
  12. Free Energy Wells and the Overlap Gap Property in Sparse PCA (25mins video - Simons workshop)
    Communications on Pure and Applied Mathematics, 2023
    Conference version in Proceedings of the Conference of Learning Theory (COLT), 2020
    Gérard Ben Arous, Alex Wein, Ilias Zadik.
    2019
  11. The All-or-Nothing Phenomenon in Sparse Linear Regression (Slides, Poster)
    Mathematical Statistics and Learning, 2021
    Conference version in the Proceedings of the Conference on Learning Theory (COLT), 2019
    Galen Reeves, Jiaming Xu, Ilias Zadik.
  10. All-or-Nothing Phenomena: From Single-Letter to High Dimensions
    Proceedings of the International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), 2019
    Galen Reeves, Jiaming Xu, Ilias Zadik.
  9. Improved bounds on Gaussian MAC and sparse regression via Gaussian inequalities
    Proceedings of the International Symposium on Information Theory (ISIT), 2019
    Ilias Zadik, Christos Thrampoulidis, Yury Polyanskiy. (*)
  8. A simple bound on the BER of the MAP decoder for massive MIMO systems
    Proceedings of the International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2019
    Christos Thrampoulidis, Ilias Zadik, Yury Polyanskiy. (*)
    2018
  7. Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection
    IEEE Transactions on Information Theory, 2021
    Conference version with David Gamarnik, in Advances in Neural Information Processing Systems, (NeurIPS), 2018
    David Gamarnik, Eren C. Kızıldağ, Ilias Zadik.
  6. Revealing Network Structure, Confidentially: Improved Rates for Node-Private Graphon Estimation (Slides, 1h video by Adam - Simons)
    Proceedings of the Symposium on Foundations of Computer Science (FOCS), 2018
    Christian Borgs, Jennifer Chayes, Adam Smith, Ilias Zadik.
  5. Orthogonal Machine Learning: Power and Limitations (Slides, Poster, Code)
    Proceedings of the International Conference on Machine Learning (ICML), 2018 (20-minute presentation)
    Lester Mackey, Vasilis Syrgkanis, Ilias Zadik.
    2017
  4. Sparse High-Dimensional Linear Regression. Estimating Squared Error and a Phase Transition.
    Annals of Statistics, 2022
    David Gamarnik, Ilias Zadik.
    This paper merges:
    (a) High-Dimensional Regression with Binary Coefficients. Estimating Squared Error and a Phase Transition (Slides, Poster, 20mins video)
    Proceedings of the Conference on Learning Theory (COLT), 2017 (20-minute presentation)
    (b) Sparse High-Dimensional Linear Regression. Algorithmic Barriers and a Local Search Algorithm
    arXiv preprint, 2017
  3. Mixed integer convex representability (Slides)
    Mathematics of Operations Research, 2021
    Conference version in Proceedings of the International Conference on Integer Programming and Combinatorial Optimization (IPCO), 2017
    Miles Lubin, Juan Pablo Vielma, Ilias Zadik.
    Pre-2017 (complex analysis)
  2. Universal Padé approximants and their behaviour on the boundary
    Monatshefte für Mathematik, Vol. 182, pp. 173–193, 2017
    Ilias Zadik.
  1. Padé approximants, density of rational functions in A^∞(V) and smoothness of the integration operator
    Journal of Mathematical Analysis and Applications, Vol. 423, pp. 1514–1539, 2015
    Vassili Nestoridis, Ilias Zadik.
Thesis/Notes/Survey Articles
Service
  • From Spring 2020 to Spring 2021 I had the pleasure of being among the organizers of the virtual MaD+ seminar, which ran during the COVID-19 pandemic.
  • I have served as a reviewer for the Annals of Statistics, Operations Research, Probability Theory and Related Fields, Mathematical Programming, SIAM Journal on Discrete Mathematics, SIAM Journal on Optimization, Combinatorica, IEEE Journal on Selected Areas in Information Theory, and for the conferences COLT, NeurIPS, FOCS, STOC, ITCS, ISIT, ICALP and SODA.
  • Program Committees: COLT 2023, COLT 2022 and COLT 2021.
Past teaching
    Instructor
  • Spring 2024, S&DS 351 01: Stochastic Processes (Canvas link)
  • Fall 2023, S&DS 688 01: Computational and statistical trade-offs in high dimensional statistics
    A newly designed, research-level course on computational-statistical trade-offs (see the link for scribe notes and details).
  • Fall 2020, DS-GA 1005 (NYU): Inference and Representation. (Co-instructor: Joan Bruna)
    Co-designed and taught an advanced graduate-level course on modern theoretical aspects of statistics and machine learning.
  • Fall 2019, DS-GA 1002 (NYU): Probability and Statistics for Data Science. (Co-instructor: Carlos Fernandez-Granda)
    Introductory graduate-level course on probability and statistics.
    Teaching Assistant
  • Spring 2017, 6.265/15.070 (MIT): Modern Discrete Probability. (Instructors: Guy Bresler, Yury Polyanskiy)
    Advanced graduate-level course on discrete probability and modern applications to computer science.
  • Fall 2016, 6.436J/15.085J (MIT): Fundamentals of Probability. (Instructor: David Gamarnik)
    Introductory graduate-level course on probability.
Some awards
  • Honorable Mention for MIT Operations Research Center Best Student Paper Award, 2017
    Paper (no. 4 above): High-Dimensional Regression with Binary Coefficients. Estimating Squared Error and a Phase Transition.
  • Senior Scholarship from Trinity College, University of Cambridge, 2014.
    Awarded for achieving "distinction" (ranked 13th out of 247 students) in Part III examination in Mathematics.
  • The Onassis Foundation Scholarship for Master Studies, 2013-2014
    Partially supported my master's studies at the University of Cambridge.
  • The Cambridge Home and European Scholarship Scheme (CHESS) award, 2013-2014.
    Partially supported my master's studies at the University of Cambridge.
  • International Mathematics Competition for University Students (IMC)
    First Prize, 2011, Second Prize, 2010.
  • South Eastern European Mathematics Olympiad for University Students (SEEMOUS)
    Gold Medal (first place, scored 39.5/40), 2011, Silver Medal, 2010.