About me
I am a postdoctoral researcher in the Mathematics Department at MIT, working with Prof. Elchanan Mossel and Prof. Nike Sun, as a member of the NSF/Simons Collaboration on the Theoretical Foundations of Deep Learning.
Research Interests
My research lies broadly at the interface of high-dimensional statistics, the theory of machine learning and computation, and applied probability. Much of my work aims to build and use mathematical tools that shed light on the computational and statistical challenges of modern machine learning tasks.
Four directions I have recently been focusing on are:
- Computational-statistical trade-offs in inference (see papers 4, 11, 13, 17, 22, 24, 25 below).
- (Sharp) statistical phase transitions (the "All-or-Nothing phenomenon") (see papers 10, 12, 14, 18, 22 below).
- The power of lattice-based methods in inference (see papers 7, 19, 23 below).
- The cost of (differential) privacy in statistics (see papers 6, 15 below).
Short Bio (prior to current position)
From September 2019 to August 2021 I was a CDS Moore-Sloan (postdoctoral) fellow at the Center for Data Science of New York University and a member of its Math and Data (MaD) group.
I received my PhD in September 2019 from the Operations Research Center of the Massachusetts Institute of Technology (MIT), where I was very fortunate to be advised by Prof. David Gamarnik. A copy of my PhD thesis can be found here.
From June 2017 to August 2017 I was an intern at the Microsoft Research Lab in New England, mentored by Jennifer Chayes and Christian Borgs. Prior to joining MIT, I completed a Master of Advanced Studies in Mathematics (Part III of the Mathematical Tripos) at the University of Cambridge and a BA in Mathematics from the Mathematics Department at the University of Athens.
Recent recorded talks
Research papers (published or under review)
(Note: the order of the authors is alphabetical, unless denoted by (*))
2022+
- The Franz-Parisi Criterion and Computational Trade-offs in High Dimensional Statistics
  Submitted
  Afonso Bandeira, Ahmed El Alaoui, Sam Hopkins, Tselil Schramm, Alex Wein, Ilias Zadik.
- Almost-Linear Planted Cliques Elude the Metropolis Process
  Submitted
  Zongchen Chen, Elchanan Mossel, Ilias Zadik.
- Lattice-Based Methods Surpass Sum-of-Squares in Clustering
  To appear in Conference on Learning Theory (COLT), 2022
  Ilias Zadik, Min Jae Song, Alex Wein, Joan Bruna. (*)
- Statistical and Computational Phase Transitions in Group Testing
  To appear in Conference on Learning Theory (COLT), 2022
  Amin Coja-Oghlan, Oliver Gebhard, Max Hahn-Klimroth, Alex Wein, Ilias Zadik.
- Shapes and recession cones in mixed-integer convex representability
  Mathematical Programming (Major Revisions)
  Ilias Zadik, Miles Lubin, Juan Pablo Vielma. (*)
- Stationary Points of Shallow Neural Networks with Quadratic Activation Function (30mins video by Eren - MIT MLTea)
  Submitted
  David Gamarnik, Eren C. Kızıldağ, Ilias Zadik.
2021
- On the Cryptographic Hardness of Learning Single Periodic Neurons
  Advances in Neural Information Processing Systems (NeurIPS), 2021
  Min Jae Song, Ilias Zadik, Joan Bruna. (*)
- It was “all” for “nothing”: sharp phase transitions for noiseless discrete channels (18mins video - COLT)
  Proceedings of the Conference on Learning Theory (COLT), 2021
  Jonathan Niles-Weed, Ilias Zadik.
- Group testing and local search: is there a computational-statistical gap? (2hrs video by Fotis - IAS)
  Proceedings of the Conference on Learning Theory (COLT), 2021
  Fotis Iliopoulos, Ilias Zadik.
- Self-Regularity of Non-Negative Output Weights for Overparameterized Two-Layer Neural Networks
  IEEE Transactions on Signal Processing, 2022+
  Conference version in Proceedings of the International Symposium on Information Theory (ISIT), 2021
  David Gamarnik, Eren C. Kızıldağ, Ilias Zadik.
2020
- Optimal Private Median Estimation under Minimal Distributional Assumptions (10mins video by Manolis - NeurIPS spotlight)
  Advances in Neural Information Processing Systems (NeurIPS), 2020
  Selected for a Spotlight Presentation (~5% of submitted papers).
  Christos Tzamos, Manolis Vlatakis, Ilias Zadik.
- The All-or-Nothing Phenomenon in Sparse Tensor PCA (Poster, 25mins video - BIRS workshop)
  Advances in Neural Information Processing Systems (NeurIPS), 2020
  Jonathan Niles-Weed, Ilias Zadik.
- Free Energy Wells and the Overlap Gap Property in Sparse PCA (25mins video - Simons workshop)
  Communications on Pure and Applied Mathematics, 2022+
  Conference version in Proceedings of the Conference on Learning Theory (COLT), 2020
  Gérard Ben Arous, Alex Wein, Ilias Zadik.
2019
- The All-or-Nothing Phenomenon in Sparse Linear Regression (Slides, Poster)
  Mathematics of Statistics and Learning, 2021
  Conference version in the Proceedings of the Conference on Learning Theory (COLT), 2019
  Galen Reeves, Jiaming Xu, Ilias Zadik.
- The Landscape of the Planted Clique Problem: Dense Subgraphs and the Overlap Gap Property (1hr video - NYU Probability Seminar)
  Annals of Applied Probability (Major Revisions)
  David Gamarnik, Ilias Zadik.
- All-or-Nothing Phenomena: From Single-Letter to High Dimensions
  Proceedings of the International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP), 2019
  Galen Reeves, Jiaming Xu, Ilias Zadik.
- Improved bounds on Gaussian MAC and sparse regression via Gaussian inequalities
  Proceedings of the International Symposium on Information Theory (ISIT), 2019
  Ilias Zadik, Christos Thrampoulidis, Yury Polyanskiy. (*)
- A simple bound on the BER of the MAP decoder for massive MIMO systems
  Proceedings of the International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2019
  Christos Thrampoulidis, Ilias Zadik, Yury Polyanskiy. (*)
2018
- Inference in High-Dimensional Linear Regression via Lattice Basis Reduction and Integer Relation Detection
  IEEE Transactions on Information Theory, 2021
  Conference version with David Gamarnik, in Advances in Neural Information Processing Systems (NeurIPS), 2018
  David Gamarnik, Eren C. Kızıldağ, Ilias Zadik.
- Revealing Network Structure, Confidentially: Improved Rates for Node-Private Graphon Estimation (Slides, 1h video by Adam - Simons)
  Proceedings of the Symposium on Foundations of Computer Science (FOCS), 2018
  Christian Borgs, Jennifer Chayes, Adam Smith, Ilias Zadik.
- Orthogonal Machine Learning: Power and Limitations (Slides, Poster, Code)
  Proceedings of the International Conference on Machine Learning (ICML), 2018 (20-minute presentation)
  Lester Mackey, Vasilis Syrgkanis, Ilias Zadik.
2017
- Sparse High-Dimensional Linear Regression. Estimating Squared Error and a Phase Transition.
  Annals of Statistics, 2022
  David Gamarnik, Ilias Zadik.
  This paper merges:
  (a) High-Dimensional Regression with Binary Coefficients. Estimating Squared Error and a Phase Transition (Slides, Poster, 20mins video)
      Proceedings of the Conference on Learning Theory (COLT), 2017 (20-minute presentation)
  (b) Sparse High-Dimensional Linear Regression. Algorithmic Barriers and a Local Search Algorithm
      arXiv preprint, 2017
- Mixed integer convex representability (Slides)
  Mathematics of Operations Research, 2021
  Conference version in Proceedings of the International Conference on Integer Programming and Combinatorial Optimization (IPCO), 2017
  Miles Lubin, Juan Pablo Vielma, Ilias Zadik.
Pre-2017 (complex analysis):
- Universal Padé approximants and their behaviour on the boundary
  Monatshefte für Mathematik, Vol. 182, pp. 173–193, 2017
  Ilias Zadik.
- Padé approximants, density of rational functions in A^∞(V) and smoothness of the integration operator
  Journal of Mathematical Analysis and Applications, Vol. 423, pp. 1514–1539, 2015
  Vassili Nestoridis, Ilias Zadik.
Thesis/Notes/Survey Articles
- Computational and Statistical Challenges in High Dimensional Statistical Models
  PhD Thesis, Operations Research Center, Massachusetts Institute of Technology, 2019
- Private Algorithms Can Always Be Extended
  Note on the extension of private algorithms
  Christian Borgs, Jennifer Chayes, Adam Smith, Ilias Zadik.
- Noise Sensitivity with applications to Percolation and Social Choice Theory
  Part III Essay, 2014
  Advised by Béla Bollobás.
- A Note on the Density of Rational Functions in A^∞(Ω)
  A complex analysis note on the density of rational functions
  New Trends in Approximation Theory, Vol. 81, pp. 27–35
  Javier Falcó, Vassili Nestoridis, Ilias Zadik.
Service
- From Spring 2020 to Spring 2021 I had the pleasure of being among the organizers of the virtual MaD+ seminar, which ran during the COVID-19 pandemic.
- I have served as a reviewer for Annals of Statistics, Operations Research, Probability Theory and Related Fields, Mathematical Programming, SIAM Journal on Discrete Mathematics, SIAM Journal on Optimization, Combinatorica, IEEE Journal on Selected Areas in Information Theory, and for the conferences COLT, NeurIPS, FOCS, STOC, ITCS, ISIT, ICALP and SODA.
- I served on the Program Committee for COLT 2022 and COLT 2021.
Teaching
Instructor
- Fall 2020, DS-GA 1005 (NYU): Inference and Representation. (Co-instructor: Joan Bruna)
Co-designed and taught an advanced graduate-level course on modern theoretical aspects of statistics and machine learning.
- Fall 2019, DS-GA 1002 (NYU): Probability and Statistics for Data Science. (Co-instructor: Carlos Fernandez-Granda)
Introductory graduate-level course on probability and statistics.
Teaching Assistant
- Spring 2017, 6.265/15.070 (MIT): Modern Discrete Probability. (Instructors: Guy Bresler, Yury Polyanskiy)
Advanced graduate-level course on discrete probability and modern applications to computer science.
- Fall 2016, 6.436J/15.085J (MIT): Fundamentals of Probability. (Instructor: David Gamarnik)
Introductory graduate-level course on probability.
Awards
- Top 400 Reviewers Award for NeurIPS, 2019.
- Honorable Mention for the MIT Operations Research Center Best Student Paper Award, 2017.
  Paper: High-Dimensional Regression with Binary Coefficients. Estimating Squared Error and a Phase Transition.
- Senior Scholarship from Trinity College, Cambridge University, 2014.
- The Onassis Foundation Scholarship for Master Studies, 2013-2014.
- The Cambridge Home and European Scholarship Scheme (CHESS) award, 2013-2014.
- International Mathematics Competition for University Students (IMC): First Prize, 2011; Second Prize, 2010.
- South Eastern European Mathematics Olympiad for University Students (SEEMOUS): Gold Medal (first place), 2011; Silver Medal, 2010.