I am a PhD student at MIT, advised by Constantinos Daskalakis. I received Bachelor's and Master's degrees from the Technion, Israel, under the supervision of Prof. Yuval Filmus.
I am interested in theoretical computer science, especially learning theory and complexity. My recent work is on learning under computational constraints: limited memory or communication, privacy constraints, and learning with statistical queries.
Contact information and C.V.
Email: dagan at mit.edu
- Yuval Dagan, Yuval Filmus, Ariel Gabizon, Shay Moran. Twenty (short) questions. Combinatorica 39:597, 2019. Link
- Yuval Dagan, Yuval Filmus, Hamed Hatami, Yaqiao Li. Trading information complexity for error. Theory of Computing 14(6):1-73, 2018. (CCC 2017 special issue) Link
- Yuval Dagan, Vitaly Feldman. Interaction is necessary for distributed learning with privacy or communication constraints. Accepted to STOC 2020. Link
(Resolves a COLT 2019 open problem)
- Yuval Dagan, Ohad Shamir. Detecting correlations with little memory and communication. Proceedings of the 31st Conference on Learning Theory, PMLR 75:1145-1198, 2018. Link
- Yuval Dagan, Gil Kur, Ohad Shamir. Space lower bounds for linear prediction in the streaming model. Conference on Learning Theory, 2019. Link
- Yuval Dagan, Constantinos Daskalakis, Nishanth Dikkala, Siddhartha Jayanti. Learning from weakly dependent data under Dobrushin's condition. Conference on Learning Theory, 2019. Link
- Yuval Dagan, Yuval Filmus, Ariel Gabizon, Shay Moran. Twenty (simple) questions. Proceedings of the 49th Annual ACM SIGACT Symposium on Theory of Computing (STOC 2017), pp. 9-21. Link
Invited presentation at Highlights of Algorithms 2018
- Yuval Dagan, Yuval Filmus, Hamed Hatami, Yaqiao Li. Trading information complexity for error. 32nd Computational Complexity Conference (CCC 2017), pp. 16:1-16:59. Link
- Yuval Dagan, Koby Crammer. A better resource allocation algorithm with semi-bandit feedback. Algorithmic Learning Theory, 2018.
Other publications (not yet published at a conference)
- Yuval Dagan, Yuval Filmus, Daniel Kane, Shay Moran. The entropy of lies: playing twenty questions with a liar. Link
- Yuval Dagan and Gil Kur. The Log-Concave Maximum Likelihood Estimator is Optimal in High Dimensions. Link
- Yuval Dagan, Vitaly Feldman. PAC learning with stable and private predictions. Link
- Yuval Dagan, Saar Zehavi. There is no online sparse regressor with sublinear regret. Project in the course "Advanced Topics in Machine Learning" (Technion), fall 2015. (Resolves an open problem from COLT 2014) Link
- Teaching tutorials
- Writing and grading exams and homework assignments
- Introduction to Machine Learning (winter semester 14/15)
- Automata and Formal Languages (spring semester 14/15, winter semester 15/16)
- Complexity Theory (spring semester 15/16)
Google research internship (summer 2019)
Host: Vitaly Feldman.
- Uniformly stable algorithms: we studied the sample complexity required for uniformly stable algorithms to exist. We proved tight bounds, showing that an algorithm that is ε-uniformly stable and ε-accurate can be obtained with approximately VC/ε² samples (up to poly-log factors).
- We studied the sample complexity required for differentially private predictions, proving bounds that are tight in some of the parameters.
- Other work (yet to be published).
Facebook software internship (summer 2015)
Host: Rituraj Kirti
Summer internship on the Ads team, working on improving revenue using machine learning techniques.
Mellanox Technologies: student position (2013-2014)
Host: Liran Liss
A student position in the advanced development team in the Architecture Department. Roles:
- Adding code to the driver and the user-space libraries of the network controller, written in C
- Designing and implementing new features for the product
- Designing and implementing an efficient data-integrity testing algorithm for a complex randomized communication test, written in C++
Main languages: C, C++, Python
Deep learning project, Technion (2017)
Implemented LSTMs and convolutional networks in Torch and TensorFlow.