
Bio

I am a PhD student at MIT, advised by Prof. Constantinos Daskalakis. I am a recipient of the Facebook Research Fellowship for 2021. I received my Bachelor's and Master's degrees from the Technion, Israel, where I was advised by Prof. Yuval Filmus.

Research interests

I am interested in theoretical computer science, especially in learning theory and complexity. My research focuses on learning from samples with intricate dependencies, learning under privacy and communication constraints, and other topics in learning theory.

Contact information and C.V.

Email: dagan at mit.edu

C.V.

Research

Publications

  • Adversarial Laws of Large Numbers and Optimal Regret in Online Classification. Noga Alon, Omri Ben-Eliezer, Yuval Dagan, Shay Moran, Moni Naor, Eylon Yogev. STOC 2021. Link; Video
    Invited to special issue
  • Learning Ising Models from One or Multiple Samples. Yuval Dagan, Constantinos Daskalakis, Nishanth Dikkala, Anthimos-Vardis Kandiros. STOC 2021. Link; Video 1 (Daskalakis, 66 min); Video 2 (Kandiros, 22 min)
  • Majorizing Measures, Sequential Complexities, and Online Learning. Adam Block, Yuval Dagan, Sasha Rakhlin. COLT 2021. Link
  • Statistical Estimation from Dependent Data. Anthimos-Vardis Kandiros, Yuval Dagan, Nishanth Dikkala, Surbhi Goel, Constantinos Daskalakis. ICML 2021.
  • A Bounded-Noise Mechanism for Differential Privacy. Yuval Dagan, Gil Kur. Link
    (Resolves an open problem from differentialprivacy.org)
  • Interaction Is Necessary for Distributed Learning with Privacy or Communication Constraints. Yuval Dagan, Vitaly Feldman. STOC 2020. Link; Video
    (Resolves a COLT 2019 open problem)
  • Detecting Correlations with Little Memory and Communication. Yuval Dagan, Ohad Shamir. COLT 2018 (PMLR 75:1145-1198). Link; Video
  • The Entropy of Lies: Playing Twenty Questions with a Liar. Yuval Dagan, Yuval Filmus, Daniel Kane, Shay Moran. ITCS 2021. Link; Video
  • PAC Learning with Stable and Private Predictions. Yuval Dagan, Vitaly Feldman. COLT 2020. Link; Video
  • Space Lower Bounds for Linear Prediction in the Streaming Model. Yuval Dagan, Gil Kur, Ohad Shamir. COLT 2019. Link; Video
  • Learning from Weakly Dependent Data under Dobrushin's Condition. Yuval Dagan, Constantinos Daskalakis, Nishanth Dikkala, Siddhartha Jayanti. COLT 2019. Link; Video
  • Twenty (Simple) Questions. Yuval Dagan, Yuval Filmus, Ariel Gabizon, Shay Moran. STOC 2017 (pp. 9-21). Link; Video (Filmus)
    Invited presentation at Highlights of Algorithms 2018
  • Trading Information Complexity for Error. Yuval Dagan, Yuval Filmus, Hamed Hatami, Y. Li. CCC 2017 (pp. 16:1-16:59). Link; Video (Hatami)
  • A Better Resource Allocation Algorithm with Semi-Bandit Feedback. Yuval Dagan, Koby Crammer. ALT 2018.

Journal publications

  • Twenty (Short) Questions. Yuval Dagan, Yuval Filmus, Ariel Gabizon, Shay Moran. Combinatorica 39:597 (2019). Link
  • Trading Information Complexity for Error. Yuval Dagan, Yuval Filmus, Hamed Hatami, Y. Li. Theory of Computing 14(6):1-73 (2018). (CCC 2017 special issue) Link

Other publications (not yet published at a conference)

  • The Log-Concave Maximum Likelihood Estimator is Optimal in High Dimensions. Yuval Dagan, Gil Kur. Link
  • There Is No Online Sparse Regressor with Sublinear Regret. Yuval Dagan, Saar Zehavi. Project in the course “Advanced Topics in Machine Learning” (Technion), fall 2015. (Resolves an open problem from COLT 2014) Link

Teaching

  • Led tutorial sessions
  • Wrote and graded exams and homework assignments

Courses:

  • Introduction to Machine Learning (winter semester 14/15)
  • Automata and Formal Languages (spring semester 14/15, winter semester 15/16)
  • Complexity Theory (spring semester 15/16)

Work experience

Google research internship (summer 2019)

Host: Vitaly Feldman.

Publications:

  • Interaction Is Necessary for Distributed Learning with Privacy or Communication Constraints. Yuval Dagan, Vitaly Feldman. STOC 2020. Link
  • PAC Learning with Stable and Private Predictions. Yuval Dagan, Vitaly Feldman. COLT 2020. Link

Facebook software internship (summer 2015)

Host: Rituraj Kirti

Summer internship on the Ads team, working on improving ad revenue using machine learning techniques.

Mellanox Technologies: student position (2013-2014)

Host: Liran Liss

A student position on the advanced development team in the Architecture Department. Roles:

  • Added code to the driver and the user-space libraries of the network controller, written in C
  • Designed and implemented new features for the product
  • Designed and implemented an efficient data-integrity-testing algorithm for a complex randomized communication test, written in C++

Programming

Main languages: C, C++, Python

Deep learning project, Technion (2017)

LSTMs and convolutional networks

Programming in Torch and TensorFlow