Papers on learning models and variants


    Learning-Augmented Algorithms for Online Linear and Semidefinite Programming Elena Grigorescu, Young-San Lin, Sandeep Silwal, Maoyuan Song, Samson Zhou.
    NeurIPS 2022 (Spotlight presentation). [full]
    Hardness of Maximum Likelihood Learning of DPPs Elena Grigorescu, Brendan Juba, Karl Wimmer, Ning Xie.
    COLT 2022. [full]
    List Learning with Attribute Noise Mahdi Cheraghchi, Elena Grigorescu, Brendan Juba, Karl Wimmer, Ning Xie.
    AISTATS 2021. [full]
    Flipping out with Many Flips: Hardness of Testing k-Monotonicity Elena Grigorescu, Akash Kumar, Karl Wimmer.
    Proceedings of RANDOM 2018. [pdf]
    Communication-Efficient Distributed Learning of Discrete Distributions Ilias Diakonikolas, Elena Grigorescu, Jerry Li, Abhiram Natarajan, Krzysztof Onak, Ludwig Schmidt.
    Proceedings of NIPS 2017. Oral presentation. [conference]
    Testing k-Monotonicity (The Rise and Fall of Boolean Functions) Clément Canonne, Elena Grigorescu, Siyao Guo, Akash Kumar, Karl Wimmer.
    Accepted to Theory of Computing, 2017. Preliminary version in the Proceedings of Innovations in Theoretical Computer Science (ITCS), 2017. [pdf]
    AC0-MOD2 Lower Bounds for the Boolean Inner-Product Mahdi Cheraghchi, Elena Grigorescu, Brendan Juba, Karl Wimmer, Ning Xie.
    Journal of Computer and System Sciences, 2018 (accepted). Preliminary version in Proceedings of ICALP 2016. [conference] [full]
    We show super-linear lower bounds for computing the Inner-Product function on AC0 circuits with parity gates just above the input level; these remain the state-of-the-art bounds for this problem. An illustrative sketch of the circuit class follows.
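    For intuition, here is a minimal Python sketch (the names and the toy circuit are illustrative, not the paper's construction) of the Inner-Product function and of an "AC0 with parity gates at the input level" evaluation:

        from itertools import combinations

        def inner_product(x, y):
            """IP(x, y) = <x, y> mod 2 for equal-length bit tuples."""
            return sum(xi & yi for xi, yi in zip(x, y)) % 2

        def parity(bits, subset):
            """A MOD2 (XOR) gate over the chosen input positions."""
            return sum(bits[i] for i in subset) % 2

        def toy_ac0_mod2(bits, parity_subsets):
            """A depth-2 AC0 circuit (an OR of ANDs) whose inputs are
            parity gates; the lower bound says circuits of this shape
            must be super-linearly large to compute IP."""
            gates = [parity(bits, s) for s in parity_subsets]
            return int(any(g1 & g2 for g1, g2 in combinations(gates, 2)))

        # Toy evaluation on 4 bits, reading x = bits[:2] and y = bits[2:]:
        bits = (1, 0, 1, 1)
        print(inner_product(bits[:2], bits[2:]))             # -> 1
        print(toy_ac0_mod2(bits, [{0, 2}, {1, 3}, {0, 1}]))  # -> 1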
    Statistical Algorithms and a Lower Bound for Planted Clique Vitaly Feldman, Elena Grigorescu, Lev Reyzin, Santosh Vempala, Ying Xiao.
    J. ACM 2017. Preliminary version in Proceedings of STOC 2013. [conference] [full]
    We introduce a framework for proving lower bounds on computational problems over distributions, based on a restricted class of algorithms called statistical algorithms. For several well-known problems over distributions we give lower bounds on the complexity of any statistical algorithm, including a nearly optimal lower bound for detecting planted bipartite clique distributions (or planted dense subgraph distributions) when the planted clique is small. A sketch of the oracle model appears below.
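    As a hedged illustration of the model (the oracle class and the numbers below are simplified assumptions, not the paper's constructions), a statistical algorithm never inspects individual samples; it may only ask for expectations of bounded functions, answered up to an additive tolerance:

        import random

        class StatOracle:
            """STAT(tau) oracle: answers E[h(x)] over an unknown input
            distribution up to additive error tau (simulated by sampling)."""

            def __init__(self, sampler, tau, n_samples=10000):
                self.sampler, self.tau, self.n = sampler, tau, n_samples

            def query(self, h):
                est = sum(h(self.sampler()) for _ in range(self.n)) / self.n
                # An adversarial oracle may shift the answer by up to tau.
                return est + random.uniform(-self.tau, self.tau)

        # A statistical algorithm interacts only through oracle.query(h);
        # the lower bounds count how many such queries it must make.
        oracle = StatOracle(lambda: int(random.random() < 0.6), tau=0.01)
        print(oracle.query(lambda x: x))  # about 0.6, up to +/- tau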
    On Noise Tolerant Learning of Sparse Parities and Related Problems Elena Grigorescu, Lev Reyzin, Santosh Vempala.
    Proceedings of the International Conference on Algorithmic Learning Theory (ALT), 2011. [pdf]
    We give an improved algorithm for noise-tolerant learning of sparse parities over arbitrary distributions; the brute-force baseline it improves on is sketched below.
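    For context, a hedged sketch of that baseline (not the paper's algorithm): the naive noise-tolerant learner enumerates all size-k subsets of coordinates and keeps the parity with the lowest empirical error on the noisy labels, taking time roughly n^k:

        import random
        from itertools import combinations

        def learn_sparse_parity(samples, n, k):
            """samples: list of (x, label), x a length-n bit tuple.
            Returns the size-k subset whose parity best fits the data."""
            best_subset, best_err = None, float("inf")
            for subset in combinations(range(n), k):
                err = sum((sum(x[i] for i in subset) % 2) != y
                          for x, y in samples)
                if err < best_err:
                    best_subset, best_err = subset, err
            return best_subset

        # Toy usage: secret parity on coordinates {0, 2}, 10% label noise.
        data = []
        for _ in range(500):
            x = tuple(random.randint(0, 1) for _ in range(8))
            y = (x[0] + x[2]) % 2
            if random.random() < 0.1:
                y ^= 1                     # flip the label with prob. 0.1
            data.append((x, y))
        print(learn_sparse_parity(data, n=8, k=2))  # usually (0, 2)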