 A Simple Learning-Augmented Algorithm for Online Packing with Concave Objectives
Elena Grigorescu, Young-San Lin, Maoyuan Song
2024 [full]
 Learning-Augmented Algorithms for Online Linear and Semidefinite Programming
Elena Grigorescu, Young-San Lin, Sandeep Silwal, Maoyuan Song, Samson Zhou
NeurIPS 2022 (Spotlight presentation) [full]
 Hardness of Maximum Likelihood Learning of DPPs
Elena Grigorescu, Brendan Juba, Karl Wimmer, Ning Xie
COLT 2022 [full]
 List Learning with Attribute Noise
Mahdi Cheraghchi, Elena Grigorescu, Brendan Juba, Karl Wimmer, Ning Xie.
AISTATS 2021 [full]
 Flipping out with Many Flips: Hardness of Testing k-Monotonicity
Elena Grigorescu, Akash Kumar, Karl Wimmer.
Proceedings of RANDOM 2018.
[pdf]
 Communication-Efficient Distributed Learning of Discrete Distributions
Ilias Diakonikolas, Elena Grigorescu, Jerry Li, Abhiram Natarajan, Krzysztof Onak, Ludwig Schmidt
Proceedings of NIPS 2017. Oral presentation. [conference]
 Testing k-Monotonicity (The Rise and Fall of Boolean Functions)
Clément Canonne, Elena Grigorescu, Siyao Guo, Akash Kumar, Karl Wimmer.
Accepted to Theory of Computing, 2017. Preliminary version in the Proceedings of Innovations in Theoretical Computer Science (ITCS), 2017.
[pdf]
 AC0 ∘ MOD2 Lower Bounds for the Boolean Inner Product
Mahdi Cheraghchi, Elena Grigorescu, Brendan Juba, Karl Wimmer, Ning Xie.
Journal of Computer and System Sciences, 2018 (accepted). Preliminary version in Proceedings of ICALP 2016.
[conference]
[full]
We show super-linear lower bounds for computing the Inner Product function on AC0 circuits with parity gates just above the input level.
These are the current state-of-the-art bounds for this problem.
 Statistical Algorithms and a Lower Bound for Planted Clique
Vitaly Feldman, Elena Grigorescu, Lev Reyzin, Santosh Vempala, Ying Xiao.
J. ACM 2017. Preliminary version in Proceedings of STOC 2013.
[conference]
[full]
We introduce a framework for proving lower bounds on computational problems over distributions, based on defining a restricted class of algorithms called statistical algorithms.
For wellknown problems over distributions, we give lower bounds on the complexity of any statistical algorithm.
These include a nearly optimal lower bound for detecting planted bipartite clique distributions (or planted dense subgraph distributions) when the planted clique has small size.
 On Noise Tolerant Learning of Sparse Parities and Related Problems
Elena Grigorescu, Lev Reyzin, Santosh Vempala.
Proceedings of the International Conference on Algorithmic Learning Theory (ALT), 2011. [pdf]
We propose an improved algorithm for learning sparse parities over arbitrary distributions.
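To make the underlying problem concrete (this is not the paper's algorithm), here is a minimal sketch of the naive brute-force baseline for learning a k-sparse parity from noisy labeled samples: enumerate all size-k index sets and keep the one with the fewest label disagreements, which takes roughly n^k time. All names and parameters below are illustrative assumptions.

```python
import itertools
import random

def sample_parity(S, n, noise=0.1):
    """Draw one example (x, y): y is the parity of x on index set S,
    flipped with probability `noise`."""
    x = [random.randint(0, 1) for _ in range(n)]
    y = sum(x[i] for i in S) % 2
    if random.random() < noise:
        y ^= 1
    return x, y

def brute_force_sparse_parity(samples, n, k):
    """Return the size-k index set whose parity disagrees with the
    fewest labels (the ~n^k exhaustive baseline)."""
    best, best_err = None, float("inf")
    for S in itertools.combinations(range(n), k):
        err = sum((sum(x[i] for i in S) % 2) != y for x, y in samples)
        if err < best_err:
            best, best_err = S, err
    return best

random.seed(0)
n, k, S_true = 10, 2, (1, 4)
samples = [sample_parity(S_true, n, noise=0.05) for _ in range(300)]
print(brute_force_sparse_parity(samples, n, k))  # recovers S_true w.h.p.
```

The point of the sketch is only to show why the problem is expensive naively: with 300 mildly noisy samples the true set wins by a wide margin, but the search space grows as n choose k.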
