Publications
Learning Constant-Depth Circuits in Malicious Noise Models
The seminal work of Linial, Mansour, and Nisan gave a quasipolynomial-time algorithm for learning constant-depth circuits (AC⁰) with …
Adam Klivans, Konstantinos Stavropoulos, Arsen Vasilyan
The Power of Iterative Filtering for Supervised Learning with (Heavy) Contamination
Inspired by recent work on learning with distribution shift, we give a general outlier removal algorithm called iterative polynomial …
Adam Klivans, Konstantinos Stavropoulos, Kevin Tian, Arsen Vasilyan
Learning Neural Networks with Distribution Shift: Efficiently Certifiable Guarantees
We give the first provably efficient algorithms for learning neural networks with distribution shift. We work in the Testable Learning …
Gautam Chandrasekaran, Adam Klivans, Lin Lin Lee, Konstantinos Stavropoulos
Testing Noise Assumptions of Learning Algorithms
We pose a fundamental question in computational learning theory: can we efficiently test whether a training set satisfies the …
Surbhi Goel, Adam Klivans, Konstantinos Stavropoulos, Arsen Vasilyan
Learning Noisy Halfspaces with a Margin: Massart is No Harder than Random
We study the problem of PAC learning $\gamma$-margin halfspaces with Massart noise. We propose a simple proper learning algorithm, the …
Gautam Chandrasekaran, Vasilis Kontonis, Konstantinos Stavropoulos, Kevin Tian
Tolerant Algorithms for Learning with Arbitrary Covariate Shift
We study the problem of learning under arbitrary distribution shift, where the learner is trained on a labeled set from one …
Surbhi Goel, Abhishek Shetty, Konstantinos Stavropoulos, Arsen Vasilyan
Efficient Discrepancy Testing for Learning with Distribution Shift
A fundamental notion of distance between train and test distributions from the field of domain adaptation is discrepancy distance. …
Gautam Chandrasekaran, Adam Klivans, Vasilis Kontonis, Konstantinos Stavropoulos, Arsen Vasilyan
Smoothed Analysis for Learning Concepts with Low Intrinsic Dimension
In the well-studied agnostic model of learning, the goal of a learner, given examples from an arbitrary joint distribution on …
Gautam Chandrasekaran, Adam Klivans, Vasilis Kontonis, Raghu Meka, Konstantinos Stavropoulos
Learning Intersections of Halfspaces with Distribution Shift: Improved Algorithms and SQ Lower Bounds
Recent work of Klivans, Stavropoulos, and Vasilyan initiated the study of testable learning with distribution shift (TDS learning), …
Adam Klivans, Konstantinos Stavropoulos, Arsen Vasilyan
Testable Learning with Distribution Shift
We revisit the fundamental problem of learning with distribution shift, in which a learner is given labeled samples from training …
Adam Klivans, Konstantinos Stavropoulos, Arsen Vasilyan