We study the problem of PAC learning $\gamma$-margin halfspaces with Massart noise. We propose a simple proper learning algorithm, the Perspectron, that has sample complexity $\tilde{O}((\epsilon \gamma)^{-2})$ and achieves classification error at most $\eta + \epsilon$, where $\eta$ is the Massart noise rate. Prior works [DGT19, CKMY20] came with worse sample complexity guarantees (in both $\epsilon$ and $\gamma$) or could only handle random classification noise [DDK+23, KIT+23] — a much milder noise assumption. We also show that our results extend to the more challenging setting of learning generalized linear models with a known link function under Massart noise, achieving a similar sample complexity to the halfspace case. This significantly improves upon the prior state-of-the-art in this setting due to [CKMY20], who introduced this model.