The lasso has been studied extensively for estimating the coefficient vector in the high-dimensional linear model; however, considerably less is known about estimating the error variance. Indeed, most theoretical properties of the lasso, including recent advances in selective inference with the lasso, are established under the assumption that the underlying error variance is known. In this paper, we propose the natural lasso estimator for the error variance, which maximizes a penalized likelihood objective. A key aspect of the natural lasso is that the likelihood is expressed in terms of the natural parameterization of the multiparameter exponential family of a Gaussian with unknown mean and variance. The result is a remarkably simple estimator with provably good performance in terms of mean squared error. These theoretical results require no assumptions on the design matrix or the true regression coefficients. We also propose a companion estimator, called the organic lasso, for which theory supplies a choice of regularization parameter, so no tuning is required. Both estimators perform well relative to preexisting methods, especially in settings where support recovery is difficult.
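To make the idea above concrete, here is a minimal sketch of a natural-lasso-style error-variance estimate: take the estimate to be the attained minimum of the lasso objective, (1/n)‖y − Xb‖² + 2λ‖b‖₁, evaluated at a lasso solution b. The simulated data, the sparse coefficient pattern, and the choice λ = √(2 log p / n) are illustrative assumptions, not the paper's exact recipe; the equivalent closed form at the end follows from the lasso KKT conditions.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative simulated data (assumed setup, not from the paper).
rng = np.random.default_rng(0)
n, p, sigma = 400, 200, 1.0
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 0.5                      # sparse true coefficients (assumed)
y = X @ beta + sigma * rng.standard_normal(n)

lam = np.sqrt(2 * np.log(p) / n)    # a common theoretical rate, not tuned
# sklearn's Lasso minimizes (1/(2n))||y - Xb||^2 + alpha*||b||_1, which has
# the same minimizer as (1/n)||y - Xb||^2 + 2*alpha*||b||_1 with alpha = lam.
b = Lasso(alpha=lam, fit_intercept=False,
          tol=1e-10, max_iter=100_000).fit(X, y).coef_

# Error-variance estimate: the attained minimum of the lasso objective.
sigma2_hat = np.mean((y - X @ b) ** 2) + 2 * lam * np.sum(np.abs(b))
# At the optimum, the KKT conditions give the equivalent closed form
# (1/n)(||y||^2 - ||Xb||^2); the two should agree to solver tolerance.
sigma2_alt = (np.sum(y ** 2) - np.sum((X @ b) ** 2)) / n
print(sigma2_hat, sigma2_alt)
```

The closed-form expression shows why no extra work is needed beyond fitting the lasso itself: the objective value at the solution is the variance estimate.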