The basic least squares method for identifying linear systems has been extensively studied. Its convergence depends on the assumptions placed on the noise and on the behavior of the sample covariance matrix of the regressors. In 1982, Lai and Wei proved convergence under essentially minimal conditions on the regression matrix: all eigenvalues must tend to infinity, and the logarithm of the largest eigenvalue must tend to infinity more slowly than the smallest eigenvalue. In this contribution we revisit this classical result with respect to the assumptions on the noise: how large can unstructured disturbances be without affecting convergence? The answer is that the norm of these disturbances must tend to infinity more slowly than the smallest eigenvalue of the regression matrix.
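As an illustrative sketch (not part of the paper), the quantities in the Lai–Wei condition can be computed numerically for a simple first-order autoregressive system y[t] = a·y[t−1] + e[t]: the least squares estimate, the smallest eigenvalue of the (unnormalized) regressor covariance matrix, and the ratio log(λ_max)/λ_min, which the condition requires to tend to zero. All names and the choice of system here are hypothetical.

```python
import numpy as np

# Simulate a stable first-order AR system y[t] = a*y[t-1] + e[t]
# (hypothetical example system, a_true chosen for illustration).
rng = np.random.default_rng(0)
a_true = 0.8
N = 20000
y = np.zeros(N)
for t in range(1, N):
    y[t] = a_true * y[t - 1] + rng.normal()

phi = y[:-1].reshape(-1, 1)     # regressors phi[t] = y[t-1]
Y = y[1:]                       # targets

# Unnormalized sample covariance R_N = sum_t phi[t] phi[t]^T
R = phi.T @ phi
a_hat = np.linalg.solve(R, phi.T @ Y)[0]   # least squares estimate

# Lai-Wei quantities: lambda_min(R_N) must tend to infinity,
# and log(lambda_max(R_N)) / lambda_min(R_N) must tend to zero.
lam = np.linalg.eigvalsh(R)
ratio = np.log(lam.max()) / lam.min()

print(f"a_hat = {a_hat:.3f}, lambda_min = {lam.min():.1f}, ratio = {ratio:.2e}")
```

For this scalar regressor the two eigenvalues coincide, so the ratio shrinks like log(N)/N and the condition is easily met; the estimate a_hat converges to a_true accordingly.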