Hacker News

sega_sai last Sunday at 9:29 PM

Least squares and PCA minimize different loss functions. One is the sum of squares of vertical (y) distances; the other is the sum of closest distances to the line. That introduces the differences.
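To see the difference concretely, here is a small sketch (my own illustration, not from the thread) that fits the same noisy data two ways: ordinary least squares, which minimizes squared vertical distances, and PCA (equivalently, total least squares), where the fitted line is the first principal component and the minimized quantity is the squared perpendicular distance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: true line y = 2x, with noise added to y only
x = rng.normal(size=500)
y = 2 * x + rng.normal(scale=1.0, size=500)

# Least squares: minimizes the sum of squared vertical (y) distances
ols_slope = np.polyfit(x, y, 1)[0]

# PCA / total least squares: minimizes the sum of squared perpendicular
# distances; the fitted line is the direction of the first principal
# component of the centered data
X = np.column_stack([x - x.mean(), y - y.mean()])
_, _, vt = np.linalg.svd(X, full_matrices=False)
pca_slope = vt[0, 1] / vt[0, 0]

print(f"OLS slope: {ols_slope:.3f}, PCA slope: {pca_slope:.3f}")
```

With noise only in y, the OLS slope stays close to the true value while the PCA line comes out steeper, since PCA implicitly treats both coordinates as noisy.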


Replies

anArbitraryOne yesterday at 12:50 AM

"...sum of squared distances to the line" would be a better description. But it also depends entirely on how covariance is estimated

CGMthrowaway last Monday at 1:41 AM

That makes sense. Why does least squares skew the line downwards, though (vs. some other direction)? Seems arbitrary.

ryang2718 last Sunday at 10:04 PM

I find it helpful to view least squares as fitting the noise to a Gaussian distribution.
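This equivalence is standard: minimizing the sum of squared residuals is the same as maximizing the likelihood under i.i.d. Gaussian noise with fixed variance, since the negative log-likelihood differs from the squared error only by an affine transform. A minimal sketch (my own illustration; the slope grid and noise scale are arbitrary choices) checks that both objectives pick the same slope:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=300)
y = 1.5 * x + rng.normal(scale=0.5, size=300)

# Candidate slopes for a line through the origin
slopes = np.linspace(0.5, 2.5, 2001)

# Sum of squared residuals for each candidate slope
sse = np.array([np.sum((y - m * x) ** 2) for m in slopes])

# Gaussian negative log-likelihood with fixed sigma: an affine
# function of the SSE, so it must have the same minimizer
sigma = 0.5
nll = 0.5 * len(x) * np.log(2 * np.pi * sigma**2) + sse / (2 * sigma**2)

best_sse = slopes[np.argmin(sse)]
best_nll = slopes[np.argmin(nll)]
print(f"SSE argmin: {best_sse:.3f}, NLL argmin: {best_nll:.3f}")
```

Both criteria land on the same slope, close to the true value of 1.5, which is the sense in which least squares "fits the noise to a Gaussian."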
