Hacker News

K0balt · 11/09/2024 · 0 replies · view on HN

Catastrophic forgetting or “psychosis” seems to happen when I overtrain. It’s easy to make it happen to models that have already been extensively tuned, but the base models hold up much better. I’m pretty sure there is a point in the n-dimensional space where x discrete vectors with n dimensions stop encoding usefully distinct patterns.
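
The forgetting effect the comment describes can be shown in miniature. The sketch below (not the commenter's setup; a deliberately adversarial toy example with made-up data) trains a logistic-regression "model" on task A, then continues training on a conflicting task B, and measures how accuracy on task A collapses:

```python
import numpy as np

rng = np.random.default_rng(0)

def train(w, X, y, lr=0.5, epochs=200):
    """Plain gradient descent on logistic loss."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)   # gradient step
    return w

def accuracy(w, X, y):
    return float(((X @ w > 0) == (y == 1)).mean())

# Task A: label is the sign of feature 0.
X_a = rng.normal(size=(200, 5))
y_a = (X_a[:, 0] > 0).astype(float)

# Task B: fresh inputs, but the labeling rule is reversed,
# so its gradients directly overwrite what task A learned.
X_b = rng.normal(size=(200, 5))
y_b = (X_b[:, 0] <= 0).astype(float)

w = np.zeros(5)
w = train(w, X_a, y_a)
acc_before = accuracy(w, X_a, y_a)  # high: task A is learned

w = train(w, X_b, y_b)              # continue training on task B
acc_after = accuracy(w, X_a, y_a)   # task A is "forgotten"

print(acc_before, acc_after)
```

Real catastrophic forgetting in large models is subtler (tasks overlap rather than directly conflict), but the mechanism is the same: later gradients repurpose the same weights that encoded earlier behavior.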