
hackernewds · yesterday at 6:27 PM · 2 replies

> Writing code is the default behavior from pre-training

What does this even mean? Could you expand on it?


Replies

joaogui1 · today at 12:52 AM

During pre-training the model is learning next-token prediction, which is naturally additive. Even if you added DEL as a token, it would still be quite hard to transform the data so it can be used in a next-token prediction task. Hope that helps.
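
To make "naturally additive" concrete, here is a rough sketch of the standard next-token objective (illustrative only; the function name and tensor shapes are my own, not from any particular codebase). Every gradient step scores the model on predicting what comes after the prefix, so the objective only ever rewards appending plausible tokens:

    import torch.nn.functional as F

    def next_token_loss(logits, token_ids):
        # logits: (batch, seq_len, vocab_size) from an autoregressive LM
        # token_ids: (batch, seq_len) the input token ids
        # Shift by one so the prediction at position t is scored against token t+1.
        pred = logits[:, :-1, :].reshape(-1, logits.size(-1))
        target = token_ids[:, 1:].reshape(-1)
        return F.cross_entropy(pred, target)

Nothing in that loss ever asks the model to delete or rewrite earlier tokens, which is the sense in which pre-training biases it toward adding text rather than removing it.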

bongodongobob · yesterday at 7:06 PM

He means that it is heavily biased toward writing code, not removing, condensing, or refactoring it. It wants to generate more stuff, not less.
