Hacker News

RichardLake · today at 10:14 AM · 2 replies

That isn't learning. The model can read things in its context and generate material that helps answer further prompts, but that doesn't change the model weights. It is just updating the context.

Unless you are actually fine-tuning models, in which case sure, learning is taking place.
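The distinction can be made concrete with a toy sketch (a hypothetical linear "model", not a real LLM): answering from context is a read-only use of the weights, while a fine-tuning step actually writes to them.

```python
import numpy as np

# Toy "model": just a weight vector. (Illustrative only; names like
# answer() and fine_tune_step() are made up for this sketch.)
rng = np.random.default_rng(0)
weights = rng.normal(size=4)

def answer(weights, context):
    # In-context use: compute an output from weights plus context.
    # The weights are only read, never modified.
    features = np.array([float(len(tok)) for tok in context.split()[:4]])
    return float(weights @ features)

def fine_tune_step(weights, features, target, lr=0.1):
    # One SGD step on squared error: this is the case where weights change.
    pred = weights @ features
    grad = 2.0 * (pred - target) * features
    return weights - lr * grad

before = weights.copy()
_ = answer(weights, "read the codebase and answer some questions")
assert np.array_equal(weights, before)       # prompting: weights untouched

features = np.ones(4)
weights = fine_tune_step(weights, features, target=1.0)
assert not np.array_equal(weights, before)   # fine-tuning: weights changed
```

Whether the context-only case deserves the word "learning" is exactly what the replies below argue about.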


Replies

simianwords · today at 10:16 AM

I don't know why you think it matters how it works internally. Whether it changes its weights or not isn't important. Does it behave like a person who learns a thing? Yes.

If I showed a human a codebase and they then answered my questions well, I would say the human learned it. The analogy breaks down at some point because of limited context, but learning is a good enough word.
