Hacker News

quickthrowman · yesterday at 8:02 PM

> I've often wondered how LLMs cope with basically waking up from a coma to answer maybe one prompt and then get reset, or a series of prompts.

The same way a light fixture copes with being switched off.


Replies

pixl97 · yesterday at 8:05 PM

Oh, these binary one layer neural networks are so useful. Glad for your insight on the matter.
