Hacker News

llIIllIIllIIl · today at 6:02 AM · 0 replies

I guess it depends on how people interact with LLMs. Cognitive debt may be acquired when people `talk` with machines, asking personal questions — like asking what to reply to a text from a friend, etc.

It may be different when people `command` LLMs to perform particular actions. In the end, this community, probably more than most, understands that an LLM is nothing more than advanced auto-complete with a natural-language interface instead of Bash.

> Write me an essay about birds in my area

which will later be presented as a human's work, compared to

> How does this codebase charge customers?

when a person needs to add trials to an existing billing system.

The latter will, after (many) prompts, result in deterministic code that a person is able to validate for correctness (whether they actually will is another question).