Hacker News

dualvariable | today at 5:01 PM | 0 replies

LLMs don't "think" or "understand" in any meaningful sense. They aren't AGI; they're still just stochastic parrots.

Putting them in control of decisions without a human in the loop is still pretty crazy.