Hacker News

og_kalu | 05/15/2025 | 1 reply | view on HN

It's definitely a tired, semantic one because, as he said, it brings no insight and doesn't even hold up at the analogy level. I can't have a conversation with Dracula, and Dracula can't make decisions that affect the real world, so LLMs already break key aspects and assumptions of the 'Document Simulator'.

Pre-trained LLMs will ask clarifying questions just fine, so I think this is just another consequence of post-training recipes.


Replies

Terr_ | 05/15/2025

> Dracula can't make decisions that affect the real world, so LLMs already break key aspects and assumptions of the 'Document Simulator'.

Nonsense. We are already surrounded by mindless algorithms (and their outputs) that "affect the real world," because many of us have full-time jobs ensuring it happens!

When someone uses a SimCity-esque program to generate a spreadsheet used for real-world bus schedules, does that "break key aspects and assumptions" of a traffic simulator? Does the downstream effect elevate it to a microcosm of tiny lives? Nope!
