Hacker News

eightysixfour, 04/03/2025

Language can carry tremendous amounts of context. For example:

> I want a modern navigation app for driving which lets me select intersections that I never want to be routed through.

That sentence is low in complexity but encodes a massive amount of information. You are probably thinking of a million implementation details needed to get from that sentence to an actual working app, but the opportunity is there, the possibility is there, that this is enough information to get to a working application that solves my need.
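To make concrete the kind of implementation detail hiding in that sentence, here is a minimal sketch under one plausible interpretation: routes are searched over a graph of intersections, and the router simply refuses to expand any intersection on the user's exclusion list. The function name, graph shape, ids, and costs are all illustrative assumptions, not anything specified in the comment.

    import heapq

    def shortest_route(graph, start, goal, excluded):
        # graph: {intersection: [(neighbor, travel_cost), ...]}
        # excluded: intersections the route must never pass through
        # (the destination itself is assumed not to be excluded)
        dist = {start: 0.0}
        prev = {}
        heap = [(0.0, start)]
        while heap:
            d, node = heapq.heappop(heap)
            if node == goal:
                break
            if d > dist.get(node, float("inf")):
                continue  # stale heap entry
            for neighbor, cost in graph.get(node, []):
                if neighbor in excluded:
                    continue  # never route through an excluded intersection
                nd = d + cost
                if nd < dist.get(neighbor, float("inf")):
                    dist[neighbor] = nd
                    prev[neighbor] = node
                    heapq.heappush(heap, (nd, neighbor))
        if goal != start and goal not in prev:
            return None  # no route avoids the excluded intersections
        path = [goal]
        while path[-1] != start:
            path.append(prev[path[-1]])
        return list(reversed(path))

    intersections = {
        "A": [("B", 1.0), ("C", 1.0)],
        "B": [("D", 5.0)],
        "C": [("D", 1.0)],
    }
    # Normally A -> C -> D (cost 2); with C excluded, A -> B -> D (cost 6).
    print(shortest_route(intersections, "A", "D", excluded={"C"}))

Whether a hard exclusion like this, a heavy penalty, map matching, live traffic, and so on are what the sentence meant is exactly the kind of detail left open.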

And just as importantly, if that is enough to get it built, then “can I get that in cornflower blue instead” is easy and the user can iterate from there.


Replies

fourside, 04/03/2025

You call it context or information, but I call it assumptions. There are a ton of assumptions in that sentence that an LLM will need to make in order to take it and turn it into a v1. I'm not sure what resulting app you'd get, but if you did get a useful starting point, I'd wager the fact that you chose a variation of an existing type of app helped a lot. That is useful, but I'm not sure the approach is universally useful.

anonzzzies, 04/04/2025

But it doesn't 'carry context'; it's just vague, and it's impossible to implement what you have in mind from it. And that's the problem: you assume people live in your reality, I assume mine, and LLMs have some kind of mix between us, so we will get three very different apps, none of which will be useful from that line alone. I'd like that line to be expanded with enough context to have an idea of what actually needs to be built, and I am quite sure pseudocode (or actual code) will be much shorter than the rambling English description you could come up with; unlike most such English (unless it's a logic language), the pseudocode will have enough unambiguous context to implement.
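As an illustration of that point (not anything from the thread itself), here is roughly how short the "never route through these intersections" rule gets once it is written as code instead of prose; the class name, field names, and ids are assumptions made up for the sketch.

    from dataclasses import dataclass, field

    @dataclass
    class RoutingPreferences:
        # Intersections the router must never pass through, keyed by
        # whatever stable id the underlying map data provides.
        excluded_intersections: set = field(default_factory=set)

        def allows(self, intersection_id):
            return intersection_id not in self.excluded_intersections

    prefs = RoutingPreferences(excluded_intersections={"node/42", "node/1337"})
    print(prefs.allows("node/42"))  # False: route search must skip this one
    print(prefs.allows("node/7"))   # True

Whether that is what the original sentence intended (hard exclusion vs. strong penalty, node ids vs. coordinates) is exactly the ambiguity being argued about.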

So sure, natural language is great for spitballing ideas, but after that it's just guessing what you actually want to get done.