Hacker News

starfezzy 01/20/2025

Can it solve easy problems yet? Weirdly, I think that's an important milestone.

Prompts like, "Give me five odd numbers that don't have the letter 'e' in their spelling," or "How many 'r's are in the word strawberry?"

I suspect the breakthrough that enables solving trivial questions won't itself be trivial.


Replies

msoad 01/20/2025

> Give me five odd numbers that don't have the letter 'e' in their spelling

Compare the reasoning times!!! 84s vs 342s

R1 (Thought for 84 seconds)

      No odd number in English avoids the letter 'e' in its spelling. The request for five such numbers cannot be fulfilled.
o1 Pro (Thought for 5 minutes and 42 seconds)

      No standard English spelling of an odd number can avoid “e.” Every odd digit (one, three, five, seven, nine) already includes “e,” so once you build any odd number out of those digits, the spelled‐out form will contain “e.” As a result, there are no such odd numbers in standard English without an “e” in their name.
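Both answers hinge on the same check, which is easy to verify directly. A minimal sketch (Python, just testing the odd digit names; not something either model actually ran):

      # Every odd digit's English name contains an "e", and any spelled-out odd
      # number ends in one of those digit names, so it must contain an "e" too.
      odd_digit_names = {1: "one", 3: "three", 5: "five", 7: "seven", 9: "nine"}
      print(all("e" in name for name in odd_digit_names.values()))  # True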
diggan 01/20/2025

> Can it solve easy problems yet? Weirdly, I think that's an important milestone.

Easy for whom? Some problems are better solved one way than another.

In the case of counting letters and such, it's not an easy problem, because of how the LLM tokenizes its inputs/outputs. On the other hand, it's a really simple problem for any programming/scripting language, or for a human.
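For instance, a one-liner in any scripting language gets the strawberry question right without any "reasoning" at all (Python here, purely as an illustration):

      # Counting letters is trivial once you operate on characters instead of tokens.
      print("strawberry".count("r"))  # 3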

And then you have problems like "5142352 * 51234", which is a trivial problem for any basic calculator, but very hard for a human or an LLM.
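Same idea in the other direction; any interpreter does this instantly (again just an illustrative Python snippet):

      # Exact integer arithmetic is trivial for a machine, hard to do in your head.
      print(5142352 * 51234)  # 263463262368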

Or "problems" like "Make a list of all the cities that had celebrity from there who knows how to program in Fortan", would be a "easy" problem for a LLM, but pretty much a hard problem anything else than Wikidata, assuming both LLM/Wikidata have data about it in their datasets.

> I suspect the breakthrough won't be trivial that enables solving trivial questions.

So with what I wrote above in mind, LLMs already solve trivial problems, assuming you think in terms of the capabilities of the LLM. Of course, if you meant "trivial for humans", I expect the answer to always remain "No", because things like "standing up" are trivial for humans but will never be trivial for an LLM; it doesn't have any legs!

salviati 01/20/2025

I would argue anything requiring insight into spelling is a hard problem for an LLM: they use tokens, not letters. Your point still stands, but you need different examples IMO.

danielmarkbruce 01/20/2025

There is no breakthrough required; it's trivial. It's just that by making a model do that, you'll screw it up on several other dimensions.

Asking a question like this only highlights the questioner's complete lack of understanding of LLMs rather than an LLM's inability to do something.