Hacker News

dwa3592 · today at 6:34 PM · 1 reply

But it's a tricky question for LLMs: it shows that if something isn't in the training set, LLMs can trip up, which suggests the intelligence isn't generalized yet.

I tried this with Gemini: (i am trying(something(re(a(l(ly)c)r)a)z)((y)he)re)

and it tripped.
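For what it's worth, that string is actually balanced; a minimal depth-counter sketch (the helper name `check_balance` is mine, not from the comment) can verify it mechanically:

```python
def check_balance(s):
    """Return (balanced, max_depth) for the parentheses in s."""
    depth = max_depth = 0
    for ch in s:
        if ch == '(':
            depth += 1
            max_depth = max(max_depth, depth)
        elif ch == ')':
            depth -= 1
            if depth < 0:
                # A ')' appeared with no matching '(' before it.
                return False, max_depth
    # Balanced only if every '(' was eventually closed.
    return depth == 0, max_depth

s = "(i am trying(something(re(a(l(ly)c)r)a)z)((y)he)re)"
print(check_balance(s))  # (True, 6) -- balanced, nested 6 levels deep
```

The max nesting depth of 6 is what makes the example hard to eyeball, for humans and models alike.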


Replies

orbital-decay · today at 7:08 PM

Intuitively this looks like an architectural artifact (like optical illusions in humans) or a natural property of learning, rather than a lack of generalization. I have trouble with your example too and have to count slowly to make sure.