Because it's an outmoded cliché that never held much philosophical weight to begin with and doesn't advance the discussion usefully. "It's a stochastic parrot" is not a useful predictor of actual LLM capabilities and never was. Last year someone posted on HN a log of GPT-5 reverse engineering some tricky assembly code, a challenge set by another commenter as an example of "something LLMs could never do". But here we are a year later, still wading through people who cannot accept that LLMs can, in a meaningful sense, "compute".