I completely agree that we don't know enough, but I'd suggest that this very uncertainty means the critics and those who want to be cautious are correct.
The harms engendered by underestimating LLM capabilities are largely that people won't use the LLMs.
The harms engendered by overestimating their capabilities can be as severe as psychological delusion, of which we have an increasing number of cases.
Given that we don't actually have a good definition of "thinking," which tack do you consider more responsible?
> can be as severe as psychological delusion
It can be much worse than that, when insufficiently skeptical humans link the LLM to real-world decisions to make their own lives easier.
Consider the Brazil-movie-esque bureaucratic violence of someone using it to recommend fines or sentencing.
> The harms engendered by underestimating LLM capabilities are largely that people won't use the LLMs.
Speculative fiction about superintelligences aside, an obvious harm of underestimating an LLM's capabilities is that we could effectively be enslaving moral agents if we fail to correctly classify them as such.