No, I put them in the same category as lmgtfy. Most of the time, you're being told that your question is easy to research and you didn't do the work.
Also, heaven forbid, AI can be right. I realize this is a shocker to many here, but AI has its uses, especially in easy cases.
I don't think LLM responses mean a question is easy to research - they will always give an answer.
1.) They are not replies to people asking questions.
2.) Posting an AI response has as much value as posting a random Reddit comment.
3.) AI has value where you are able to factually verify its output. If someone is asking a question, they don't know the answer, so they're in no position to validate what the AI says.
"I asked AI and it said" is far worse than lmgtfy (which is already rude) because it has zero value as evidence. AI can be right, but it's wrong often enough that you can't actually use it to determine the truth of something.