LLMs absolutely body-slammed SO, but anyone who was an active contributor knows the company had been screwing over its moderators for years before this. The writing was on the wall.
If by "body-slammed" you mean "trained on SO user data while violating the terms of the CC BY-SA license", then sure.
In the best-case scenario, LLMs might give you the same content you could have found on SO. In the more common scenario, they'll hallucinate an answer and waste your time.
What should worry everyone is what system will come after LLMs. Data is being centralized and hoarded by giant corporations, and not shared publicly. And the data that is shared is generated by LLMs. We're poisoning the well of information with no fallback mechanism.
If by "body-slammed" you mean "trained on SO user data while violating the terms of the CC BY-SA license", then sure.
In the best case scenario, LLMs might give you the same content you were able to find on SO. In the common scenario, they'll hallucinate an answer and waste your time.
What should worry everyone is what system will come after LLMs. Data is being centralized and hoarded by giant corporations, and not shared publicly. And the data that is shared is generated by LLMs. We're poisoning the well of information with no fallback mechanism.