At the risk of being pedantic, it's not AI per se that requires massive resources: GPT-3 was trained for a few million dollars. The jump to trillions being table stakes happened because everyone started using free services and there was just too much money in the hands of these tech companies, among other things.
There are so many chickens coming home to roost here; LLMs were just the catalyst.
Yeah, for some reason AI energy use is wildly overreported. A ChatGPT query doesn't even use a hundredth of the energy it takes to toast a slice of bread [1]. And you can eat bread untoasted too, if you care about energy use.
[1]: https://epoch.ai/gradient-updates/how-much-energy-does-chatg...
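The arithmetic is easy to sanity-check yourself. A rough sketch in Python, using the ~0.3 Wh/query figure from the Epoch piece and assuming a ~1 kW toaster running for 3 minutes (the toaster numbers are generic assumptions, not from the article):

```python
import math

# Per-query figure is Epoch AI's estimate; toaster figures are assumed.
chatgpt_wh_per_query = 0.3   # Wh per ChatGPT query (Epoch AI estimate)
toaster_watts = 1000         # typical toaster draw, assumed
toast_minutes = 3            # time to toast a slice, assumed

toast_wh = toaster_watts * toast_minutes / 60   # 50 Wh per slice
ratio = toast_wh / chatgpt_wh_per_query         # ~167x
print(f"toast: {toast_wh:.0f} Wh, query: {chatgpt_wh_per_query} Wh, "
      f"ratio ~{ratio:.0f}x (~{math.log10(ratio):.1f} orders of magnitude)")
```

With those assumptions one slice of toast comes out to roughly 170 queries' worth of energy, i.e. a bit over two orders of magnitude.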
> it's not AI that requires massive resources
No, it really is. If you took away training costs, OpenAI would be profitable; it's training, not inference, that eats the massive resources.
When I was at Meta, they were putting in something like 300k GPUs in a massive shared-memory cluster just for training. I think they're planning to triple that, if not more.
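To put a 300k-GPU cluster in perspective, here's a back-of-the-envelope power estimate (all numbers are my assumptions, not Meta's actual figures: ~700 W per H100-class GPU, plus datacenter overhead via a PUE of ~1.3):

```python
# Rough power draw for a 300k-GPU training cluster.
# All inputs are assumptions, not Meta's actual figures.
num_gpus = 300_000
watts_per_gpu = 700   # H100-class board power, assumed
pue = 1.3             # datacenter overhead (cooling etc.), assumed

it_load_mw = num_gpus * watts_per_gpu / 1e6   # 210 MW of GPU load
facility_mw = it_load_mw * pue                # ~273 MW at the wall
print(f"GPU load: {it_load_mw:.0f} MW, facility: {facility_mw:.0f} MW")
```

That's on the order of a few hundred megawatts for training alone, which is where the "massive resources" framing comes from.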