The resource consumption of AI is unclear along two axes:
1) As other commenters have noted: raw numbers. People are generally taking the resource consumption of new datacenters and attributing 100% of it to "because AI," when the reality is that while AI is increasing spend on new infrastructure, datacenter operators are always building new infrastructure to support everything else they do.
2) Comparative cost. In general, image synthesis takes between 80 and 300 times fewer resources (mostly electricity) per image than human creation does. It turns out a modern digital artist leaving their CPU idling and their screen on while they muse soaks up far more energy than an AI needs just to synthesize an image. Granted, this is not an apples-to-apples comparison, because the average AI workflow generates dozens of draft images to find the one that actually gets used, but the net effect may still be less energy spent in total per produced image, with the split skewing toward "more spent by computers, less by people" (see the rough sketch below).
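As a back-of-envelope illustration of how that could net out (all numbers are assumptions for the sake of the sketch; the only figures taken from above are the 80x low end of the efficiency range and "dozens" of drafts, here assumed to be 30):

    # Back-of-envelope energy comparison per *used* image.
    # All figures are illustrative assumptions, not measurements.

    human_energy_per_image = 1.0   # normalize the human cost to 1 unit
    ai_efficiency_ratio = 80       # "80 to 300 times fewer resources" -> low end
    drafts_per_used_image = 30     # "dozens of draft images" -> assume 30

    ai_energy_per_draft = human_energy_per_image / ai_efficiency_ratio
    ai_energy_per_used_image = ai_energy_per_draft * drafts_per_used_image

    print(f"AI energy per used image: {ai_energy_per_used_image:.2f} (human = 1.00)")
    # -> 0.38: even at the conservative end of the range and with 30 drafts,
    #    the per-used-image energy still lands below the human baseline.

At the 300x end of the range the margin is much wider; the comparison only flips back if the drafts-per-used-image count exceeds the efficiency ratio itself.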
Comparing humans with machines on resource use gives some seriously dystopian vibes.