Everyone in the comments wants to blame JavaScript for the world’s ills. That is both stupid and grossly uninformed.
When you live next to a Google or Facebook data center, some reality starts to set in. A single facility can easily consume most of the output of a small urban power plant on its own. It's nuts. I didn't realize how nuts until someone explained it to me: in my part-time job I work with a Houston-based power-company lawyer who specializes in contracts for data centers. I doubt those massive data centers are reliant on JavaScript.
As for JavaScript, there is a simple solution that works wonders in every other industry: licensing and liability. The code is bad because the developers who write it are shit. That's never going to change until businesses have a financial incentive to train for competence. All the wishful thinking about less JavaScript is just more virtue signaling.
Well, Google does a lot more than the Web, so it's hard to tell what you are seeing in those data centers. The one next to where you live could be a GCP data center, or could belong to some other Google division that has nothing to do with the Web (Google has its fingers in cyber-security, television, maps, cellular phones, general electronics, ML, and much, much more...).
So, measuring their power consumption isn't going to indicate anything unless you know exactly what that data center supports.
The client-side efficiency problem is JavaScript; perhaps the server-side efficiency problem is Python?